Big Data is revolutionizing how business is done in many ways.
Awash in an unprecedented amount of information, companies in just about every industry are discovering increasingly sophisticated ways to make market observations, predictions and evaluations. Big Data can help companies make decisions ranging from which candidates to hire to which consumers should receive a special promotional offer. As a powerful tool for social good, Big Data can bring new opportunities for advancement to underserved populations, increase productivity and make markets more efficient.
But if it’s not handled with care, Big Data has the potential to turn into a big problem. Increasingly, regulators like the Federal Trade Commission (FTC) are cautioning that the use of Big Data might perpetuate and even amplify societal biases by screening out certain groups from opportunities for employment, credit or other forms of advancement. To achieve the full potential of Big Data, and mitigate the risks, it is important to address the potential for “disparate impact.”
Disparate impact is a well-established legal theory under which companies can be held liable for discrimination for what might seem like neutral business practices, such as methods of screening candidates or consumers. If these practices have a disproportionate adverse impact on individuals based on race, age, gender or other protected characteristics, a company may find itself liable for unlawful discrimination even if it had no idea that its practices were discriminatory. In cases involving disparate impact, plaintiffs do not have to show that a defendant company intended to discriminate — just that its policies or actions had the discriminatory effect of excluding protected classes of people from key opportunities.
As the era of Big Data progresses, companies could expose themselves to discrimination claims if they are not on high alert for Big Data’s potential pitfalls. Now, more than ever, is the time for companies to adopt a more rigorous and thoughtful approach to data.
Consider a simple hypothetical: Based on internal research showing that employees who live closer to work stay at the company longer, a company formulates a policy to screen potential employees by their zip code. If the effect of the policy disproportionately excludes classes of people based on, say, their race — and if there is not another means to achieve the same goal with a smaller disparate impact — that policy might trigger claims of discrimination.
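To make the hypothetical concrete, one common heuristic used in employment cases is the EEOC’s “four-fifths rule”: a selection rate for any group below 80% of the highest group’s rate is generally treated as evidence of adverse impact. The rule itself is background knowledge rather than something this article cites, and the figures below are invented. A minimal sketch:

```python
# Hypothetical sketch of the EEOC "four-fifths" (80%) rule as a screen for
# possible disparate impact. All group labels and figures are invented.

def selection_rates(outcomes):
    """outcomes: dict mapping group -> (selected, total). Returns rate per group."""
    return {g: sel / tot for g, (sel, tot) in outcomes.items()}

def four_fifths_flags(outcomes, threshold=0.8):
    """Flag groups whose selection rate falls below `threshold` times the
    highest group's rate -- a possible disparate-impact signal."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: rate / top < threshold for g, rate in rates.items()}

# Invented pass rates under the zip-code screening policy:
outcomes = {
    "group_a": (45, 100),  # 45% pass the zip-code screen
    "group_b": (27, 100),  # 27% pass
}
print(four_fifths_flags(outcomes))
# group_b's rate ratio is 27/45 = 0.6, below 0.8, so it is flagged
```

A flag from a check like this is not itself proof of unlawful discrimination; as the article notes, the question is also whether a less discriminatory alternative would achieve the same business goal.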
Making matters more complex, companies have to be increasingly aware of the implications of using data they buy from third parties. A company that buys data to verify the creditworthiness of consumers, for example, might be held liable if it uses the data in a way that has a disparate impact on protected classes of people.
Expanding uses of disparate impact
For decades, disparate-impact theories have been used to challenge policies that excluded classes of people in high-stakes areas such as employment and credit. The Supreme Court embraced the theory for the first time in a 1971 employment case called Griggs v. Duke Power Co., which challenged the company’s requirement that workers pass intelligence tests and have high school diplomas. The court found that the requirement violated Title VII of the Civil Rights Act of 1964 because it effectively excluded African-Americans and there was not a genuine business need for it. In addition, courts have allowed the disparate-impact theory in cases brought under the Americans with Disabilities Act and the Age Discrimination in Employment Act.
The theory is actively litigated today and has been expanding into new areas. Last year, for example, the Supreme Court held that claims using the disparate-impact theory can be brought under the Fair Housing Act.
In recent years, the FTC has brought several actions under the disparate-impact theory to address inequities in the consumer-credit markets. In 2008, for example, the agency challenged the policies of a home-mortgage lender, Gateway Funding Diversified Mortgage Services, which gave its loan officers autonomy to charge applicants discretionary overages. The policy, according to the FTC, had a disparate impact on African-American and Hispanic applicants, who were charged higher overages than whites, in violation of the Federal Trade Commission Act and the Equal Credit Opportunity Act.
The good and bad impact of Big Data
As the amount of data about individuals continues to increase exponentially, and companies continue to find new ways to use that data, regulators suggest that more claims of disparate impact could arise. In a report issued in January, the FTC expressed concerns about how data is collected and used. Specifically, it warned companies to consider the representativeness of their data and the hidden biases in their data sets and algorithms.
Similarly, the White House has shown concern about Big Data’s use. In a report issued last year on Big Data and its impact on differential pricing — the practice of selling the same product to different customers at different prices — President Barack Obama’s Council of Economic Advisers warned: “Big Data could lead to disparate impacts by providing sellers with more variables to choose from, some of which will be correlated with membership in a protected class.”
Meanwhile, the European Union’s Article 29 Data Protection Working Party has cautioned that Big Data practices raise important social, legal and ethical questions related to the protection of individual rights.
To be sure, government officials also acknowledge the benefits that Big Data can bring. The FTC in its report noted that companies have used data to bring more credit opportunities to low-income people, to make workforces more diverse and to provide specialized health care to underserved communities.
And in its report, the Council of Economic Advisers acknowledged that Big Data “provides new tools for detecting problems, both before and perhaps after a discriminatory algorithm is used on real consumers.”
Indeed, in the FTC’s action brought against the mortgage lending company Gateway Funding Diversified Mortgage Services, the agency said the company had failed to “review, monitor, examine or analyze the loan prices, including overages, charged to African-American and Hispanic applicants compared to non-Hispanic white applicants.” In other words, Big Data could have helped the company spot the problem.
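A self-audit of the kind the FTC faulted Gateway for not performing could start with something as simple as comparing average overages across applicant groups. A hypothetical sketch, with invented figures and group labels:

```python
# Hypothetical monitoring sketch: compare average discretionary overages
# across applicant groups against a reference group. All data is invented.

from statistics import mean

def overage_gap(overages_by_group, reference_group):
    """Return each group's average overage minus the reference group's average."""
    ref = mean(overages_by_group[reference_group])
    return {g: mean(vals) - ref for g, vals in overages_by_group.items()}

# Invented overage amounts (in basis points) per applicant group:
data = {
    "reference": [10, 12, 8, 11],
    "group_x": [18, 22, 19, 21],
}
gaps = overage_gap(data, reference_group="reference")
print(gaps)  # group_x averages 9.75 bps above the reference group
```

In practice a lender would control for legitimate pricing factors before drawing conclusions, but even a raw gap like this is the kind of signal that routine review would surface.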
Policy balancing act
The policy challenge of Big Data, as many see it, is to root out discriminatory effects without discouraging companies from innovating and finding new and better ways to provide services and make smarter decisions about their business.
Regulators will have to decide which Big Data practices they consider to be harmful. There will inevitably be some gray areas. In its report, the FTC suggested advertising by lenders could be one example. It noted that a credit offer targeted at a specific community that is open to all will not likely trigger violations of the law. But it also observed that advertising campaigns can affect lending patterns, and the Department of Justice in the past has cited a creditor’s advertising choices as evidence of discrimination. As a result, the FTC advised lenders to “proceed with caution.”
This article was originally published on www.marketwatch.com.