Opinion: Big Data can lead to big legal problems for companies
June 2, 2016

Big Data is revolutionizing how business is done in many ways.

Deluged with an unprecedented amount of information available for analysis, companies in just about every industry are discovering increasingly sophisticated ways to make market observations, predictions and evaluations. Big Data can help companies make decisions ranging from which candidates to hire to which consumers should receive a special promotional offer. As a powerful tool for social good, Big Data can bring new opportunities for advancement to underserved populations, increase productivity and make markets more efficient.

But if it’s not handled with care, Big Data has the potential to turn into a big problem. Increasingly, regulators like the Federal Trade Commission (FTC) are cautioning that the use of Big Data might perpetuate and even amplify societal biases by screening out certain groups from opportunities for employment, credit or other forms of advancement. To achieve the full potential of Big Data, and mitigate the risks, it is important to address the potential for “disparate impact.”

Disparate impact is a well-established legal theory under which companies can be held liable for discrimination for what might seem like neutral business practices, such as methods of screening candidates or consumers. If these practices have a disproportionate adverse impact on individuals based on race, age, gender or other protected characteristics, a company may find itself liable for unlawful discrimination even if it had no idea that its practices were discriminatory. In cases involving disparate impact, plaintiffs do not have to show that a defendant company intended to discriminate — just that its policies or actions had the discriminatory effect of excluding protected classes of people from key opportunities.

As the era of Big Data progresses, companies could expose themselves to discrimination claims if they are not alert to Big Data’s potential pitfalls. Now more than ever, companies need a rigorous and thoughtful approach to how they collect and use data.

Consider a simple hypothetical: Based on internal research showing that employees who live closer to work stay at the company longer, a company adopts a policy of screening job applicants by ZIP code. If the policy disproportionately excludes people based on, say, their race, and there is no alternative that achieves the same goal with less disparate impact, the policy could trigger claims of discrimination.
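
For illustration only, here is a minimal sketch of how a company might audit such a screen before relying on it. The group labels and applicant counts are hypothetical, and the 80% (“four-fifths”) threshold is the rough indicator used in the EEOC’s Uniform Guidelines, not a rule drawn from this article.

```python
# Illustrative sketch (hypothetical data): checking whether a "neutral"
# screen, such as filtering applicants by ZIP code, selects different
# groups at very different rates.

def selection_rate(selected, applicants):
    """Fraction of a group's applicants who pass the screen."""
    return selected / applicants

# Hypothetical outcomes after applying the ZIP-code screen.
outcomes = {
    "group_a": {"applicants": 400, "selected": 240},  # 60% pass rate
    "group_b": {"applicants": 300, "selected": 120},  # 40% pass rate
}

rates = {g: selection_rate(o["selected"], o["applicants"]) for g, o in outcomes.items()}
highest = max(rates.values())

for group, rate in rates.items():
    # The EEOC's "four-fifths rule" treats a selection rate below 80% of the
    # highest group's rate as a rough signal of possible adverse impact.
    ratio = rate / highest
    flag = "possible adverse impact" if ratio < 0.8 else "within guideline"
    print(f"{group}: rate={rate:.2f}, ratio={ratio:.2f} -> {flag}")
```

A check like this is only a first screen; a disparity would still need to be weighed against business necessity and less-discriminatory alternatives, as the legal standard requires.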

Making matters more complex, companies have to be increasingly aware of the implications of using data they buy from third parties. A company that buys data to verify the creditworthiness of consumers, for example, might be held liable if it uses the data in a way that has a disparate impact on protected classes of people.

Expanding uses of disparate impact

For decades, disparate-impact theories have been used to challenge policies that excluded classes of people in high-stakes areas such as employment and credit. The Supreme Court embraced the theory for the first time in a 1971 employment case called Griggs v. Duke Power Co., which challenged the company’s requirement that workers pass intelligence tests and have high school diplomas. The court found that the requirement violated Title VII of the Civil Rights Act of 1964 because it effectively excluded African-Americans and there was not a genuine business need for it. In addition, courts have allowed the disparate-impact theory in cases brought under the Americans with Disabilities Act and the Age Discrimination in Employment Act.

The theory is actively litigated today and has been expanding into new areas. Last year, for example, the Supreme Court held that claims using the disparate-impact theory can be brought under the Fair Housing Act.

In recent years, the FTC has brought several actions under the disparate-impact theory to address inequities in the consumer-credit markets. In 2008, for example, the agency challenged the policies of a home-mortgage lender, Gateway Funding Diversified Mortgage Services, which gave its loan officers autonomy to charge applicants discretionary overages. The policy, according to the FTC, had a disparate impact on African-American and Hispanic applicants, who were charged higher overages than whites, in violation of the Federal Trade Commission Act and the Equal Credit Opportunity Act.

The good and bad impact of Big Data

As the amount of data about individuals continues to increase exponentially, and companies continue to find new ways to use that data, regulators suggest that more claims of disparate impact could arise. In a report issued in January, the FTC expressed concerns about how data is collected and used. Specifically, it warned companies to consider the representativeness of their data and the hidden biases in their data sets and algorithms.

Similarly, the White House has also shown concern about Big Data’s use. In a report issued last year on Big Data and its impact on differential pricing — the practice of selling the same product to different customers at different prices — President Barack Obama’s Council of Economic Advisers warned: “Big Data could lead to disparate impacts by providing sellers with more variables to choose from, some of which will be correlated with membership in a protected class.”

Meanwhile, the European Union’s Article 29 Data Protection Working Party has cautioned that Big Data practices raise important social, legal and ethical questions related to the protection of individual rights.

To be sure, government officials also acknowledge the benefits that Big Data can bring. In its report, the FTC noted that companies have used data to bring more credit opportunities to low-income people, to make workforces more diverse and to provide specialized health care to underserved communities.

And in its report, the Council of Economic Advisers acknowledged that Big Data “provides new tools for detecting problems, both before and perhaps after a discriminatory algorithm is used on real consumers.”

Indeed, in its action against Gateway Funding Diversified Mortgage Services, the FTC said the company had failed to “review, monitor, examine or analyze the loan prices, including overages, charged to African-American and Hispanic applicants compared to non-Hispanic white applicants.” In other words, Big Data could have helped the company spot the problem.
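
As a rough sketch of the kind of routine monitoring the agency said was missing, a lender could periodically compare pricing discretion across applicant groups. The figures below are hypothetical and this is not the FTC’s methodology, just an illustration of the idea.

```python
# Illustrative sketch (hypothetical figures): comparing discretionary loan
# overages across applicant groups, the kind of review the FTC said Gateway
# never performed.

from statistics import mean

# Hypothetical overages (in basis points) recorded per closed loan.
overages_by_group = {
    "non_hispanic_white": [0, 10, 25, 0, 15, 5],
    "african_american":   [40, 55, 30, 60, 45],
    "hispanic":           [35, 50, 20, 45, 40, 55],
}

baseline = mean(overages_by_group["non_hispanic_white"])

for group, overages in overages_by_group.items():
    avg = mean(overages)
    gap = avg - baseline
    print(f"{group}: average overage {avg:.1f} bps (gap vs. baseline {gap:+.1f} bps)")

# A persistent gap would be a signal to investigate how pricing discretion
# is being exercised -- exactly the review the FTC said was absent.
```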

Policy balancing act

The policy challenge of Big Data, as many see it, is to root out discriminatory effects without discouraging companies from innovating and finding new and better ways to provide services and make smarter decisions about their business.

Regulators will have to decide which Big Data practices they consider to be harmful. There will inevitably be some gray areas. In its report, the FTC suggested advertising by lenders could be one example. It noted that a credit offer targeted at a specific community that is open to all will not likely trigger violations of the law. But it also observed that advertising campaigns can affect lending patterns, and the Department of Justice in the past has cited a creditor’s advertising choices as evidence of discrimination. As a result, the FTC advised lenders to “proceed with caution.”

This article was originally published on www.marketwatch.com, where it can be viewed in full.
