Friday, January 15, 2016

Jason Atchley : Legal Tech News : FTC Cautions Businesses on Big Data Use


FTC Cautions Businesses on Big Data Use

Companies must proceed with caution as they use consumer surveillance tools made possible in today’s ‘big data’ era.
Legaltech News
U.S. Federal Trade Commission Building.
A new report from the Federal Trade Commission (FTC) reminds businesses to avoid “exclusionary” or “discriminatory” uses of big data analysis. Posing many sample questions, the new report, “Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues,” looks at how big data is used after being collected and analyzed. The study was released this month and has already drawn responses.
“The FTC has delivered a sweeping review on how today’s data-driven marketplace poses serious risks to consumers,” Jeffrey Chester, executive director of the Center for Digital Democracy, told Legaltech News. He added, “The commission’s message is clear—companies must proceed with caution as they use consumer surveillance tools made possible in today’s ‘big data’ era. Every consumer should be alarmed about the host of little publicly-known practices that can harm our credit, employment and privacy.”
With the new report, the FTC “is bringing a much needed 21st century update on how it will enforce important consumer protection laws,” according to Chester.
Still, he said the FTC should be encouraging Congress to update the Fair Credit Reporting Act (FCRA) and other laws to prevent “high-tech discriminatory profiling…. Even with Congress unlikely to do anything that opposes the powerful business data lobby, the FTC should have acknowledged that it doesn’t have the regulatory clout to really protect consumers.” 
At its core, the study examines risks that “could result from biases or inaccuracies about certain groups, including more individuals mistakenly denied opportunities based on the actions of others, exposing sensitive information, creating or reinforcing existing disparities, assisting in the targeting of vulnerable consumers for fraud, creating higher prices for goods and services in lower-income communities and weakening the effectiveness of consumer choice,” according to an FTC statement.
The study also includes several questions and guidelines companies should consider. Those on legal compliance include:
  • If you compile big data for others who will use it for eligibility decisions (such as on credit, employment, insurance, housing, government benefits), are you complying with the accuracy and privacy provisions of the FCRA?
  • If you receive big data products from another entity that you will use for eligibility decisions, are you complying with the provisions applicable to users of consumer reports?
  • If you are a creditor using big data analytics in a credit transaction, are you complying with the requirement to provide statements of specific reasons for adverse action under the Equal Credit Opportunity Act?
  • If you use big data analytics in a way that might adversely affect people in their ability to obtain credit, housing, or employment, are you treating people differently based on a prohibited basis, such as race or national origin?
  • Do your policies, practices or decisions have an adverse effect or impact on a member of a protected class, and if they do, are they justified by a legitimate business need that cannot reasonably be achieved by means that are less disparate in their impact?
  • Are you maintaining reasonable security over consumer data?
  • Are you undertaking reasonable measures to know the purposes for which your customers are using your data?
  • If you know that your customer will use your big data products to commit fraud, do not sell your products to that customer. If you have reason to believe that your data will be used to commit fraud, ask more specific questions about how your data will be used.
  • If you know that your customer will use your big data products for discriminatory purposes, do not sell your products to that customer.
In the report, the FTC also predicted that big data “will continue to grow in importance, and it is undoubtedly improving the lives of underserved communities in areas such as education, health, local and state services, and employment.”
“The Commission will continue to monitor areas where big data practices could violate existing laws, including the FTC Act, the FCRA, and ECOA, and will bring enforcement actions where appropriate,” the FTC added. “In addition, the Commission will continue to examine and raise awareness about big data practices that could have a detrimental impact on low-income and underserved populations and promote the use of big data that has a positive impact on such populations. Given that big data analytics can have big consequences, it is imperative that we work together—government, academics, consumer advocates, and industry—to help ensure that we maximize big data’s capacity for good while identifying and minimizing the risks it presents.”
Furthermore, the FTC asked businesses whether their data models account for biases. “Companies should therefore think carefully about how the data sets and the algorithms they use have been generated. Indeed, if they identify potential biases in the creation of these data sets or the algorithms, companies should develop strategies to overcome them,” the FTC said.
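One way companies sometimes screen their models for the kind of disparate impact the report warns about is the EEOC’s “four-fifths” rule: compare selection rates across groups and flag any group selected at less than 80 percent of the highest-rate group. The FTC report does not prescribe this check; the sketch below is a minimal illustration with made-up numbers.

```python
# Hypothetical sketch of a four-fifths (80%) rule check on model outcomes.
# Group names and counts below are invented for illustration only.

def selection_rates(outcomes):
    """outcomes: dict mapping group name -> (number selected, total applicants)."""
    return {group: selected / total for group, (selected, total) in outcomes.items()}

def four_fifths_check(outcomes):
    """Return True per group if its selection rate is at least 80% of the top rate."""
    rates = selection_rates(outcomes)
    top_rate = max(rates.values())
    return {group: rate / top_rate >= 0.8 for group, rate in rates.items()}

outcomes = {"group_a": (60, 100), "group_b": (30, 100)}
print(four_fifths_check(outcomes))  # group_b's rate (0.30) is half of group_a's (0.60)
```

A failing check like this does not by itself establish illegal discrimination, but it is the sort of signal that would prompt the follow-up the FTC suggests: examining how the underlying data and algorithm were generated and whether a less disparate alternative exists.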
In one instance, Google changed its interview and hiring process to ask more “behavioral” questions and focus less on academic grades after discovering that replicating its existing definitions of a “good employee” was resulting in a “homogeneous” tech workforce, the FTC noted.

