Privacy Spotlight: FTC Big Data Event
Big Data and its potential for inclusion and exclusion took center stage this past September as the FTC held a day-long workshop featuring panels of experts from industry, technology, privacy, civil liberties, and academia. World Privacy Forum’s Executive Director Pam Dixon, a panelist at the event, spoke about Big Data and privacy, emphasizing several key points, including:
- The need for statistical parity. Statistical parity is fairness in how people are treated in analytics, including predictive analytics. Dixon and Gellman discussed statistical parity at length in their 2014 report, The Scoring of America. On the panel, Dixon referenced the Scoring report and made several points about statistical parity, noting that the factors used in the analytics must be fair, and that how well the analytics perform in delivering accurate, well-fit results must also be fair. The use of analytics also needs to be subject to fairness structures.
- The need to keep the Fair Information Practice principles alive and well, in addition to laws such as the Equal Credit Opportunity Act, the Civil Rights Act, the Fair Credit Reporting Act, and others.
- Understanding that the very act of categorizing individuals triggers a data paradox: the classification of someone in a group means that classification can be used for benefit or harm, and in some cases, simultaneously.
- Big data is evolving rapidly, and no actual legislative definition exists yet. Recognizing that big data is an immature policy area undergoing rapid change is important as policymakers and experts seek solutions to the challenges it presents.
Media and writers attending the event captured the flavor of the dialogue. Some highlights of the conversation are below.
Note: The full webcast of the FTC Big Data event is available on the FTC event page.
On keeping existing regulation:
The U.S. shouldn’t abandon long-held ideas that individuals should have control over their personal data, said Pamela Dixon, executive director of the World Privacy Forum.
“I don’t think the [longtime] structures need to be reinvented or shoved aside because data sets are larger,” she said. “It’s important to keep the regulations that we have … to ensure that fair information practices are still applicable and relevant.”
Dixon and Danah Boyd, a principal researcher at Microsoft Research, both noted that the big data industry is in many ways in its infancy. Many companies using big data are still figuring out how to use the information responsibly, Boyd said.
—TechWorld, 15 September 2014, by Grant Gross.
On defining big data:
It could be a group of five or a group of 50, but if you asked a group of experts for a definition of big data, you’d be hard pressed to get a clear-cut answer. What’s clear is that whatever big data is, lines need to be drawn that shape how it impacts the public, industry and government.
How to measure this impact was the basis for the Federal Trade Commission’s big data workshop, which brought business leaders, academics and consumer advocates together Monday to discuss whether big data is helping or harming consumers.
Pamela Dixon, the founder and executive director of the World Privacy Forum, says she could find examples of big data both offering help and causing harm, but it’s difficult to [build] policies off either side of the argument due to a lack of understanding as to what big data actually is.
“Big data is immature,” Dixon says. “There is no firm, scalpel-like, definition of big data. Show me an actual legislative definition of it. I know you can’t, because there isn’t one. So what do we do with that? We can’t just throw out the existing fairness structures. We need to use the existing fairness structures that we have.”
FTC commissioner Julie Brill spoke about how those current fairness structures — particularly the Fair Credit Reporting Act — should serve as benchmarks for new regulations aimed at companies that are creating alternative credit scores out of the data they are collecting.
“The use of new sources of information, including information that goes beyond traditional credit files, to score consumers raises fresh questions about whether these alternate scores may have disparate impacts along racial, ethnic or other lines that the law protects,” Brill said. “Those questions are likely to linger and grow more urgent…until the companies that develop these alternate scores go further to demonstrate that their models do not contain racial, ethnic, or other prohibited biases.”
— FedScoop, 15 September 2014, by Greg Otto.
On repurposing existing frameworks:
The panelists generally agreed new legislation seeking to implement consumer protections from big data was not the correct answer. Rather, many panelists advanced the position that localized solutions would be sufficient to protect consumers from harms associated with big data. For example, Nicol Turner-Lee posited a general framework like the Fair Information Practice Principles (“FIPPs”) would adequately prevent predatory behavior and possible civil rights infractions. The question would then become, if one were to use the Fair Credit Reporting Act (“FCRA”) or FIPPs as the template, when would it be applied and how nuanced would the application be to the particular data set?
With respect to the repurposing of existing frameworks, Pamela Dixon, Founder and Executive Director of the World Privacy Forum, advanced the application of the Common Rule. This rule was built on the Belmont Report, which is based on the Nuremberg Code to prevent human research atrocities. The bedrock principle of the Common Rule is human consent, which has appealed to humanity across the decades. Ms. Dixon remarked that where violations of the Common Rule have occurred, society has viewed the resultant harm as categorically unfair.
—Lexology, 26 September 2014, Insights from the FTC’s “big data” workshop, Steven A. Augustino, John J. Heitmann, Alysa Zeltzer Hutnik, Dana B. Rosenfeld, Christopher M. Loeffler and Sharon Kim Schiavetti.
On next steps and more on definition of big data:
Despite vigorous debate and research about big data, “we may not know what the best course should be,” said FTC Chairman Edith Ramirez during opening remarks. “There is no clear path for navigating the use of big data in a way that advances opportunities for all consumers while diminishing the risks of adverse differential impact on vulnerable populations.” Panelists and presenters echoed Ramirez’s sentiment. “Big data is immature,” said Pam Dixon, executive director of the World Privacy Forum. “There is no firm scalpel-like legislative definition of big data.”
—Washington Internet Daily, 16 September 2014.
On bias and issues relating to categorizing consumers:
If you are using big data to categorize customers or consumers, pay special effort to eliminate biases and unfair business practices. Use existing fairness laws to guide your work. But be prepared for new laws to arrive soon, too.
“Even if most businesses have yet to discover a way to harness all of the data they collect, Pamela Dixon, the founder and executive director of the World Privacy Forum, argued that the collection alone is enough to create bias and policy is needed to protect the public from the negative impact that bias could have in their daily life,” continued the FedScoop post.
“The moment a person is put into a category or is classified, that triggers a data paradox,” Dixon said. “The bottom line is when you classify an individual you trigger this and when that is triggered, we have to do something about it.”
—FierceBigData, 17 September 2014, by Pam Baker.
The event was webcast. For the webcast and more information, visit the FTC event page.