Part of the European Union’s sweeping new privacy regulations could change how people apply for loans, credit cards, and even jobs online.
The new law, called the General Data Protection Regulation, or GDPR, goes into effect Friday and gives EU residents new rights and greater control over their information online, even the ability to have it completely erased. But GDPR goes beyond just protecting personal data and could radically affect how people interact with the algorithms that control so much of the internet.
Algorithms increasingly power everything people do and see online, from which Instagram stories appear in their feeds to which ads they see on Facebook. The way companies make those decisions is nearly invisible to users — but the decisions are calculated from highly specific data about how people use an app or service.
Starting Friday, certain provisions under GDPR give people a new “right to explanation,” as legal experts call it, to avoid a Kafkaesque future where people can’t question the decisions made by artificial intelligence that affect everyday life.
GDPR articles 13, 14, 15, and 22, for example, refer to instances of automated decision-making that produce “legal” or “similarly significant” effects. Those cases could include a credit application or online job recruiting software, where people may have the right to “meaningful information about the logic involved,” according to the law, in a decision made entirely by an algorithm.
“The meat of this whole point is the fear that a lot of people have that automated decision-making will be unaccountable,” said Andrew Selbst, a civil rights attorney and researcher with the Yale Information Society Project. “In general, in law, what is unexplainable is therefore unaccountable.”
What gets an explanation?
In recent years, people have accused companies’ algorithms of being sexist and even racist. A 2016 ProPublica investigation, for example, found that crime-prediction software used across the U.S. assigned African-American defendants higher “risk scores” than white defendants. And researchers at Carnegie Mellon who studied Google job ads concluded that women were shown fewer ads for higher-paying positions than men were.
But without insight into how those algorithmic decisions were made, people had no legal recourse — until now.
By giving internet users the right to see how these decisions were made and to contest them, GDPR could prove a powerful tool for transparency. The law specifically mentions automated decisions on job and credit applications and, in theory, could apply to any algorithm that has a “significant” effect on a user’s life.
But GDPR doesn’t define what “significant” means — beyond limited examples such as credit applications and e-recruiting — leaving users responsible for proving the effect.
“You would need to make an argument that this ad is affecting you in a significant way,” said Sandra Wachter, a data ethicist at Oxford University, about Facebook content. “So, for example, if job ads are being shown to you, this could actually be affecting your life. The fact that one pair of shoes rather than another pair of shoes is shown to you, I'm not quite sure if that's something that significantly affects users.”
Even if a user successfully proves under the law that an algorithmic decision significantly affected them, it remains unclear what information companies would have to release. But ultimately, people could use that information to challenge decisions and force companies to apply “human intervention,” which might yield a more favorable outcome on a credit card application, or whatever the situation might be.
The penalties for violating the law are also steep: Rule-breakers face fines of up to 20 million euros or 4 percent of a company’s annual global revenue — whichever is higher.
While no one’s sure how “significant” will ultimately be defined, legal experts agree: Because of the law’s open-endedness, the right to an explanation won’t be clarified until cases reach European courts.
“I think one of the big controversies and what is going to have to be fleshed out through interpretations and court decisions is what ‘legal and similarly significant effects’ mean,” Selbst said.
The types of algorithms potentially affected by GDPR have enabled tech companies to make billions of dollars. The more a company can fine-tune an algorithm to keep people using its service longer, or to serve ads they’re more likely to click on, the more money it can make.
That’s why tech companies have long refused to disclose the inner workings of their algorithms, which they consider trade secrets. But under GDPR, they might no longer have a choice.
“It will very much depend on how the public conscience and the public desire grows to get explanations for decisions that are being made by algorithms,” Wachter said.
William Turton contributed to this report.