Corrected Oct. 18 | While artificial intelligence appears to be a shiny new bauble full of promises and perils, lawmakers in both parties acknowledge that they must first resolve a less trendy but more fundamental problem: data privacy and protection.
After holding dozens of hearings on data privacy in the past five years, lawmakers in both chambers have proposed several bills, but Congress has enacted no federal standard, as dickering over state preemption has stymied any advance.
“I agree that data privacy is going to be the foundation of that trust that everyone believes” is essential to widespread adoption of AI systems, Sen. John Hickenlooper, D-Colo., said in an interview. “We have got to know who owns what data and that the use of data is not harmful.”
Hickenlooper, chairman of the Senate Commerce Subcommittee on Consumer Protection, Product Safety, and Data Security, said full committee Chair Maria Cantwell, D-Wash., “feels the same commitment to making sure that at some point, perhaps not this year, but with a sense of urgency, we get a data privacy bill because that is going to underpin so much of what’s going to happen in terms of creating trust in AI.”
A spokeswoman for the Senate Commerce Committee didn’t respond to questions about Cantwell’s plans.
Rep. Cathy McMorris Rodgers, R-Wash., who chairs the House Energy and Commerce Committee, echoed those sentiments when she addressed the fifth annual Data Privacy Conference USA in Washington, D.C., last week.
“As CEOs travel to Washington, D.C., to discuss issues like artificial intelligence, I worry lawmakers might lose focus on what should be the foundation for any AI efforts, which is establishing comprehensive protections on the collection, transfer and the storage of our data,” Rodgers told the gathering.
Sean Kelly, a spokesman for Rodgers, said she’s working with lawmakers to stress the importance of data privacy.
“The AI Summit focused on controls, and even licensing, but didn’t adequately cover how Americans’ personal data is being fed into AI applications without their knowledge,” Kelly said in an email, referring to the recent gathering of AI tech executives organized by Senate Majority Leader Charles E. Schumer, D-N.Y. “Comprehensive privacy legislation would establish the foundation necessary to give Americans control over how their information is collected, used, and shared, whether that’s for AI or for something else.”
Rodgers advocates legislation that would impose data minimization provisions that would “prohibit Big Tech from collecting your data without limit so they can feed it to their AI algorithms,” Kelly said.
Generative AI technologies are trained on extremely large quantities of data scoured from the internet, including text, images, video and audio generated by users as well as public and private institutions. The AI models then learn the patterns in such data, enabling them to generate new content in response to queries that match the patterns observed in the training data.
Last year, the House Energy and Commerce Committee approved bipartisan legislation backed by Rodgers and New Jersey Rep. Frank Pallone Jr., the committee’s top Democrat, that would create a federal data privacy standard. But the measure did not get a floor vote because then-Speaker Nancy Pelosi, D-Calif., opposed the bill on the grounds that it would provide fewer protections than California’s data privacy law.
The Senate didn’t take up a companion measure.
In the absence of a national standard, 11 states, led by California, have passed data privacy legislation and another five states are considering similar measures, creating an uneven patchwork of laws.
Although the top tech companies thwarted an attempt to craft a federal privacy bill during the Obama administration in 2015, since then tech groups such as the Computer and Communications Industry Association, NetChoice and others have routinely called on Congress to enact privacy legislation.
Assessing the EU response
The change in the industry’s stance has been fueled partly by the European Union’s data privacy legislation known as the General Data Protection Regulation, or GDPR, which came into force in 2018 and applies to all tech companies that operate in Europe. Companies violating the provisions on collecting and processing people’s information without their consent can face fines of as much as 4 percent of their global revenue.
GDPR has plenty of critics on both sides of the Atlantic, including Rodgers, who say an identical measure in the United States would hurt small businesses that may be unable to navigate legal requirements. European critics say the law has been unevenly enforced.
Although the GDPR is applicable to all entities that collect and process personal data in the European Union, the law is enforced by national authorities of the 27-member bloc.
“Implementation I think has been a bit irregular,” said Diego Naranjo, the head of policy at European Digital Rights, a Brussels-based nonprofit group that advocates data privacy rights.
Ireland’s Data Protection Commission is one of the chief enforcers of GDPR because top tech companies including Alphabet Inc., the parent of Google; Apple; Meta Platforms Inc., the parent of Facebook and Instagram; and Microsoft Corp. have their European headquarters in Dublin.
The Irish agency’s enforcement has been “underwhelming,” Naranjo said in an interview in Brussels, adding that the process takes too long, leaving consumer complaints unattended for long stretches and legal challenges languishing in Irish courts.
Helen Dixon, Ireland’s commissioner for data protection, said the GDPR’s scope is so large that enforcement continues to be a “work in progress.”
“It’s a general-purpose law that regulates absolutely all forms of personal data processing, which is as ubiquitous as breathing,” Dixon said in an interview in Brussels this summer. “So the GDPR is trying to do an awful lot of heavy lifting in a lot of areas,” but at its core it’s about “protecting fundamental rights” of European citizens.
In the past five years, however, Dixon’s office has imposed eye-popping fines on U.S. tech giants, including a total of about $2.1 billion on Meta and its subsidiaries. The company has challenged some of those, and the cases are making their way through courts. Amazon.com Inc., Google, TikTok and other tech giants have been fined by other European agencies for GDPR violations.
The GDPR applies not only to large platforms but also to all manner of data collection. That’s raising fears among Europeans that they could be committing a “GDPR crime if they made a mistake” by taking a photograph of a sporting event, for example, without the consent of the players, Dixon said, citing studies that have identified such concerns.
The GDPR doesn’t outright prohibit data collection but requires that its use be transparent, said Joaquin Perez Catalan, director of the international affairs division at the Spanish Data Protection Authority in Madrid.
Whether companies collect and use data directly from consumers or through third-party entities, “they have to inform you and let you know ‘I’m going to use this data for this purpose’ or ‘I’m going to disclose your data for this purpose’ and obtain the consent of users,” Perez Catalan said in an interview in Madrid. “Because your data is yours.”
Contrary to some critics who say tough regulations hurt innovation, Perez Catalan said sometimes rules help steer companies toward tech improvements that enhance people’s privacy.
He cited the example of Shazam, an Apple-owned app that uses the iPhone’s microphone to identify a song playing in a bar or restaurant. Although the app may inadvertently pick up the conversations of nearby people without their consent, it is designed to isolate the music from people’s voices and send only the musical elements from the phone to an outside server to identify a song, thereby preserving individuals’ privacy, he said.
Despite all the criticism of GDPR’s effectiveness, “to EU’s credit, while the law was far from perfect, there was more right than wrong about that,” said Rep. Jay Obernolte, R-Calif. “That was something that the EU was very early on that they ended up being correct,” and states such as California borrowed ideas from GDPR, Obernolte said in an interview.
Note: This is the fourth in a series of stories examining the European Union’s regulations on technology and how they contrast with approaches being pursued in the United States. Reporting for this series was made possible in part through a trans-Atlantic media fellowship from the Heinrich Boell Foundation, Washington, D.C.
This report was corrected to remove Qualcomm, Inc. from the list of companies that have been fined by European agencies for GDPR violations.