Graham: tech companies should ‘earn’ liability shield
Graham said he wants to work with tech giants and others to create a list of “best business practices” for protecting minors online
Changes may be coming to the provision in communications law that shields web platforms, like Facebook and Google, from being sued over user content, if Senate Judiciary Chairman Lindsey Graham has his way.
Following a hearing on protections for children from internet predators before his committee Tuesday, Graham said he wants to hold big tech companies more accountable by making them “earn” liability protections. Those protections “were given to make sure the industry would flourish, mission accomplished. However, the liability protections now have to be modified so that you earn them,” the South Carolina Republican said.
Graham said he wants to work with Silicon Valley giants and other expert groups to create a list of “best business practices” for protecting minors online and to periodically report whether technology companies are maintaining them. If not, digital platform companies would no longer be able to rely on Section 230 of the 1996 Communications Decency Act to protect themselves from lawsuits over content they host.
“To me, that’s a combination of letting private sector have input about what they should be doing but making sure they meet that test that they set for themselves,” Graham said.
A new federal agency could be created to oversee the process, he added: “Maybe we create a new body because apparently these agencies are not doing very well.”
He offered no timetable for bringing a bill forward, however.
Tuesday’s hearing focused on protecting kids who use popular digital platforms like YouTube and Snapchat, and covered encryption technology, law enforcement, online pornography and the use of platforms to sexually exploit minors.
“I’m confident that they have in their know-how the ability to, within weeks, take care of many of the problems that we’ve spoken about here,” said Christopher McKenna, founder of Protect Young Eyes, an organization focused on keeping kids safe online, who was among the experts to testify. “It’s not until they’re pushed or there’s pressure or there’s reputational damage, or something, that they seem to move.”
The hearing came in the wake of a New York Times report that detailed how YouTube’s recommendation algorithm was automatically suggesting home videos of kids to people watching sexually themed videos.
YouTube responded by changing its algorithm and noted that it had already removed about 800,000 videos in the first quarter of 2019 that violated safety rules and disabled comments on videos featuring minors. It did not stop recommending videos featuring kids, however.
Another case was raised by committee member Marsha Blackburn, a Tennessee Republican, involving a 22-year-old man charged with two counts of child exploitation after coercing at least two underage boys into sending him explicit photos while posing as a teenage girl on Snapchat. On Monday, Blackburn sent a letter to Snapchat CEO Evan Spiegel about that case and others, asking how Snapchat ensures safety on its platform.
During the hearing, Graham said he wants to bring in the tech giants to testify on what they are doing to ensure their underage users’ safety. Sen. Richard Blumenthal added that the Federal Trade Commission should also be called to testify about its enforcement of a 1998 child protection law, which he said was lacking.
“The best laws in the world are dead letter if they are not enforced,” the Connecticut Democrat said.
The 1998 law, the Children’s Online Privacy Protection Act, was intended to protect kids younger than 13 by requiring websites aimed at children to obtain parental consent before gathering or using any data collected from their young users.
Sen. Josh Hawley last month introduced a bill that would prohibit video platforms from recommending videos featuring one or more minors. Platforms that fail to comply would face fines of up to $1,000 for each video that violates the rule, or $10,000 per day.
“This report was sickening, but what I think was even more sickening was YouTube’s refusal to do anything about it,” the Missouri Republican said. “Why not? Because their model is that 70 percent of their business, 70 percent of their traffic, comes from these auto-recommended videos.”
He has also co-sponsored a bill with Democratic Sen. Edward J. Markey of Massachusetts to update the Children’s Online Privacy Protection Act by prohibiting digital platforms from collecting personal and location data from anyone under 13 without parental consent, and from anyone ages 13 to 15 without the user’s consent.
Graham didn’t close the door to extending his plan to the broader set of accountability challenges surrounding digital platforms, but said kid safety would be the focus right now.
“I think that’s a good place to start,” he said. “I’d hate to be the Democrat or Republican that [is] against that one, so let’s start there.”