The Senate Judiciary Committee is expected next week to mark up two pieces of legislation aimed at protecting children online. The committee on Thursday held over two bills that appeared on the agenda for the first time this year: the Eliminating Abusive and Rampant Neglect of Interactive Technologies (Earn It) Act (S-1207) and the Strengthening Transparency and Obligation to Protect Children Suffering from Abuse and Mistreatment Act of 2023 (Stop CSAM Act). Introduced by Sens. Richard Blumenthal, D-Conn., and Lindsey Graham, R-S.C., the Earn It Act removes online platforms’ “blanket immunity” under Section 230 of the Communications Decency Act for violations of laws on online child sexual abuse material (CSAM). The committee passed the bill by voice vote in 2022 (see 2202100071). The bill “imposes basic accountability on tech companies that are complicit in the sexual abuse and exploitation of children,” Blumenthal said in a statement. Reps. Ann Wagner, R-Mo., and Sylvia Garcia, D-Texas, introduced companion legislation in the House. Senate Judiciary Committee Chairman Dick Durbin, D-Ill., filed his Stop CSAM Act, which would expand the “federal civil cause of action for child victims to also permit victims of online child sexual exploitation to bring a civil cause of action against tech platforms and app stores” that promote or facilitate exploitation. Durbin told reporters Wednesday he’s “still working” on gaining Republican support for his bill. He said during Thursday’s markup that he’s looking forward to discussions about kids’ safety legislation in the coming weeks. He has consulted Graham about the process and thinks the top Judiciary Republican agrees this is “historic and significant, maybe one of the most important things we do during the course of this year.” The modernization of sentencing on this issue is “long overdue,” said Durbin. The National Center on Sexual Exploitation Wednesday urged Congress to pass the Earn It Act: “Congress never intended for Section 230 to permit and encourage online platforms to do nothing -- or worse -- facilitate the spread of CSAM.” Fight for the Future called the Earn It Act a “censorship and surveillance bill that risks the rights and safety of all who depend on encrypted services while making children less safe.” The bill “could lead to suspicionless scans of every online message, photo, and hosted file,” the Electronic Frontier Foundation said. “In the name of fighting crime, the Earn It Act treats all internet users like we should be in a permanent criminal lineup, under suspicion for child abuse.”
PHILADELPHIA -- Pennsylvania Gov. Josh Shapiro (D) challenged state enforcers Tuesday to collaboratively address privacy and social media issues, speaking at a National Association of Attorneys General meeting. North Carolina AG Josh Stein (D) later asked a panel on algorithms for suggestions on what states can do amid the rise of AI chatbots like ChatGPT.
If the U.S. Supreme Court opens online platforms to liability for algorithms through a narrow interpretation of Section 230, it could mean a less consumer-friendly internet and entrench dominant platforms further, tech experts said Tuesday.
Republican states are responsible for an unprecedented wave of free speech violations, not the tech industry or Democrats, House Commerce Committee ranking member Frank Pallone, D-N.J., said during a House Communications Subcommittee hearing Tuesday.
Senate Judiciary Committee members probed Wednesday for ways to update Communications Decency Act Section 230 and hold tech platforms more accountable for the impacts of their algorithms (see 2303030041). Senate Technology Subcommittee ranking member Josh Hawley, R-Mo., questioned whether anything in the statutory language of Section 230 supports the “super immunity” that protects platforms from liability when they use algorithms to amplify content and profit from it. University of Washington law professor Eric Schnapper, who recently argued two cases on behalf of social media victims before the Supreme Court, told Hawley the text separates concepts like merely hosting content from boosting it. But it would help if Congress clarified language in the statute, said Schnapper. Hawley asked Schnapper for a specific legislative recommendation for how to fix platforms’ affirmative content recommendations. Schnapper told him the issue is “too complicated” to offer legislative language on the spot, but he’s happy to work with Hawley’s office on a proposal. The Supreme Court recognizes online content is often promoted, sometimes in a “very addictive way to kids,” said Senate Technology Subcommittee Chairman Richard Blumenthal, D-Conn. Quoting Chief Justice John Roberts from the recent oral argument in Gonzalez v. Google, he said online videos don’t “appear out of thin air. They appear pursuant to the algorithms.” Though Justice Elena Kagan admitted she and her colleagues aren’t internet experts, they understand algorithms play a role, said Blumenthal. There’s rare Judiciary Committee consensus on the need to better protect children online, said Judiciary Chairman Dick Durbin, D-Ill. Congress should do something to “make Section 230 make sense,” he said: Something needs to change so platforms have incentives to protect children. The case law on Section 230 doesn’t provide the necessary remedies “quickly enough or thoroughly enough,” said Blumenthal: The internet is no longer a “neutral conduit.” The common ground on Section 230 “boils down” to Congress giving victims their day in court, which Section 230 has prevented for “too many years,” said Hawley. He said he hopes the Supreme Court will “remedy” some of the issues with Section 230 in the Gonzalez case (see 2302210062).
Platforms shouldn’t be liable for real-world harm just because their algorithms amplify and rank content, said consumer advocates, academics and industry representatives Monday at the State of the Net Conference.
Section 230 should be less available as a defense when platforms actively promote content that results in real-world harm, Senate Technology Subcommittee Chairman Richard Blumenthal, D-Conn., told reporters Thursday.
Sen. Richard Blumenthal, D-Conn., said Wednesday he plans a series of hearings on Communications Decency Act Section 230 with hopes of writing bipartisan legislation potentially dealing with platform liability for amplifying content.
Democrats reintroduced legislation Tuesday to carve out Communications Decency Act Section 230 in hopes of holding social media platforms liable for “enabling cyber-stalking, online harassment, and discrimination.” Reintroduced by Sens. Mark Warner and Tim Kaine, both D-Va.; Mazie Hirono, D-Hawaii; Amy Klobuchar, D-Minn.; and Richard Blumenthal, D-Conn., along with Reps. Kathy Castor, D-Fla., and Mike Levin, D-Calif., the Safeguarding Against Fraud, Exploitation, Threats, Extremism and Consumer Harms (Safe Tech) Act (see 2102050047) would clarify that Section 230 doesn’t apply to ads or paid content, doesn’t bar injunctive relief, doesn’t “interfere” with laws on stalking and cyberstalking, allows lawsuits to be filed when a platform might be liable for wrongful death, and doesn’t bar lawsuits under the Alien Tort Claims Act.
FCC Chairwoman Jessica Rosenworcel agrees content moderation and Section 230 of the Communications Decency Act could be improved, she said during a Q&A at the Knight Foundation Media Forum Thursday: "I think a lot of people would say there must be a way to do better. I'm among them." Section 230 is important and helped the internet grow, but “we might over time want to condition its protections on more transparency, complaint processes, things that make you a good actor,” Rosenworcel said, conceding that creating an alternative to 230 would be difficult. Asked about FCC authority over 230, Rosenworcel condemned the previous administration’s efforts on that as “not particularly well-developed” but also seemed to indicate the agency could be involved in future 230 revisions. After Gonzalez v. Google, “we’re going to have to have some discussions about what changes we might see in Congress or what changes we might see at the FCC, but I don’t think that earlier petition that was filed was it,” she said, referencing a case argued Tuesday at the Supreme Court (see 2302210062). Rosenworcel said the agency has done a lot of “incredible things” with four commissioners, but she hopes it gets a fifth soon. One policy she would tackle with a majority is the FCC’s definition of broadband speeds, she said. “If I have five people we’re gonna up that standard,” she said. “It’s really easy to decry polarization and politicization in any environment in Washington,” she said. “But I think the more interesting thing is to put your head down and see what you can do. History is not interested in your complaints.” Asked about FCC efforts to improve connections for the incarcerated, Rosenworcel touted her recent circulation of an item on prison phone rates. She's “optimistic” about having unanimous support for the item at the agency, she said.