A federal court in California should “use its broad power” to end Google’s illegal Play Store monopoly, which has resulted in anticompetitive distribution of video games on Android devices, the FTC said in a filing Monday (docket 3:20-cv-05671-JD). Epic Games sued Google in 2020, challenging the platform’s 30% app store commission on downloads of Epic’s Fortnite. A federal jury in December 2023 found Google liable for maintaining an illegal monopoly, siding with Epic on all counts. Google is appealing. Epic in March asked U.S. District Judge James Donato in San Francisco to force Google to remove barriers to app store competition on Android devices. The U.S. District Court for the Northern District of California should “use its broad power to order a remedy that stops the illegal conduct, prevents its recurrence, and restores competition,” the FTC said in its amicus brief. Injunctive relief should “restore lost competition in a forward-looking way and should ensure a monopolist is not continuing to reap the advantages and benefits obtained through the antitrust violation,” the agency said. The FTC asked the court to consider structural relief and potential remedies that “address unlawfully acquired scale or unlawfully erected entry barriers, be it in the context of a single product or across lines of business.” The commission voted 3-0 to file the brief, with Commissioners Melissa Holyoak and Andrew Ferguson recused. Holyoak was recused because of her work on behalf of Utah in the state’s antitrust case against Google; Ferguson, because of his role as solicitor general of Virginia, which participated in Epic v. Google.
X agreed that it will stop processing Europeans’ personal data to train its AI tool Grok while EU privacy watchdogs assess whether the practice complies with the General Data Protection Regulation, the Irish Data Protection Commission announced Thursday. The company said it will suspend processing personal data collected from public posts dated May 7 to Aug. 1. X’s capitulation came after the DPC filed an “urgent” application at Ireland’s High Court. The application, which sought to protect the rights of X’s EU/European Economic Area users, “came after extensive engagement between the DPC and X,” the regulator said.
Companies like Meta intentionally target children and must be held more accountable for social media-related harm, attorneys general from New Mexico and Virginia said Wednesday. New Mexico AG Raul Torrez (D) and Virginia AG Jason Miyares (R) discussed potential solutions to online child exploitation during the Coalition to End Sexual Exploitation Global Summit, which the National Center on Sexual Exploitation and the Phase Alliance hosted. Torrez said the tech industry received an “extraordinary grant” through Communications Decency Act Section 230, which Congress passed in 1996 to promote internet innovation. Section 230 has been a hurdle to holding companies accountable, even when they knowingly host illegal activity that harms children, Torrez added. Miyares said AGs won’t wait for legislators in Washington to solve the problem, noting state enforcers’ success in the courts. Tech companies shouldn’t be able to use Section 230 as a shield from liability while also acting as publishers and removing political content they disfavor, Miyares added. Torrez acknowledged that he and Miyares disagree on many things but said they agree on the need to increase tech platforms’ liability and accountability where children are concerned.
TikTok’s commitment to permanently shutter its Lite rewards program is now binding under the Digital Services Act (DSA), the European Commission said Monday. TikTok was designated as a very large online platform under the law, and in February the EC opened a noncompliance case against the Chinese company that owns the platform. At issue was whether TikTok breached EU law on protection of minors, advertising transparency, data access for researchers, and risk management of addictive design and harmful content. The EC then opened a second case to determine whether the company violated the DSA when it launched TikTok Lite in France and Spain in March (see 2404220024). The problem with the rewards scheme, the EC said, was that it let users earn points for performing “tasks” such as watching videos, liking content and inviting friends to join the platform. The EC was concerned that the company failed to adequately assess the risks the scheme posed, such as the potential for addictive behavior, and it was particularly worried about the program’s impact on children. That second case has ended with a “commitment decision” but without a finding of breach and with no fine, EC officials said at a briefing. TikTok agreed to withdraw the scheme permanently from the EU and not launch another program that would circumvent the withdrawal. Now that the commitments are binding, any breach would violate the DSA and could spur fines, the EC said. The earlier investigation remains open.
Forcing ByteDance to divest TikTok or face a U.S. ban is a legitimate response to a national security threat and doesn’t violate free speech rights, former FCC Chairman Ajit Pai and Thomas Feddo, former chairman of the Committee on Foreign Investment in the U.S., argued in an amicus brief filed Friday (see 2406280020). President Joe Biden signed the TikTok divestment measure as part of Congress’ foreign aid package in April (see 2404240060). TikTok and ByteDance are challenging the law’s constitutionality. The U.S. Court of Appeals for the District of Columbia Circuit scheduled oral argument for Sept. 16 (docket 24-1113). Divestiture policies are “nothing new or extraordinary,” Pai and Feddo wrote. Congress has exercised such power frequently in recent years, especially against Chinese telecom companies like Huawei and ZTE, whose threat to U.S. citizens is “endemic,” they said: It’s “ludicrous to suggest, as TikTok does, that the U.S. Government cannot prefer divestiture as a policy option, or that it must wait for Americans to be compromised before it can act.” The Biden and Trump administrations see TikTok as a national security threat in view of its mass data collection and its vulnerability to Chinese surveillance, they said. The new law doesn’t discriminate against individual speakers or content on TikTok and doesn’t regulate speech, they said: It “targets ByteDance’s conduct and is based on the government’s longstanding concerns about that conduct. The Act fits comfortably alongside the existing regulatory structures ... that similarly aim to tackle evolving national security risk.”
Vermont’s lawsuit alleging Meta designed Instagram with the intention of addicting young users can proceed, a superior court judge ruled last week (docket 23-CV-4453). Superior Court Judge Helen Toor denied Meta’s motion to dismiss, saying the company’s First Amendment and Communications Decency Act Section 230 arguments didn’t persuade her. Vermont alleges Meta violated the Vermont Consumer Protection Act by intentionally seeking to addict young users through methods it knows are harmful to mental and physical health. The company misrepresented its intentions and the harm it’s “knowingly causing,” the state argued. Vermont is seeking injunctive relief and civil damages. Meta’s motion to dismiss argued that the state lacks jurisdiction, that the First Amendment and Section 230 bar the claims, and that state enforcers failed to state a valid claim under state law. The court heard oral argument July 3. The state noted that more than 40,000 Vermont teens use Instagram and about 30,000 do so daily, and it alleged the company uses targeted advertising and other features to maximize the amount of time teens spend on the app. Toor said the First Amendment protects companies’ speech but doesn’t protect against allegations that a company is manipulating younger users. She noted Section 230 protects a company against liability for hosting third-party content but doesn’t shield a company that engages in illegal conduct. Vermont isn’t seeking to hold Meta liable for content it hosts, she said: “Instead, it seeks to hold the company liable for intentionally leading Young Users to spend too much time on-line. Whether they are watching porn or puppies, the claim is that they are harmed by the time spent, not by what they are seeing.” Attorney General Charity Clark filed the lawsuit in October.
TikTok “flagrantly” violated children’s privacy law when it let kids open accounts without parental consent and collected their data, DOJ and the FTC alleged Friday in a lawsuit against the Chinese-owned social media app. TikTok violated the Children’s Online Privacy Protection Act (COPPA) when it knowingly allowed children younger than 13 to maintain accounts, DOJ said in a complaint filed on behalf of the FTC. The company purposefully avoided obtaining parental consent and delivered targeted advertising to underage users, the agencies alleged. The department cited internal communications in which a TikTok employee acknowledged the conduct could get the company “in trouble” because of COPPA. TikTok let children bypass age restrictions and create accounts without age verification, DOJ said. Moreover, TikTok classified millions of accounts with an “age unknown” status, the filing said. “TikTok knowingly and repeatedly violated kids’ privacy, threatening the safety of millions of children across the country,” FTC Chair Lina Khan said in a statement. “The FTC will continue to use the full scope of its authorities to protect children online.” Principal Deputy Assistant Attorney General Brian Boynton said the complaint will “prevent the defendants, who are repeat offenders and operate on a massive scale, from collecting and using young children’s private information without any parental consent or control.” In a statement Friday, TikTok said it disagrees with the allegations, “many of which relate to past events and practices that are factually inaccurate or have been addressed.” TikTok offers “age-appropriate experiences with stringent safeguards,” proactively removes “suspected underage users” and has “voluntarily launched features such as default screentime limits, Family Pairing, and additional privacy protections for minors,” the company said. The FTC is seeking a permanent injunction and civil penalties of up to $51,744 per violation. The commission voted 3-0 to refer the complaint to DOJ, with Commissioners Melissa Holyoak and Andrew Ferguson recused.
The 5th U.S. Circuit Court of Appeals should lift a preliminary injunction against Mississippi’s social media age-verification law, Mississippi Attorney General Lynn Fitch (R) argued in a filing Thursday (docket 24-60341) (see 2407290008). HB-1126 requires that social media platforms obtain parental consent before allowing minors to access their services. NetChoice sued to block HB-1126 on free speech grounds and won a preliminary injunction from the U.S. District Court for the Southern District of Mississippi on July 1 (see 2407160038). District Judge Halil Suleyman Ozerden on July 15 denied Fitch’s request to lift the injunction, finding NetChoice is likely to succeed on the merits of its First Amendment challenge. Fitch argued before the appeals court Thursday that the injunction rests on “facial claims that NetChoice failed to support.” Nothing in the law “facially” violates the First Amendment because it regulates online conduct, not online speech, said Fitch: The law’s “coverage turns on where harmful conduct toward minors online is most likely: the interactive social-media platforms that allow predators to interact with and harm children.”
Arkansas’ age-verification law violates the First Amendment and should be permanently enjoined, NetChoice argued Friday before the U.S. District Court for the Western District of Arkansas in Fayetteville (docket 5:23-cv-05105). The court in August granted a preliminary injunction blocking the Social Media Safety Act (Act 689), concluding it “likely violates” the First Amendment and Due Process Clause. NetChoice said in its filing Friday that the state continues to rely on failed arguments that Act 689 is a narrowly tailored regulation of online conduct, not online speech. Courts have held that the First Amendment can’t be evaded by regulating a “non-speech” component of a protected activity, NetChoice argued: For example, a law that bans books by restricting the sale of ink is no less unconstitutional than a direct ban on book sales. NetChoice asked the court to grant its motion for summary judgment.
The 5th U.S. Circuit Court of Appeals shouldn’t stay a lower court’s decision that temporarily enjoins a Mississippi law requiring kids younger than 18 to get parental consent before accessing social media, NetChoice told the appeals court Friday. Mississippi Attorney General Lynn Fitch (R) earlier this month appealed to the 5th Circuit the preliminary injunction that the U.S. District Court for the Southern District of Mississippi issued (see 2407030076 and 2407010062). The district court also denied Fitch’s request to stay that preliminary injunction (see 2407160038). Mississippi is incorrect that the law regulates conduct, not speech, NetChoice said. “The Act’s restrictions on protected speech are unconstitutional unless they survive strict scrutiny,” the tech industry group wrote. “They cannot, as the Act’s tailoring flaws preclude them from surviving any level of heightened First Amendment scrutiny. The Act restricts too much speech on too many websites where there are private alternatives to governmental regulation.”