ISP Impacts Seen

Calif. Privacy Agency Eyes Broad Rules on Automated Decision-Making

Future automated decision-making rules in California could have national impact on communications and internet companies, among many other industries, privacy experts said in interviews last week. The California Privacy Protection Agency board plans a Friday meeting to discuss an early proposal that the CPPA released last week. The proceeding is preliminary, with the agency saying it expects to formally begin the rulemaking next year.


Automated decision-making technology is "any system, software, or process -- including one derived from machine-learning, statistics, or other data-processing or artificial intelligence -- that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decisionmaking," the CPPA draft said.

The state privacy agency is “positioning California to take a much broader approach than other states” on consumer “rights and protections for automated decision-making technology,” Keir Lamont, Future of Privacy Forum director-U.S. legislation, said in an interview. “What California does when it comes to regulating tech policy can have a substantial effect throughout the rest of the nation.” Many companies that operate nationwide are trying to build a single privacy management program covering the requirements of the roughly one dozen comprehensive state privacy laws, he said. “One state raising the bar in some way could have the impact of companies rolling out those new rights and protections across the country.”

The CPPA could make itself the primary regulator on automated decision-making, said Husch Blackwell’s David Stauss, a business privacy attorney. He called last week’s draft a “high watermark, not a low watermark” for what the final rules will say. But unlike with legislation that might not pass, “the agency is going to do this,” he said. Based on how it handled drafts on other topics, the agency is likely to revise or remove some features of the proposed rules before they become final, said Ballard Spahr’s Greg Szewczyk, another business privacy lawyer. “But we should probably assume that this gives a pretty good idea of what they’re looking to do.”

The California proposal takes an expansive view of which systems are covered and when individuals could opt out, said Lamont. One “open question” likely to spur debate during the rulemaking is whether the proposal covers “commonplace technologies,” such as calculators, spreadsheets or GPS systems that facilitate human decision-making. Second, while California follows some states in allowing opt-outs when automated processing is used to make a decision with legal or similarly significant effects, the proposal would also allow opting out of profiling when individuals are employees or students or are in a public place, Lamont said. In addition, it may allow individuals to opt out of behavioral advertising and of having personal information used to train automated decision-making systems, he said.

The communications industry should pay attention to this issue, privacy experts said. “ISPs collect a lot of personal information” they can use for targeted ads, profiling and automated determinations, said John Davisson, senior counsel, Electronic Privacy Information Center (EPIC). He said the California proposal would require companies to provide an opt-out. Industry should watch what the agency does on targeted advertising because many internet and communications companies use first-party ads, agreed Lamont. Current rules under the California Consumer Privacy Act (CCPA) cover third-party ads only.

Internet companies should consider what the proposal means when it uses the words “denial of goods or services,” said Ridhi Shetty, policy counsel for the Center for Democracy and Technology (CDT) Privacy and Data Project. The draft requires businesses to notify consumers when an automated decision results in denial. For websites or ISPs, that could mean denying “basic features of a particular website or the basic provision of internet service, or it could mean other things depending how they’re defining it,” said Shetty. Also, she noted that many major ISPs share data about consumers’ internet activity with retail and social media websites for targeted advertising.

A possible rule requiring notification before businesses use automated decision-making may affect internet companies, said Davisson. The draft would stop businesses from describing their purpose for using the technology “in generic terms, such as ‘to improve our services,’” said Davisson: But ISPs and websites use language like that in privacy disclosures “all the time,” such as carriers saying they collect data to improve their networks.

Proposed rules on profiling in publicly accessible places could wrap in public Wi-Fi hot spots, Bluetooth technology, push notifications and facial recognition technology, said Stauss. While the proposal gives examples of physical locations, one might consider the entire internet as a publicly accessible place if defined broadly, he said. “It’s probably one of those things they will be asked to clean up.” Meanwhile, rules on what constitutes profiling in a job context could affect any company with employees in California, said Szewczyk: That could be a big compliance change for businesses.

The CPPA probably will adopt final rules on automated decision-making in 2024, the privacy experts agreed. But it could be two years before it can enforce them: Under the terms of a state court ruling now under appeal (see 2307030025), the agency may have to wait a year, until late 2025, said Lamont.

EPIC is “encouraged for the most part” by the proposal, which is likely to be influential given the size of California’s market, said Davisson. “They target profiling in a lot of different contexts and recognize the importance of giving consumers an opt out.” California’s proposal puts more pressure on Congress to pass comprehensive privacy legislation that addresses possible AI harms, said Shetty: CDT wants rules that cover consumers nationally.