Privacy Violation Claims Will Test AI and Web-Based Technologies

As we approach the end of another dynamic year in privacy, companies should consider proactively reviewing their privacy programs to account for these upcoming trends in privacy litigation, regulation, and enforcement.

With the availability of statutory damages and a dearth of binding case law, we expect claims alleging statutory privacy violations based on the use of internet technologies to increase in 2024.

Website operators saw a wave of privacy litigation in 2023 for violations of state and federal wiretapping statutes, such as the Electronic Communications Privacy Act and California Invasion of Privacy Act.

The lawsuits stemmed from the use of commonplace internet technologies—cookies, web beacons, pixels, scripts, or other software code—provided by third parties that gather information about users and their devices as they interact with websites or mobile apps.
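
To make concrete how these tools transmit data, the sketch below shows how a generic third-party pixel or web beacon might report a page visit. It is illustrative only, written in TypeScript for a browser context; the vendor endpoint and parameter names are hypothetical and do not reflect any particular provider's API.

```typescript
// Hypothetical sketch of a third-party "pixel" or web beacon. The vendor
// endpoint and parameter names are illustrative only; real tags vary widely.
function fireTrackingPixel(eventName: string): void {
  const params = new URLSearchParams({
    event: eventName,
    url: window.location.href,   // page the visitor is viewing
    referrer: document.referrer, // page the visitor arrived from
    ua: navigator.userAgent,     // browser and device details
    ts: Date.now().toString(),   // timestamp of the event
  });

  // The classic web beacon is a 1x1 image: the GET request itself carries the data.
  const beacon = new Image(1, 1);
  beacon.src = `https://collect.example-vendor.com/pixel?${params.toString()}`;
}

fireTrackingPixel("page_view");
```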

Initially, these actions were based on the use of session replay technology. Over time, claims have expanded to cover almost every type of internet tracking technology, such as Google Analytics and the Meta Pixel.

These cases asserted one of three legal theories—that website chatbots illegally intercept the contents of consumer communications, that tracking technology illegally eavesdrops on website visitors and mobile app users, or that tracking technology reveals the identity of anonymous consumers and thus constitutes doxing.

The latest iteration of these claims asserts that internet technologies function as illegal pen registers. Companies should consider auditing the types and functionality of technology running on their websites and mobile applications to determine whether proactive risk mitigation measures are available or advisable.
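
As a starting point for such an audit, the minimal sketch below (assumed to be run from a browser's developer console on a page the company operates) lists third-party script and iframe sources and the cookies visible to page scripts. A thorough audit would also cover network requests, tag-manager configurations, and mobile SDKs.

```typescript
// Minimal, illustrative audit: list third-party script/iframe sources and
// cookies readable by page scripts. HttpOnly cookies and tags injected later
// by tag managers will not appear here.
function auditThirdPartyTech(): void {
  const firstPartyHost = window.location.hostname;

  // Script and iframe sources loaded from domains other than the site's own.
  const externalHosts = Array.from(
    document.querySelectorAll<HTMLScriptElement | HTMLIFrameElement>(
      "script[src], iframe[src]"
    )
  )
    .map((el) => new URL(el.src, window.location.href).hostname)
    .filter((host) => host !== firstPartyHost);

  // Cookie names visible to JavaScript on this page.
  const cookieNames = document.cookie
    .split(";")
    .map((c) => c.trim().split("=")[0])
    .filter(Boolean);

  console.log("Third-party hosts loading code:", [...new Set(externalHosts)]);
  console.log("Cookies visible to scripts:", cookieNames);
}

auditThirdPartyTech();
```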

State Privacy Laws

Entering 2024, the US remains one of the few major global economic powers that lacks a comprehensive national framework governing the processing of personal information.

State legislatures have stepped in to fill the regulatory void through enactment of comprehensive privacy legislation at the state level.

California leads the way with the California Consumer Privacy Act of 2018 (CCPA) and the California Privacy Rights Act, a 2020 ballot initiative that amended it. Virginia and Colorado passed their own legislation in 2021, Utah and Connecticut in 2022, and Delaware, Florida, Indiana, Iowa, Montana, Oregon, Tennessee, and Texas in 2023.

Despite rising concern from industry groups that a patchwork of divergent state privacy laws raises compliance costs for businesses, the political divide in Congress makes comprehensive federal privacy legislation unlikely in the near term, particularly legislation that would preempt existing state laws.

State policymakers will probably advance legislation that paints inside the lines of established state privacy laws.

As more states follow suit, companies may want to take an adaptive yet proactive approach to their privacy programs, including assessing which laws apply to them, updating consumer notices and the mechanisms for offering choices and rights, and updating vendor contracts.

Prompt Enforcement

Just as California was set to begin enforcing updated CCPA regulations, a state court issued an injunction delaying enforcement until March 29, 2024.

Since then, the state attorney general and the California Privacy Protection Agency have moved forward with enforcing statutory amendments to the CCPA that took effect Jan. 1, 2023, and weren't subject to the injunction.

Their decision to press forward with enforcement sends a clear signal that they also won't delay enforcement of CCPA regulations addressing data processing agreements, requirements for honoring opt-out preference signals and providing opt-out mechanisms, dark patterns, and handling of consumer data requests.
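
On the opt-out signal piece, the hedged sketch below shows one way a site might detect the Global Privacy Control (GPC) opt-out preference signal in the browser. The navigator.globalPrivacyControl property is a proposed, non-standard API, so it is read defensively here; what a business must then do to honor the signal is dictated by the regulations themselves, not by this snippet.

```typescript
// Illustrative only: read the Global Privacy Control (GPC) opt-out preference
// signal. The property is proposed/non-standard, so it is accessed
// defensively; servers can also check the Sec-GPC request header.
function visitorHasOptedOutViaGPC(): boolean {
  const nav = navigator as Navigator & { globalPrivacyControl?: boolean };
  return nav.globalPrivacyControl === true;
}

if (visitorHasOptedOutViaGPC()) {
  // Example response: suppress third-party advertising/analytics tags and
  // record the opt-out of "sale" or "sharing" for this visitor.
  console.log("GPC detected: treating visitor as opted out.");
}
```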

Given the numerous compliance obligations imposed by the revised CCPA regulations, the expiration of the mandatory 30-day cure period for violations, and the likelihood of prompt enforcement, companies should continue preparing for these requirements ahead of the March 2024 enforcement date.

Health Privacy Enforcement

The FTC is now a leading actor in the health privacy enforcement space. It took several actions in 2023 that put companies on notice that it’s focused on the flow of health information to third parties through tracking technologies integrated into websites and apps.

Between March and July, the FTC brought enforcement actions against BetterHelp, GoodRx, and Premom relating to their use of these technologies. It also issued guidance and, with the Department of Health and Human Services' Office for Civil Rights, a joint letter on the use of online tracking technologies by covered entities and business associates.

The FTC also issued a policy statement and pursued additional enforcement aimed at companies' unauthorized use and disclosure of consumer health data. These actions underscore the FTC's position as a leading and active regulatory force for health data.

Companies that handle health data—as broadly defined by the Federal Trade Commission—should ensure their health data privacy and security programs are robust and address the risks posed by online data sharing.

Regulating AI

Increased AI regulation is now a matter of when, not if, with the recent passage of the EU AI Act and the continued push from members of Congress and the Biden administration for similar legislative protections in the US. Regulation at both the state and federal levels is likely in 2024.

Following the unprecedented success of OpenAI's launch of its generative AI platform ChatGPT in November 2022, AI took center stage in 2023, with industry giants such as Amazon, Google, Meta, and Microsoft each announcing a slew of their own AI services and products.

In the wake of this success, privacy concerns have arisen among consumer advocates, regulators, and legislators over the use of personal information to train AI products.

Companies should consider developing, implementing, and enforcing clear governance policies and processes to address these concerns proactively in anticipation of robust legislation.

This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.

Author Information

Wynter L. Deagle is a partner on Sheppard Mullin's privacy and cybersecurity team.

Anne-Marie Dao and Dane C. Brody Chanove are associates on the team.
