
4 November 2025

Antitrust meets AI: Plaintiffs, enforcers, and legislatures take aim at alleged AI-driven collusion

California Governor Gavin Newsom recently signed Assembly Bill 325 (AB 325), amending the state’s Cartwright Act to explicitly prohibit the use or distribution of “common pricing algorithms” that facilitate anticompetitive practices.

Effective January 1, 2026, the law targets algorithms that enable competitors to align prices by sharing sensitive data and imposes heightened civil and criminal penalties. This measure arrives amid a surge in federal and state antitrust scrutiny of algorithmic price fixing, including high-profile class actions and government probes into sectors such as real estate, construction equipment rentals, health insurance, and mortgage lending.

While California is the first state to integrate such prohibitions directly into antitrust statutes, similar initiatives are emerging elsewhere – signaling a broader regulatory shift against artificial intelligence (AI)-driven collusion.

Overview of AB 325

AB 325 updates California’s antitrust framework to address modern digital tools that some allege enable illegal price fixing. The law makes it unlawful for any person to use or distribute a common pricing algorithm if (1) it is part of a contract, combination, or conspiracy in restraint of trade, or (2) the distributor coerces another to adopt prices or commercial terms generated by the algorithm. A “common pricing algorithm” is defined as software that sets or recommends prices based on competitor data, such as costs, inventories, or market strategies.

This prohibition is broad in several respects. First, the law does not distinguish between algorithms that rely on public data and those that rely on nonpublic data, whether in the algorithm’s runtime operation or in its training. The Floor Analysis for the bill explained that “the bill applies regardless of whether the underlying data is public or private, reflecting the understanding that even public data can enable collusion when processed similarly across competitors.” Thus, AB 325 simply prohibits using “competitor data” in an algorithm that affects a price or a “commercial term.”

Second, the definition of “price” includes “compensation paid to an employee or independent contractor,” which carries implications for California’s labor market and the way employers set salaries.

Third, the prohibition applies to a common pricing algorithm that would “otherwise influence” a “commercial term,” yet the statute defines neither phrase.
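To make the breadth of that definition concrete, the following is a purely illustrative sketch – not drawn from the statute, any case, or any actual product – of the kind of logic that could fall within the definition of a “common pricing algorithm”: a vendor-supplied tool that recommends a price from competitor data, regardless of whether that data is public. All names, fields, and numbers below are hypothetical.

```python
from dataclasses import dataclass
from statistics import median


@dataclass
class CompetitorObservation:
    """Hypothetical record of competitor data; any of these fields
    (prices, inventories, strategies) could qualify as "competitor
    data" under AB 325's definition."""
    price: float     # a competitor's current price
    inventory: int   # a competitor's reported inventory
    is_public: bool  # AB 325 does not appear to turn on this flag


def recommend_price(own_cost: float,
                    observations: list[CompetitorObservation]) -> float:
    """Illustrative "common pricing algorithm": it recommends a price
    based on competitor data, whether public or nonpublic."""
    benchmark = median(obs.price for obs in observations)
    # Anchoring the recommendation to a competitor benchmark, rather
    # than pricing independently from one's own costs, is the kind of
    # alignment the statute targets when the same tool is used or
    # distributed across separate firms.
    return max(own_cost * 1.10, benchmark)
```

The point is not the sophistication of the software; even simple logic like this sketch appears to fall within the statutory definition if it sets or recommends prices (or other commercial terms) from competitor data and is shared across separate firms.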

The law does not appear to target proprietary, internally developed algorithms, so long as the algorithm is used solely internally. The bill’s Floor Analysis notes:

“[AB 325] is structured to avoid interfering with ordinary or beneficial uses of pricing software. It targets only those situations where separate firms use shared algorithms, consistent with antitrust law’s focus on preserving ‘independent centers of decisionmaking.’ Businesses that develop or use their own proprietary pricing tools remain unaffected.”

That said, companies engaged in enterprise or platform pricing are likely at risk of being targeted under the second prong of the statute, which makes it unlawful to provide a pricing algorithm that “coerces” another to adopt its recommended prices or commercial terms. The statute does not define “coercion,” and courts have historically evaluated the existence of coercion in antitrust matters on a case-by-case basis. Against that backdrop, “coercion” could be interpreted to reach common incentive structures, including “give-to-get” data arrangements in which companies provide their own data to data aggregators in exchange for industry-wide data.

The legislation is also likely to increase litigation in this area. It gives the California Attorney General and private plaintiffs enhanced enforcement tools, including cumulative remedies under companion bill SB 763, which increases criminal and civil penalties for antitrust violations.

AB 325 also amends the pleading threshold for private plaintiffs in Cartwright cases. A complaint now need only allege plausible facts showing the existence of a contract, combination, or conspiracy to restrain trade. Under AB 325, plaintiffs no longer have to plead facts ruling out the possibility of independent action. This shift may significantly increase litigation exposure for businesses that rely on shared data, pricing vendors, or third-party technology to set prices. Proponents argue it protects consumers from inflated prices in industries reliant on algorithmic software, while critics warn of potential overreach affecting legitimate AI applications.

Broader United States context: Lawsuits and investigations

AB 325 reflects a national wave of antitrust scrutiny of algorithmic collusion, in which competitors allegedly use third-party software to share confidential data and coordinate pricing. This trend gained momentum with the US Department of Justice’s (DOJ) August 2024 lawsuit against RealPage, a real estate software provider accused of enabling landlords to align rents through its algorithmic pricing tool.

The DOJ alleged that RealPage’s system processes nonpublic data from multi-family housing operators to recommend supracompetitive prices, thereby harming millions of renters. Class actions against RealPage have progressed, with courts denying motions to dismiss and with settlements exceeding $100 million in some cases. State attorneys general in Arizona, Washington, and elsewhere have joined similar claims.

Similar scrutiny extends to Yardi Systems, another property management software firm. In December 2024, a US district judge denied motions to dismiss class actions alleging that Yardi’s tools facilitate rent fixing by enabling the exchange of sensitive data among landlords. The plaintiffs claim Yardi’s revenue management software inflates prices through algorithmic coordination, mirroring RealPage’s model. The court applied a “per se” illegality standard to these claims, emphasizing that algorithmic facilitation of price fixing constitutes antitrust harm.

In the construction sector, lawsuits filed in early 2025 focus on national equipment rental companies, along with software provider Rouse Analytics. These cases allege that the rental companies operated a cartel, using Rouse’s platform to share nonpublic inventory and pricing data and to generate “benchmark” rates that artificially inflate rental prices nationwide. The Judicial Panel on Multidistrict Litigation consolidated these actions in Illinois federal court in August 2025.

In the health insurance industry, plaintiffs have alleged that MultiPlan, Inc. and major health insurance companies colluded to fix reimbursement rates for out-of-network healthcare services via MultiPlan’s algorithm. The plaintiffs claim that MultiPlan collects confidential pricing data from insurers, uses its algorithm to generate reimbursement rates, and allows insurers to adopt these coordinated rates, thereby reducing competition and harming healthcare providers through suppressed reimbursement rates.

Most recently, in October 2025, a class action was filed against Optimal Blue, LLC and 26 major mortgage lenders. The suit accuses Optimal Blue’s software of enabling lenders to fix mortgage rates by sharing real-time pricing data, inflating costs for millions of homebuyers since 2019. The plaintiffs seek damages and injunctive relief, drawing parallels to the RealPage case and underscoring the expansion of algorithmic antitrust claims to financial services.

These cases illustrate the intense interest of the antitrust plaintiffs’ bar and government enforcers in challenging perceived algorithmic price fixing across many industries.

Similar legislation in other states

While California pioneered integrating algorithmic prohibitions into state antitrust law, other jurisdictions are following suit, particularly targeting rental pricing software. Ten days after California enacted AB 325, New York Governor Kathy Hochul signed S7882 into law.

S7882 prohibits residential rental property owners or managers from using pricing software to set tenant rents or occupancy rates. New York also passed an algorithmic pricing disclosure law earlier this year – which survived a First Amendment challenge in October 2025 – requiring disclosure when prices are set using consumer data. Colorado’s legislature passed similar legislation targeting algorithms used to set rents earlier this year, but the bill was ultimately vetoed.

Several other states – including New Jersey, Massachusetts, and Pennsylvania – have bills in committee aimed at regulating rental pricing algorithms. At least six state legislatures currently in session are considering bills targeting the residential rental market. Another 12 legislatures adjourned with bills pending in committee. Similar legislation failed in Maine, New Hampshire, Virginia, and New Mexico.

At the local level, Seattle and San Francisco have enacted bans on algorithmic rent-setting tools that use competitor data, paralleling New York’s statewide ban discussed above. San Diego’s city council is drafting an ordinance to prohibit such tools in housing.

Implications and recommendations

These developments signal a new focus, in litigation, enforcement, and state-level legislation, on using antitrust law to address the rise of AI.

Businesses are encouraged to audit their pricing software for compliance and to monitor evolving case law closely. As this trend develops, proactive antitrust compliance is important for mitigating risk.

Some key considerations include:

  • Identify all algorithmic pricing tools: Catalog, for legal review, all tools that affect prices, commercial terms, or employee pay in California. Note any tool used by more than one firm and any tool that uses competitor data, whether that data is public or nonpublic.

  • Update disclosures: Identify any prices set using consumer data in New York and confirm compliance with the state’s disclosure requirements.

  • Update compliance policies: In fall 2024, the DOJ Antitrust Division updated its compliance guidance to address AI and algorithmic revenue management software. Review and update internal compliance policies accordingly.

  • Update contracts: Review current contracts that govern the aggregation or reuse of your nonpublic data.

  • Demonstrate independent decisionmaking: Maintain records that establish that pricing and other decisions are the product of independent human decisionmaking.

  • Obtain legal guidance: Consult experienced antitrust counsel to evaluate exposure under the new law and the Sherman Act.

DLA Piper’s Antitrust and Competition group has deep experience representing clients in litigation and investigations related to alleged algorithmic price fixing. DLA Piper also helps clients navigate antitrust compliance in the AI ecosystem.

For more information, please contact the authors.
