SEC settlement highlights continued enforcement focus on AI washing
Following through on its commitment to charge public companies for artificial intelligence (AI) washing, the US Securities and Exchange Commission (SEC) settled charges against a public company for allegedly misstating material facts about its flagship AI product in public statements.
The consent order against Presto Automation Inc., dated January 14, 2025, alleges that Presto made false and misleading statements to its investors regarding critical aspects of Presto Voice – an AI-assisted speech recognition technology used to automate drive-through order-taking at quick-service restaurants.
This order continues to place public companies on notice that statements about AI technology and capabilities may warrant close scrutiny to ensure they are accurate and not misleading. While statements by former SEC officials indicated that the SEC would continue to pursue these types of cases, the agendas of Acting Chairman Mark T. Uyeda and Paul Atkins, the anticipated Chairman of the SEC, are expected to shift the priorities of the former SEC leadership and perhaps how aggressively the SEC pursues such cases.
The consent order
The allegations in the order are two-fold: (1) Presto misrepresented to investors that Presto Voice was proprietary technology, and (2) after deploying a proprietary version of Presto Voice, the company misrepresented the technology’s capabilities. The order states that, from November 2021 to September 2022, Presto described Presto Voice to investors as “‘our’ technology and ‘Presto’s’ technology,” even though the underlying technology was owned and operated by a third party at the time of the statements. According to the SEC, Presto’s statements were “misleading” because they “created the false impression that [Presto Voice] was proprietary” technology.
Presto eventually deployed a proprietary version of Presto Voice to certain customers in September 2022, after which the company told investors that the technology “eliminated the need for human order taking.” The order alleges that such statements were false because the Presto Voice units powered by Presto’s proprietary technology “lacked the capability to take orders on their own and required substantial human involvement.” Presto knew that it lacked those capabilities, according to the allegations, as the company “hired, trained, and supervised human order takers located abroad (primarily in the Philippines and India), who processed the vast majority of drive-thru orders placed through Presto Voice.”
According to the order, Presto’s false and misleading statements violated Section 17(a)(2) of the Securities Act, Section 13(a) of the Exchange Act, and Rules 13a-11 and 13a-15(a) thereunder. The order also alleges that Presto failed to establish and maintain disclosure controls and procedures as required by Exchange Act Rule 13a-15(a). While it consented to the order, Presto neither admitted nor denied the allegations contained therein.
Based on Presto’s “current financial condition,” “remedial acts,” and “cooperation [with] the Commission staff,” the order requires only that Presto cease and desist from further violations of federal securities laws. It does not require a compliance consultant or monitor, order disgorgement, or impose any civil penalty.
Key takeaways
The SEC is hyper-focused on AI-related disclosures from any regulated person or entity. Indeed, SEC officials have stated publicly that they are closely scrutinizing disclosures by public companies about their use of AI, and that AI disclosures are going to be a significant enforcement priority going forward.
This settlement makes good on those statements. Given the SEC’s focus on AI, public companies are encouraged to:
- Ensure disclosures related to AI are accurate: Prudent public companies will take proactive steps to ensure that public disclosures about AI and other emerging technologies are accurate and not misleading. Companies are encouraged to scrutinize statements concerning the ownership and capabilities of their technology in particular, as such statements are likely to be material to investors.
- Strengthen internal controls: A key aspect of this settlement was the company’s failure to maintain effective disclosure controls and procedures. Companies are encouraged to assess whether their internal processes around disclosures are robust enough to meet the SEC’s expectations, including whether there are adequate procedures for identifying, documenting, and reporting material information related to AI and other emerging technologies.
- Engage with experts proactively: Considering the SEC’s ongoing scrutiny, companies may consider engaging legal and technical professionals to ensure that disclosures accurately describe the company’s use of AI, its capabilities, and the related risks. Doing so may help identify potential issues early, before they escalate into enforcement matters.
DLA Piper is here to help
DLA Piper’s team of lawyers, data scientists, and policy advisors assist organizations in navigating the complex workings of their AI systems to facilitate compliance with current and developing regulatory requirements. We continuously monitor updates and developments arising in AI and its impacts on industry across the world.
As part of the Financial Times’s 2023 North America Innovative Lawyer awards, DLA Piper received the Innovative Lawyers in Technology award for its AI and Data Analytics practice.
For more information on AI and the emerging legal and regulatory standards, please visit DLA Piper’s focus page on AI.
Gain insights and perspectives that will help shape your AI Strategy through our newly released AI ChatRoom series.
For further information or if you have any questions, please contact any of the authors.