16 December 2025

Can legal advice prepared with the help of Generative AI attract privilege?

Yes, in certain circumstances.


What is the legal position?

Absent a test case or legislative reform, the existing legal framework could protect privilege in communications created by, or with the assistance of, “Generative AI Systems” (ie a form of AI System which specifically generates new content, including text, images, sounds and computer code, in response to user prompts1, creating new data that has similar characteristics to the data it was trained on and resulting in outputs that are often indistinguishable from human-created media2). In our view:

  • There is no standalone “AI Privilege”: Legal advice generated by a Generative AI System and provided directly to a non-lawyer is not capable of being privileged, as the Generative AI System is not a lawyer.
  • Lawyers, including in-house counsel, should be able to use Generative AI Systems to formulate and communicate legal advice in a way that maintains privilege. We consider that AI Systems should be treated as a “subordinate” of the lawyer, much like a trainee solicitor, pupil barrister, or paralegal working under the “direction and supervision” of the lawyer.
  • If the AI System produced advice, and a qualified lawyer (in-house or otherwise), acting in accordance with their duties, carefully reviewed that advice, it could benefit from the protection of privilege: the lawyer effectively authorises, agrees with, and adopts its content, making it akin to the lawyer’s own work product.
  • For legal advice generated with the assistance of an AI System to be privileged and to retain privilege, and for lawyers (in-house and in private practice) to comply with their regulatory duties, it is imperative that such tools are used responsibly and with the relevant guardrails in place.

How the courts will treat AI in the context of privilege remains to be seen. While we await a test case or a change in the law, in-house counsel and their advisers should look to analogies from existing case law, as well as current judicial and industry guidance, to benefit from the significant potential of AI without risking or waiving their client’s fundamental human right to privilege.


What are some points of best practice?

The following points of best practice may help ensure that legal advice prepared with the help of Generative AI can attract privilege:

  • Consider whether the legal task is appropriate for an AI System.
  • Public AI Systems (ie free versions of popular AI Systems, where data may be used to train the underlying model for wider use, and where users are typically not able to negotiate the terms of service or licence) should not be used to generate or assist with generating legal advice. Such activities should be reserved for Bespoke AI Systems (ie AI Systems which offer bespoke contractual protection, which may include terms that limit or prohibit the AI System’s ability to use inputs to improve or otherwise train its model).
  • Where Generative AI Systems are used to assist with the production of legal advice, such use should always be under the direction and supervision of a qualified lawyer.
  • Where the context requires, consider keeping a record of AI use to demonstrate the degree of supervision and direction exercised in the production of legal advice.
  • Ensure non-lawyer employees are aware that any legal advice sought or given from an AI System will not be privileged absent the involvement of a lawyer or a substantive change in the law.
  • Where possible, create, disseminate and supervise compliance with consistent internal policies on the use of AI Systems.



1Adapted from the definition given in Courts and Tribunals Judiciary, Artificial Intelligence (AI): Guidance for Judicial Office Holders, dated 14 April 2025, accessed 16 July 2025.
2Adapted from the definition given by the Alan Turing Institute, Data Science and AI Glossary, accessed 17 July 2025. Emphasis added.
