Is It Safe To Use ChatGPT To Write Legal Documents?
Table of Contents
- What are ChatGPT hallucinations?
- Is it wrong to use AI to create legal documents?
- Is there any guidance available for using AI to create legal documents?
- Wrapping up
There has been another recent report of a lawyer getting into strife after using ChatGPT to create a legal document. Last month, an Australian solicitor filed documents in an immigration case that contained citations to cases which were “non-existent”, according to the judge. The lawyer has been referred to the Office of the NSW Legal Services Commissioner (OLSC). He admitted to using ChatGPT to prepare a summary of the cases. The AI chatbot “hallucinated” entirely made-up cases and quotes, which the lawyer incorporated into his submissions without checking the details.
“He accessed the site known as ChatGPT, inserted some words, and the site prepared a summary of cases for him,” the judgment reads. “He said the summary read well, so he incorporated the authorities and references into his submissions without checking the details.”
In February 2024, a Vancouver lawyer was told she had to personally compensate her client’s ex-wife’s lawyers for the time it took them to discover the cases she planned to use were ChatGPT hallucinations. In the British Columbia Supreme Court, Justice David Masuhara wrote:
“As this case has unfortunately made clear, generative AI is still no substitute for the professional expertise that the justice system requires of lawyers. Competence in the selection and use of any technology tools, including those powered by AI, is critical.”
What are ChatGPT hallucinations?
Hallucinations occur when Large Language Models (LLMs) generate “content that is nonsensical or unfaithful to the provided source content”. Although LLMs are trained on vast amounts of data, they can find it challenging to interpret and use natural human language. And remember, LLMs have no idea that they are providing the wrong answer. They are not ‘intelligent’ in a way that allows them to have independent thought. They simply predict the word most likely to follow the one they have just written.
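To see why next-word prediction has no built-in notion of truth, here is a deliberately tiny sketch (a bigram word-counting model, nothing like a real LLM in scale) with an invented three-sentence training corpus. The point it illustrates is that the model continues a sentence with whatever followed most often in its training data, whether or not the result is factually accurate.

```python
# Toy illustration only: a bigram model that predicts the next word purely
# from frequency counts. The "corpus" below is invented for this example.
from collections import Counter, defaultdict

corpus = [
    "the court held that the claim failed",
    "the court held that the claim failed",
    "the court held that the appeal succeeded",
]

# Count which word follows each word across the training sentences.
next_word_counts = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for current, following in zip(words, words[1:]):
        next_word_counts[current][following] += 1

def predict_next(word):
    """Return the statistically most common follower of `word`, or None.

    There is no truth-checking here: the prediction reflects frequency in
    the training data, not whether the continuation is correct.
    """
    followers = next_word_counts.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

print(predict_next("court"))  # "held" - every training sentence agrees
print(predict_next("claim"))  # "failed" - the more frequent outcome wins
```

A real LLM works over billions of parameters rather than raw word counts, but the underlying objective is the same: produce a plausible continuation, not a verified one. That is why its output can read fluently while citing cases that do not exist.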
Is it wrong to use AI to create legal documents?
There is no regulation forbidding legal professionals from using AI in their work. In fact, in March 2024, in a speech to the Manchester Law Society, Sir Geoffrey Vos, Master of the Rolls and Head of Civil Justice, spoke encouragingly about its use, commenting:
“LLMs are very good at suggesting draft contracts. I have been truly amazed at how quickly they can be produced. Of course, they need checking and amending, but that process takes a fraction of the time it would take a lawyer to draft a contract from scratch.
There is no doubt that AI can save a vast amount of time in creating legal contracts for employment, company takeovers or sale and purchase agreements, just as examples. It will take time to get used to, and, as I have said, requires careful checking.”
Regarding the use of AI to create court documents, Sir Geoffrey admitted that the risk of hallucinations is great when using public LLMs but:
“…it is very likely that specialist large language models trained on specialist legal data will be more accurate for lawyers. Such things already exist. One is called Spellbook. But they are not yet commonplace. There is, however, no reason why an AI could not be trained only on, for example, the 6,000 pages of the CPR or on the National Archives case law database, BAILII, Westlaw, or Lexis Nexis, but unable to scrape the bulk of the internet. Such a tool would be likely to give answers that would be more accurate and useful than a public LLM.”
Is there any guidance available for using AI to create legal documents?
The Bar Council has created helpful guidance on the use of AI in providing legal services, including how to avoid breaching confidentiality or legal privilege, infringing IP, and presenting incorrect information.
Sam Townend KC, chair of the Bar Council, told Legal Cheek:
“The growth of AI tools in the legal sector is inevitable and, as the guidance explains, the best-placed barristers will be those who make the efforts to understand these systems so that they can be used with control and integrity. Any use of AI must be done carefully to safeguard client confidentiality and maintain trust and confidence, privacy, and compliance with applicable laws.”
In late 2023, the SRA published a risk report on the use of AI by solicitors, highlighting the benefits and risks of the technology.
Wrapping up
AI tools, including LLMs, can be of enormous benefit to consultant solicitors who are often short on time. But if you do choose to use them, it is crucial to check what they produce line by line. Thankfully, at this stage at least, we can safely say that AI cannot replace the knowledge, experience, and empathy of a human barrister or solicitor.
If you’re thinking of launching your own legal practice in 2025, please give us a call. We would love to hear about your commercial ambitions and tell you how we can support them.
Schedule a call or fill in our contact form.
We look forward to celebrating your success.