UK court warns of AI misuse in legal practice

The High Court of England and Wales has issued a judgment raising serious concerns about the misuse of generative artificial intelligence in legal practice. The ruling was prompted by two cases heard under the court’s Hamid jurisdiction: Ayinde v London Borough of Haringey and Al-Haroun v Qatar National Bank, Kazinform News Agency correspondent reports.


In both cases, the court found that lawyers or their clients had submitted documents containing fictitious legal authorities, apparently generated by AI tools such as ChatGPT.

False citations and non-existent cases

In the Ayinde case, barrister Sarah Forey appeared to have cited fabricated legal precedents in documents relating to a dispute over temporary housing. The court determined that none of the five referenced cases actually existed, and noted that the phrasing used had a formulaic tone characteristic of machine-generated text.

In the parallel case of Al-Haroun v Qatar National Bank, solicitor Abid Hussain submitted court documents referencing 45 legal authorities, 18 of which were found to be entirely fictitious, while many of the remaining citations were either misquoted or taken out of context.

According to the court, the citations had been provided by the claimant himself, who had used publicly available AI tools. The court nevertheless stressed that Hussain was personally responsible for verifying each citation and had failed in his basic professional duties by relying on them unchecked.

AI is no excuse for neglecting responsibility

The court, presided over by the President of the King’s Bench Division, emphasized that while AI can serve as a useful tool in legal work, its output must be subjected to careful review and used within the framework of professional standards.

Special attention was given to the problem of “hallucinations”: a phenomenon in which AI generates plausible-sounding but entirely false content, including invented case names and quotations. The court noted that similar incidents have already been reported in the UK, the US, and other jurisdictions.

Consequences

In both cases, the court imposed sanctions. Sarah Forey and the Haringey Law Centre were each ordered to pay £2,000, and both matters were referred to their respective professional regulators, the Bar Standards Board and the Solicitors Regulation Authority.

In addition, the court called on legal institutions, including the Bar Council, the Law Society, and the Council of the Inns of Court, to urgently review their guidance and oversight mechanisms for the use of AI in legal practice.

Earlier, Kazinform News Agency reported that ChatGPT may sacrifice user safety for self-preservation.