WA lawyer referred to regulator after preparing documents with AI-generated citations for nonexistent cases
Judge warns of "inherent dangers" of lawyers relying solely on AI amid growing number of fake citations or other errors due to the technology
A lawyer has been referred to Western Australia's legal regulator after using artificial intelligence to prepare court documents for an immigration case. The documents contained AI-generated citations for cases that did not exist.
It is one of more than 20 cases in Australia so far in which AI use has produced fake citations or other errors in court submissions, prompting warnings from judges across the country about relying on the technology in the legal profession.
In a federal court judgment published this week, the anonymised lawyer was referred to the Legal Practice Board of Western Australia for consideration and ordered to pay the federal government's costs of $8,371.30, after the immigration minister's representative found that submissions in the case included four citations to cases that did not exist.
Justice Arran Gerrard said the incident "demonstrates the inherent dangers associated with practitioners solely relying on the use of artificial intelligence in the preparation of court documents and the way in which that interacts with a practitioner's duty to the court".
The lawyer told the court in an affidavit that he had relied on Anthropic's Claude AI "as a research tool to identify potentially relevant authorities and to improve my legal arguments and position", and then used Microsoft Copilot to validate the submissions.
The lawyer said he had "developed an overconfidence in relying on AI tools and failed to adequately verify the generated results".
"I had an incorrect assumption that content generated by AI tools would be inherently reliable, which led me to neglect independently verifying all citations through established legal databases," the lawyer said in the affidavit.
The lawyer unreservedly apologised to the court and the minister's solicitors for the errors.
Gerrard said the court "does not adopt a luddite approach" to the use of generative AI, and understood why the complexity of migration law might make using an AI tool attractive. But he warned there was now a "concerning number" of cases in which AI had led to the citation of fictitious cases.
Gerrard said it risked allowing "a good case to be undermined by rank incompetence", and that the prevalence of such cases "significantly wastes the time and resources of opposing parties and the court". He said it also risked damaging the legal profession.
Gerrard said the lawyer did "not fully comprehend what was required of him": it was not sufficient merely to check that the cases cited were not fake; he also had to review those cases thoroughly.
"Legal principles are not simply slogans which can be affixed to submissions without context or analysis."
There have been at least 20 cases of AI hallucinations reported in Australian courts since generative AI tools exploded in popularity in 2023.
Last week, a Victorian supreme court judge criticised lawyers acting for a boy accused of murder for filing misleading information with the courts after failing to check documents created using AI.
The documents included references to nonexistent case citations and inaccurate quotes from a parliamentary speech.
There have also been similar cases in New South Wales and Victoria in the past year, in which lawyers were referred to the regulatory bodies in their respective states.
However, the spate of cases is not limited to qualified lawyers. In an NSW supreme court decision this month, a self-represented litigant in a trusts case admitted to the chief justice, Andrew Bell, that she had used AI to prepare her speech for the appeal hearing.
Bell said in his judgment that he was not criticising the person, who he said was doing her best to represent herself. But he said problems with using AI in preparing submissions were exacerbated when the technology was used by unrepresented litigants "who are not subject to the professional and ethical responsibilities of legal practitioners".
He said the use of generative AI tools "may introduce added costs and complexity" to proceedings and "add to the burden of other parties and the court in responding to it".
"Notwithstanding the fact that generative AI may contribute to improved access to justice, which is itself an obviously laudable goal, the present case illustrates the need for judicial vigilance in its use, especially, but not only, by unrepresented litigants."
The Law Council of Australia's president, Juliana Warner, said sophisticated AI tools offered unique opportunities to support the legal profession in administrative tasks, but reliance on AI tools did not diminish the professional judgment a legal practitioner was expected to bring to a client's matter.
"Where these tools are utilised by lawyers, this must be done with extreme care," she said. "Lawyers must always keep front of mind their professional and ethical obligations to the court and to their clients."
Warner said courts were regarding cases where AI had generated fake citations as a "serious concern", but added that given the widespread use of generative AI, a broadly framed prohibition on its use in legal proceedings would be "neither practical nor proportionate, and risks hindering innovation and access to justice".