When facing a court case without the means to hire a lawyer, many individuals turn to free sources of assistance, including generative artificial intelligence (AI). This trend has recently garnered attention, particularly in Australia, where judges such as My Anh Tran of the County Court of Victoria have cautioned against the dangers of relying on AI for legal representation. As she aptly noted, “Generative AI can be beguiling, particularly when the task of representing yourself seems overwhelming.” The risks of using AI for legal matters, however, can far outweigh its perceived benefits.
The Rise of Self-Representation
Self-representation in Australian courts has become increasingly common, often due to the rising costs of legal services. For instance, the Federal Circuit Court reported that a staggering 79% of litigants in migration cases were unrepresented in the 2023–2024 period. This trend underscores the urgent need for affordable legal solutions, as many self-represented litigants possess valid claims but lack the resources to navigate the legal system effectively.
Generative AI and Legal Challenges
Since the launch of ChatGPT in late 2022, our research has uncovered 84 cases of generative AI usage in Australian courts, 66 of which involved self-represented litigants. These individuals are using AI tools for a range of legal issues, including property disputes, employment problems, and migration cases. While the allure of AI is understandable, the consequences of its misuse can be severe.
The Risks of Relying on AI
One of the primary dangers associated with using generative AI for legal tasks is the generation of inaccurate or misleading information. Such inaccuracies can lead to the rejection of court documents and, ultimately, the loss of valid claims. Courts are obliged to dismiss evidence that lacks credibility, which could result in a litigant losing their opportunity to present their case. Additionally, self-represented litigants risk incurring costs orders against them, which means they could be liable for their opponent’s legal fees if their use of AI goes awry.
Legal Standards and Judicial Opinions
Judicial opinions on the use of AI in self-representation are mixed. In a recent decision in New South Wales, for example, Chief Justice Andrew Bell acknowledged a self-represented litigant’s effort in using AI to assist her defence. However, he noted that the AI-generated submissions were “misconceived, unhelpful, and irrelevant,” illustrating the potential pitfalls of this approach.
Reducing Risks When Using AI
To minimise risks, self-represented litigants are advised to avoid using generative AI for legal research altogether. Several reputable resources are available for Australian law, including the Australasian Legal Information Institute (AustLII) and Jade. Public court libraries and law schools also provide extensive online materials and textbooks that guide individuals through the legal research process.
Guidelines from Australian Courts
In response to the growing reliance on AI, several Australian courts have issued guidance on the appropriate use of generative AI. The Supreme Courts of Queensland, New South Wales, and Victoria, for instance, have published recommendations that emphasise the need for accuracy in legal documentation. Always consult these guides before using an AI tool to understand its boundaries and requirements.
Verifying AI-Generated Information
If you do use generative AI, it is crucial to independently verify any information it produces. This means searching for each case you plan to reference, confirming that it exists, and checking that its context actually supports your intended argument. Relying on AI outputs without cross-checking can have dire consequences.
Privacy Concerns with AI
Further caution is necessary regarding privacy. Queensland’s guidelines for self-represented litigants explicitly state that any private, confidential, or legally privileged information should never be entered into AI systems. This is vital to avoid unintentionally violating suppression orders or revealing sensitive personal information.
The Core of Legal Expertise
Conducting effective legal research and preparing court documents is inherently complex and best performed by trained lawyers. Legal practitioners possess the knowledge and experience necessary to navigate the intricacies of the law, highlighting the need for accessible and affordable legal services within the justice system. While AI may seem like a convenient solution to address the access-to-justice gap, it is ill-suited for these vital tasks—at least for now.
Special thanks to Selena Shannon from UNSW’s Centre for the Future of the Legal Profession for her contributions to this article.