
Canadian Lawyer Faces Backlash for Submitting AI-Generated Fake Cases

Chong Ke, from Vancouver, under investigation after allegedly using ChatGPT to cite case law – but …

A Canadian lawyer is facing criticism after using an artificial intelligence chatbot for legal research that produced “fictitious” cases. The incident highlights the risks of relying on untested technology in legal practice.

The lawyer in question, Chong Ke of Vancouver, is now under investigation for allegedly using ChatGPT, an AI tool developed by OpenAI, to help prepare legal arguments in a child custody case before the British Columbia Supreme Court. Ke, representing a father in a dispute with the children’s mother over an overseas trip with their children, reportedly asked ChatGPT for relevant case law. The chatbot returned three results, two of which she included in her legal submissions.

However, despite diligent efforts, the opposing counsel representing the children’s mother could not verify that the cases Ke cited actually existed.

When questioned about the discrepancies, Ke admitted her oversight, saying she had been unaware that the cases supplied by the chatbot were inaccurate. She expressed regret for the mistake and said she had no intention of misleading the court or opposing counsel.

Although AI chatbots like ChatGPT are widely used, they are prone to errors, including generating inaccurate or non-existent information, a phenomenon known as “hallucination”.

The lawyers representing the mother called Ke’s conduct “reprehensible” and said it caused unnecessary delay and expense in verifying whether the cited cases were genuine. They sought special costs as a reprimand, but the presiding judge declined, emphasizing that such punitive measures require clear evidence of misconduct.

Justice David Masuhara underscored the seriousness of presenting fictitious cases in legal proceedings, stressing that it undermines the integrity of the judicial process and can lead to miscarriages of justice. While acknowledging the harm caused by Ke’s actions, he noted her efforts to rectify the situation and accepted her apology as sincere.

Despite the court’s decision not to impose special costs, the Law Society of British Columbia has initiated an investigation into Ke’s conduct. The society emphasizes the importance of adhering to ethical standards when utilizing AI tools in legal practice and expects lawyers to uphold the principles of professional conduct, even when leveraging technology for client services.

Last modified: March 1, 2024