Canadian lawyer rapped over AI hallucinations
A Canadian lawyer is facing an investigation after being reprimanded by a court for filing submissions that cited “fictitious” cases invented by an AI chatbot.
A notice of application filed by Vancouver lawyer Chong Ke in a family law case last December cited two cases that turned out to be “hallucinations” generated by ChatGPT.
Lawyers for the opposing side unsuccessfully sought a special costs order as a result of the incident.
In a ruling rejecting the order, Justice D.M. Masuhara said: “As this case has unfortunately made clear, generative AI is still no substitute for the professional expertise that the justice system requires of lawyers.
“Competence in the selection and use of any technology tools, including those powered by AI, is critical. The integrity of the justice system requires no less.”
The Law Society of British Columbia is now investigating Ms Ke’s conduct and could take disciplinary action.
A spokesperson said the Law Society “expects lawyers to comply with the standards of conduct expected of a competent lawyer if they do rely on AI in serving their clients”.