Matters within the legal sector tend to stay among those directly involved, keeping potentially sensitive information out of public view. Since the advent of artificial intelligence, however, a number of lawyers have turned to the technology, treating it as a useful tool to support their arguments. Unfortunately, AI is far less reliable than one might imagine.
Damien Charlotin, a researcher specialising in law, data science and artificial intelligence, has built and published a database of hallucinations generated by artificial intelligence in the legal field. The crowdsourced document records 410 cases from around the world, 269 of them from the United States.
That database later served as the starting point for an investigation by 404 Media, the news outlet behind today's article, into the "justifications" offered by lawyers caught red-handed. The most common excuses cite computer problems, personal and family emergencies, simple negligence and, above all, blame shifted onto their assistants.