The subject line sounds like the beginning of a joke. Unfortunately for the lawyers involved, it wasn't. See the news item below; there will be more news about this case later today.

For anyone who may be interested in ChatGPT and related systems, you can check the slides and the video of the talks that my colleague Arun Majumdar and I gave on May 31. For my slides, see EvaluatingGPT--JohnSowa_20230531.pdf (ontologforum.s3.amazonaws.com)

For the video recording of both talks and a long Q/A discussion, see https://ontologforum.s3.amazonaws.com/General/EvaluatingGPT--JohnSowa-ArunMajumdar_20230531.mp4

John
___________________________

New York lawyers blame ChatGPT for tricking them into citing ‘bogus legal research’

Excerpts:

Attorneys Steven A. Schwartz and Peter LoDuca are facing possible punishment over a filing in a lawsuit against an airline that included references to past court cases that Schwartz thought were real, but were actually invented by the artificial intelligence-powered chatbot.

Schwartz explained that he used the groundbreaking program as he hunted for legal precedents supporting a client's case against the Colombian airline Avianca for an injury incurred on a 2019 flight. The chatbot, which has fascinated the world with its production of essay-like answers to prompts from users, suggested several cases involving aviation mishaps that Schwartz hadn't been able to find through usual methods used at his law firm.

The problem was, several of those cases weren't real or involved airlines that didn't exist. Schwartz told Judge P. Kevin Castel he was “operating under a misconception ... that this website was obtaining these cases from some source I did not have access to.”

He said he “failed miserably” at doing follow-up research to ensure the citations were correct. “I did not comprehend that ChatGPT could fabricate cases,” Schwartz said.

The judge confronted Schwartz with one legal case invented by the computer program. It was initially described as a wrongful-death case brought by a woman against an airline, only to morph into a legal claim about a man who missed a flight to New York and was forced to incur additional expenses. “Can we agree that's legal gibberish?” Castel asked.

The judge said he'll rule on sanctions at a later date.

Source: https://www.nbcbayarea.com/news/national-international/new-york-lawyers-blame-chatgpt-for-tricking-them-into-citing-bogus-legal-research/3248139/