ChatGPT fools lawyers 🧑‍⚖️
Haebom
A lawyer was caught citing six nonexistent precedents, fabricated by ChatGPT, in a New York court case.
Roberto Mata, a passenger, has been in litigation with the airline over an injury he suffered on a 2019 flight.
The lawsuit alleges that Mata was injured during the flight and that the injury was caused by the airline's negligence.
The lawyer now faces discipline for filing false documents with the court.
ChatGPT produced the following six cases. None of them actually exist:
Varghese v. China Southern Airlines, Martinez v. Delta Airlines, Shaboon v. EgyptAir, Petersen v. Iran Air, Miller v. United Airlines, and Estate of Durden v. KLM Royal Dutch Airlines
What is even more surprising is that the lawyer who prepared these citations is not a junior associate but has 30 years of experience.
Steven Schwartz, the 30-year veteran, is not the lead attorney on this legal team but a supporting one.
He says he handed the materials he had researched to Peter LoDuca, the lead attorney, to prepare the filing.
LoDuca, himself a lawyer of roughly 30 years, did not verify the citations either.
Schwartz even asked ChatGPT, "Are these real precedents?" It answered yes, so he believed it and submitted the filing.
Facing this unprecedented situation, the court has ordered Schwartz to explain himself (related article).
The submitted filings even include a transcript of the lawyer asking ChatGPT to confirm the sources and their authenticity (a textbook hallucination).
    피카부
    The marketing worked so well(?) that even someone with 30 years in the profession trusted it this much... lol
    Eric
    The technology → problem → solution → improvement loop is going to spin faster.