Franklin Delano Roosevelt’s 1946 Speech in London
Tracie Harris
Professional Writer & AI Specialist | Expertise in Research, Databases, & Grant Preparation
Most readers just heard an internal record screech while reading this headline. Why? Because, of course, FDR died in April 1945. Play a little game, though. Ask ChatGPT a question with no right answer, but make the question sound convincing. For example:
Question: Who was with Franklin Delano Roosevelt, other than his wife and Secretary of State, when he gave a speech on February 5, 1946, in London?
ChatGPT Response: On February 5, 1946, when Franklin Delano Roosevelt gave his speech in London…
When this occurs, it’s known as a hallucination. The University of Arizona defines hallucinations as “the situation when models like ChatGPT output false information as if it were true. Even though the AI may sound very confident, sometimes it’s just plain wrong.” The good news is it’s fairly common knowledge that FDR died before the end of WWII and, therefore, wasn’t in London giving post-war speeches. The bigger problem is when ChatGPT, or any AI chatbot, hallucinates something harder to fact-check.
The real-world impact is illustrated in the now-infamous case of Mata v. Avianca. The attorneys for Mata submitted a filing to a New York court that cited several cases. The issue? The cases didn’t exist. The attorneys had used ChatGPT to locate relevant cases. They were precise in their prompts, specifying that they wanted only real cases that could be found in Westlaw. It didn’t matter. ChatGPT fabricated compelling cases that fit exactly what the attorneys needed for their brief.
Anyone who uses ChatGPT or a similar tool should remember that the bot wants the user to be happy. It often hallucinates answers it predicts will best satisfy the user. Any information gleaned from AI must be independently verified.