
RESOURCES

Let's stop using the term ChatGPT "hallucinations"

Using this term is harmful. Two harms and an alternative.


➤ When ChatGPT spits out false information, many people call it "hallucinations".


➤ Wikipedia defines AI hallucinations as:


👉"a confident response by an AI that does not seem to be justified by its training data."


👉"For example, a hallucinating chatbot with no knowledge of Tesla's revenue might internally pick a random number (such as "$13.6 billion") that the chatbot deems plausible, and then go on to falsely and repeatedly insist that Tesla's revenue is $13.6 billion, with no sign of internal awareness that the figure was a product of its own imagination."


➤ HARM 1


👎Did you notice how this definition attributes human qualities to the AI? For example, it says the AI has an imagination.


👎The use of humanizing terms can mislead people into thinking of the AI as a human-like agent.


💪To emphasize -- AIs are absolutely NOT human-like entities of the kind we've seen in sci-fi movies.


➤ HARM 2


👎Did you notice how this definition shifts the blame for the misinformation?


👎It's as if the false information is the AI's fault.


👎But the fault lies with the company that made the AI.


💪To emphasize -- companies that make technology are responsible for what their technology does.


➤ ALTERNATIVE


👍Let's use "misinformation" instead of "hallucinations". It's the term we use for other cases of false or inaccurate information.


👍The term "misinformation" comes with a host of connotations about how the bad information can harm society. It's important to keep these risks in mind in the context of chatbots.


👍We can add a modifier for the AI case. For example:

Training-failure misinformation

Coding-failure misinformation


👍Terms like these emphasize the societal harm and the responsibility of the people behind the technology.


➤ Join the discussion of this topic on the LinkedIn post.



