An AI hallucination occurs when an artificial intelligence system, such as a generative AI (GenAI) model, produces output that is "made up": content that reads as fluent and plausible but is factually incorrect, unsupported by the training data, or entirely invented. Because hallucinated output is typically presented with the same confidence as accurate output, it becomes a problem when users of the model are not trained to identify and correct it.
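One simple way a user or application can screen for a common class of hallucination, fabricated citations, is to compare sources cited in the output against a list of sources known to exist. The sketch below is a hypothetical illustration: the function name, the regex for citation style, and the sample data are all assumptions for demonstration, not part of any real hallucination-detection API.

```python
import re

# Hypothetical reference list: sources we know actually exist.
KNOWN_SOURCES = {"Smith et al. 2020", "Jones 2019"}

def find_unsupported_citations(model_output: str, known_sources: set) -> list:
    """Return parenthetical citations in the output that are not in the known set.

    Assumes citations look like "(Author 2020)"; real checks would need
    more robust parsing and a real bibliographic database.
    """
    cited = re.findall(r"\(([^()]+\d{4})\)", model_output)
    return [c for c in cited if c not in known_sources]

answer = ("The effect was first reported (Smith et al. 2020) "
          "and later disputed (Brown 2023).")
flags = find_unsupported_citations(answer, KNOWN_SOURCES)
print(flags)  # ['Brown 2023'] -- a citation to verify manually before trusting
```

A check like this does not prove the flagged citation is hallucinated, it only surfaces claims the user should verify by hand, which is exactly the kind of review untrained users tend to skip.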