Just read an article about AI (Artificial Intelligence) and this line caught my eye:
"These systems can generate untruthful, biased and otherwise toxic information. Systems like GPT-4 get facts wrong and make up information, a phenomenon called 'hallucination.'"

That sounds a lot like some people I've met, right down to the part about "hallucination."
jtk