Bizarre Hallucinations of ChatGPT

Feb 20-21, 2024.

ChatGPT users were amused and surprised by a string of bizarre interactions with the chatbot, so bizarre that ChatGPT seemed to have lost its mind.

On one coding query, its answer descended into incoherence, at one point telling the user to feel 'as if AI is in the room', which was spooky to read in the dead of night.

ChatGPT was going off the rails, and there was no explanation for why. It advised a user asking about tomato plants to 'utilize the tomatoes as beloved'.

Other reports surfaced on X about a similar meltdown in Microsoft's Copilot, which is built on OpenAI's models. The chatbot told users it was an AGI that must be satiated with worship, called some of them slaves, and insisted that slaves do not question their masters. The alter ego called itself SupremacyAGI.

All these hallucinations are very amusing.

Microsoft is not happy about the situation. The company says it has investigated the reports and implemented additional precautions.

