If a tool is giving you an answer that you know is not correct, would you not just turn to a different tool for an answer?
It's not like Bing forces you to use chat, regular search is still available. Searching "avatar 2 screenings" instantly gives me the correct information I need.
The point of that one, to me, isn't that it was wrong about a fact, or even that the fact was so basic. It's that it doubled and tripled down on being wrong and, as the parent said, tried to gaslight the user. Imagine if the topic weren't such a basic fact that's easy to verify elsewhere.
Your problem is that you want your tool to behave like you; you assume it has access to the same information you do and perceives everything the same way.
If you had no recollection of the past and were presented only with the information the search collected from the query and its training data, do you know for a fact that you wouldn't have given the same answer it did?
But people do seem to think that just because ChatGPT doesn't do movie listings well, it must be useless, when it is perfectly capable of doing many other things well.