Never "trust" ChatGPT.
For the love of God and humanity, never TRUST AI like ChatGPT.
It is a great tool, but it does not actually understand your question! It just matches your question to similar questions it knows about (a simplification, but accurate in essence) and makes up an answer by combining answers it has seen.
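To make the "matching" idea concrete, here is a deliberately crude toy sketch (nothing like ChatGPT's actual internals, where the matching is statistical and learned): answer a question by finding the most similar known question and reusing its stored answer. The questions, answers, and similarity measure are all invented for illustration.

```python
# Toy illustration only -- NOT how ChatGPT works internally.
# Answer a question by reusing the answer to the most
# "similar-looking" known question.

known_qa = {
    "what is 2 + 2": "4",
    "what is the capital of France": "Paris",
}

def word_overlap(a: str, b: str) -> int:
    """Count the words two questions share."""
    return len(set(a.lower().split()) & set(b.lower().split()))

def toy_answer(question: str) -> str:
    # Pick the known question with the most shared words.
    # The result looks plausible; no understanding is involved.
    best = max(known_qa, key=lambda q: word_overlap(q, question))
    return known_qa[best]

print(toy_answer("what is the capital of Germany"))  # prints "Paris"
```

The toy confidently answers "Paris" for Germany's capital, because "what is the capital of Germany" shares more words with the France question than with anything else it knows. That is the flavor of failure described here: fluent, plausible, and wrong.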
Basically, it makes up answers. This is why it cannot even count the characters in a sentence (this was improved recently, but only through human intervention; perhaps I will write about that one day).
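The contrast is striking: counting characters is a one-line deterministic operation in ordinary code, while a model that predicts text token by token never "sees" individual characters and so cannot do it reliably. A minimal sketch (the sentence is just an example string):

```python
# Counting characters is trivial and exact in ordinary code --
# precisely the kind of task a token-predicting model gets wrong,
# because it operates on tokens, not individual characters.
sentence = "Never trust ChatGPT blindly."
print(len(sentence))  # prints 28
```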
It is extremely dangerous, not because it will take our jobs, but because it makes catastrophic mistakes and has no ability to self-correct.
Here is one example that illustrates why it cannot be trusted.
I was experimenting with the latest ChatGPT-4o (which is very impressive indeed!) and wanted to see what it knows about a math learning website I built, LearnMath123, a pet project of mine.
Surprisingly, it knows something about it, and it is pretty accurate:
But not 100% accurate. Why?
Fortunately, ChatGPT now cites the sources of its information. A very useful feature indeed, as it lets us see that it in fact used a totally unrelated website as its source. No human being would make this mistake!
Does this mean ChatGPT should not be used? No, but take it for what it is: just a research tool.
No doubt someone will point out that humans make other types of mistakes and are equally unreliable. Indeed, never trust a single person either. But human beings, especially multiple independent ones, do not suffer from this fatal flaw.
But don't take my word for it; take ChatGPT's own warning seriously. It is not just a legal disclaimer.
P.S. If you are interested in learning how different Artificial "Intelligence" is from human intelligence, please read this.
It is important to remember that it does not comprehend in any real sense. The incorrect or inaccurate response is presumably due to the lack of articles on this specific topic, but it fails to make even a reasonable guess, since it does not understand the question. Using Microsoft Copilot in this case: