Deduction in ChatGPT

Something fundamental to the intelligence of a system is the ability to make inferences of different types, such as deduction, abduction, and induction. Since ChatGPT is built on a generative Transformer-based model with multiple levels of abstraction, it seemed natural to me that it could generate the next step of a proof given all the previous steps as input text. So I tried a very simple deduction exercise: proving a formula in propositional logic (not asking about proving the incompleteness theorem :).

[Image: ChatGPT's attempted deduction in propositional logic]

It is interesting to note that ChatGPT didn't go for an axiomatic deduction but instead adopted proof by contradiction and proof by truth table. In the first, it rightly assumed the antecedent of the above formula and tried to assume the negation of the consequent, but steps 2 and 3 are wrong. Still, it's amazing that it got that far based on a language model alone.
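The truth-table method ChatGPT attempted is easy to mechanize: enumerate every truth assignment and check that the formula holds under all of them. Since the article's actual formula appears only in the image, the sketch below uses an illustrative tautology, ((p → q) ∧ p) → q, purely as an example of the technique:

```python
from itertools import product

def implies(a, b):
    """Material implication: a -> b."""
    return (not a) or b

# Illustrative formula (the article's actual formula is in the image above):
# ((p -> q) and p) -> q, a propositional tautology (modus ponens as a formula).
def formula(p, q):
    return implies(implies(p, q) and p, q)

# The "proof by truth table": exhaustively check all 4 truth assignments.
assert all(formula(p, q) for p, q in product([True, False], repeat=2))
print("tautology")
```

For n propositional variables this checks 2^n rows, which is exactly the kind of systematic, exhaustive step a language model has no built-in mechanism to guarantee.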

The very basic point I'm trying to make here is that true intelligence is still far away. Generating text that is evaluated subjectively is very different from performing a precise mathematical deduction. Let us take another simple example, this time in the probability space:

[Image: ChatGPT's attempt at a two-dice probability question]

Here the answer lies in a higher-level abstraction over two different pieces of knowledge, the second of which depends on the first. ChatGPT gets the combinations that sum to 8 with two dice wrong, and hence arrives at the wrong final answer.
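The dice computation ChatGPT fumbled is a five-line enumeration. With two fair dice there are 36 equally likely outcomes, and exactly five of them sum to 8: (2,6), (3,5), (4,4), (5,3), (6,2), giving probability 5/36:

```python
from itertools import product
from fractions import Fraction

# All 36 equally likely outcomes of rolling two fair six-sided dice.
outcomes = list(product(range(1, 7), repeat=2))

# The outcomes that sum to 8: (2,6), (3,5), (4,4), (5,3), (6,2).
favourable = [o for o in outcomes if sum(o) == 8]

print(len(favourable))                           # 5
print(Fraction(len(favourable), len(outcomes)))  # 5/36
```

Note the role of "two" in the problem: (2,6) and (6,2) are distinct outcomes, which is precisely the kind of bookkeeping the model lost track of.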

In conclusion, we ask this question: is it possible, in the future, for the training of a Transformer-based model to systematically incorporate some rules of deduction, such as the axioms of propositional and higher-order logic, in line with our natural form of reasoning? I'm sure such rules appear somewhere in the billions of text pieces used for training, but whether the model can reliably apply them is in question.
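To make concrete what "a rule of deduction" means here, below is a minimal, purely illustrative sketch of one such rule, modus ponens (from P and P → Q, conclude Q). The string/tuple encoding is an assumption for illustration only and says nothing about how such a rule might actually be wired into a Transformer:

```python
# Represent an implication P -> Q as the tuple (P, Q), and atomic
# propositions as plain strings. This encoding is hypothetical.
def modus_ponens(premise, implication):
    """From `premise` P and `implication` P -> Q, derive Q; else None."""
    antecedent, consequent = implication
    return consequent if premise == antecedent else None

derived = modus_ponens("it_rains", ("it_rains", "ground_wet"))
print(derived)  # ground_wet
```

A symbolic rule like this either fires or it doesn't; a generative model, by contrast, only ever produces a plausible next token, which is the gap the question above is pointing at.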
