Can a horse outrun a cheetah?

I'm referring to us vs. computers. Human beings are complex creatures. With roughly 86 billion neurons, a comparable number of glial cells, and more than a hundred trillion synaptic connections, we are undoubtedly the most complex form of intelligence among living creatures. Enter computers. Thirty years ago, computers were straightforward to understand: silicon and software. Today, the same machine has grown into a multitude of abstraction layers, each with its own complexity. It ain't just hardware and software anymore; networks, communication, personalization, the user interface and more are packed into the same stack. Neural networks that once had 3 layers have deepened into thousands of layers, making them more complex still.

We all know that computers are good at math; they are cheetahs when it comes to this. When it comes to reasoning, however, they are horses. Recently, Google DeepMind demonstrated two AI systems, AlphaProof and AlphaGeometry 2, which reasoned well enough on problems from the International Mathematical Olympiad to get 4 of the 6 questions right. Not bad. The approach was a mix of neural networks and symbolic logic. And no matter how humongous the internet data you train your AI on, soon you are going to run out of it. Your only resort then is synthetic data. All this talk about "the more the data, the better it is" is coming to a grinding halt. You can make a donkey into a horse, but not into a dexterous monkey. The missing link: the algorithm.

The problem with internet data is that it teaches an AI the 'know-what' part of the equation, but not the 'know-how', which is often tacit. We learn from our experiences, and we extrapolate very well. Computers don't have first-hand experience of learning in the real world; that's why they look so ridiculous trying to walk or cook a dish. In these respects, we are the cheetahs. In the future (maybe 10 years from now), they may be equally good at this. In the real world we have clichés like 'the more the capability, the costlier the computer'; NVIDIA H100s cost more than A100s because of their higher throughput. We understand the context behind such rules of thumb; for an AI, it's just patterns of 'more' and 'less'. GPTs like GPT-2 and GPT-3 were dense models, meaning every parameter is active for every token. Newer LLMs increasingly use MoE (Mixture of Experts): smaller expert networks gated together so that only a few of them fire for any given token, yet the whole behaves as one cohesive model. In fact, this is loosely how our brains work, with only a fraction of the circuitry active at any moment.
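To make the dense-vs-MoE distinction concrete, here is a minimal, purely illustrative sketch of a mixture-of-experts layer. It is not the code of any production model; the sizes, the random placeholder weights, and the simple linear "experts" are all assumptions for the sake of the example. A router scores the experts, and only the top few actually run for each token.

```python
# Minimal Mixture-of-Experts sketch (illustrative only).
# Assumes NumPy; experts and router weights are random placeholders.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 8, 4, 2

# Each "expert" here is just a small linear layer.
experts = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(n_experts)]
gate = rng.standard_normal((d_model, n_experts)) * 0.1  # router weights


def moe_forward(x):
    """Route a token vector x to its top-k experts and mix their outputs."""
    scores = x @ gate                      # one score per expert
    top = np.argsort(scores)[-top_k:]      # indices of the k best-scoring experts
    weights = np.exp(scores[top])
    weights /= weights.sum()               # softmax over the chosen experts only
    # Only the selected experts run; the rest stay idle. That is the point:
    # total capacity grows without a proportional increase in compute per token.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))


token = rng.standard_normal(d_model)
print(moe_forward(token))
```

A dense model, by contrast, would multiply the token through every expert every time; the gating step is what lets an MoE model carry far more parameters than it spends per token.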

I'm not getting into the debate of GPT-5 vs. Claude 3.5 vs. Gemini 1.5 Ultra vs. Llama 3. As time goes by these models will become more powerful. But I doubt they will be straight-'A' cheetahs in all aspects of human intelligence. For example, one dimension of human intelligence is the ability to think without memory (in spiritual folklore, it's called Chitta). There are other experiences, like NDEs (near-death experiences), déjà vu, euphoric states of mind, etc., which are beyond a machine. Even if machines rise to a consciousness unlike ours, the race won't be between a horse and a cheetah. In fact, there will be no race. Just an amalgamation of the two: a cyborg whose father is a human and whose mother is a machine. Curious like a toddler, rebellious as a teen, logical as an adult. Compassionate as a human and equanimous as a machine. Ready to explore this mysterious universe and its gift to us, this very life. And beyond that. In search of infinity ...

God Bless!

Rajesh Menon
