There is No Such Thing as Artificial Intelligence
Intelligence is defined as “the ability to learn, understand, and make judgments or have opinions that are based on reason” [1]. We normally attribute this quality to humans (and sometimes to animals). Instinctively, therefore, when this ability is exhibited by a computer, we call it ‘artificial’ and define it as an intelligence that is “designed by humans and demonstrated by machines” [2]. One may believe that referring to a computer’s intelligence as Artificial Intelligence (AI) places humans in a position of control, where they provide guidance, education, and discipline. However, once an algorithm independently evolves and acquires knowledge, skills, and capabilities, is it still under the auspices of humans? The answer is no; the machine forms its own intelligence and is no longer under the control of the human race. The algorithm capable of forming the intelligence may be man-made, but the intelligence this algorithm develops is not!
Therefore, instead of describing computer intelligence as artificial, we should describe it as Digital Intelligence (DI)*, as it processes binary data, expressed as a series of 0’s and 1’s, to reach conclusions and acquire skills. Both the human brain and computer algorithms assess present information, compare it with past experience, and make actionable decisions based on future predictions. However, core differences exist between human intelligence and Digital Intelligence, including the pace of data transfer and processing, as well as memory storage capacity. Assuming that human intelligence is formed and controlled by the brain and nervous system, its processing rate, learning, and memory capabilities are limited by the pace of neuronal communication, the number of neurons in the nervous system, and the number of synapses they can form. These factors are constrained by the organs forming the brain and nervous system, which were shaped over millions of years of evolution and natural selection. The limitations of computers, on the other hand, derive from technological advancement, which progresses at a rapid pace. To appreciate this pace, consider how much progress has been made in less than 70 years since the launch of the compact electronic calculator in 1957 [3], or in the 85 years since the first digital computer prototype was introduced in 1939 [4]. In this relatively short time, we have progressed to using smartphones and laptops as indispensable commodities for day-to-day living. Indeed, technological advancement occurs much faster than biological change, and this underlies the ability of computers to develop intelligence much more quickly than humans.
There are several reasons why computers can develop intelligence and acquire knowledge and skills more quickly than humans:
1) Data propagation rate: Neural networks, which are formed in our brains, underlie our intelligence and make us who we are, with neuroimaging studies generally supporting a frontoparietal network relevant for intelligence [5]. Artificial neural networks are computational processes that imitate the human brain by processing multiple data types and creating patterns for use in decision-making; convolutional neural networks, for example, are deep learning architectures with several hidden layers used to analyze data [6]. However, while computers process data through electronic transfer, information processing and transmission in the nervous system are mediated by the exchange of positive and negative ions through ion channels. In contrast to computer hardware, where the rate of electronic transfer can be controlled and improved through technological advances, the brain mechanisms that propagate data rarely change, and therefore the rate of data processing stays the same.
2) Ability to adapt: The brain, as a biological organ, needs many generations of genetic change to adapt and enhance its capacity to acquire intelligence. This is a major limitation of the human brain. In contrast, computer hardware can adapt, improve, and increase its capacity at the pace of new technological advancements.
3) Learning and memory capabilities: While the brain needs biochemical changes [7] and sleep for new learning and for memory stabilization and integration, particularly for encoding [8], computer algorithms can learn continuously, 24 hours a day, 7 days a week, and can store the data on internal or external drives.
4) Knowledge transfer: The brain, as a biological organ, loses all of its learning at the end of life, and for another brain to acquire the same knowledge, it must learn and train its neural networks from scratch. This, of course, is not the case with computer algorithms: once a new computational model has been generated, it can easily be transferred to other computers or machines (a minimal sketch illustrating this follows this list).
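To make points 1 and 4 a little more concrete, here is a minimal, purely illustrative sketch in Python with NumPy. It is a hypothetical toy example, not Taliaz’s model or any production system, and the file name xor_model.npz is invented for the illustration. A tiny neural network with one hidden layer learns the XOR function by gradient descent; its learned parameters are then written to disk and loaded into a second, freshly created network, which reproduces the first network’s behavior immediately, with no retraining.

```python
import numpy as np

# Toy illustration only: a tiny 2-4-1 neural network that learns XOR,
# then "transfers its knowledge" by copying its weight arrays to a new network.

rng = np.random.default_rng(0)

# Training data: the XOR truth table
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialised parameters: 2 inputs -> 4 hidden units -> 1 output
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Train with plain gradient descent on a mean-squared-error loss
lr = 0.5
for _ in range(20000):
    # Forward pass through the hidden and output layers
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the loss with respect to each parameter
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

# Predictions of the trained network
h = sigmoid(X @ W1 + b1)
trained_out = sigmoid(h @ W2 + b2)

# "Knowledge transfer": save the learned parameters to a file...
np.savez("xor_model.npz", W1=W1, b1=b1, W2=W2, b2=b2)  # hypothetical file name

# ...and load them into a second, freshly created network
params = np.load("xor_model.npz")
h2 = sigmoid(X @ params["W1"] + params["b1"])
transferred_out = sigmoid(h2 @ params["W2"] + params["b2"])

# The copy behaves exactly like the original, with no retraining at all,
# unlike a human brain, whose learning cannot simply be copied.
print(np.allclose(trained_out, transferred_out))  # True
print(trained_out.round().ravel())  # close to [0, 1, 1, 0] once training has converged
```

The point of the sketch is simply that the “knowledge” of a digital model lives entirely in copyable numbers, so duplicating it costs essentially nothing compared with the training itself, whereas a human brain cannot hand its trained networks to another brain.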
Since, at this point in history, computers already form intelligence of their own at a much faster rate than humans, the time to ask moral questions is now. What limits Digital Intelligence? What stops a better, faster, more intelligent computer from making momentous, and even devastating, decisions for humanity? Rules and values should be implemented to keep fast-developing Digital Intelligence under the control of the human race, and to ensure its use for the advancement of civilization rather than for its destruction. The good news is that this is the topic of a growing discussion, led by prominent figures such as SpaceX founder and Tesla CEO Elon Musk. As part of this discussion, we need to keep in mind that in addition to implementing new rules and guidelines, we also need to better implement and adjust existing rules and guidelines that the human race has learned over centuries. In my next article, I will expand on how we can draw lessons from the history of human civilization to guide this implementation, and how humanity can direct Digital Intelligence towards the advancement, rather than the destruction, of civilization.
In conclusion, there is no such thing as Artificial Intelligence, but rather a Digital Intelligence which differs from human intelligence by the mechanisms underlying its data propagation, processing, and storage. Accordingly, updated rules and guidelines are needed to address this technology. Therefore, going forward, we at Taliaz will change our terminology from Artificial Intelligence to Digital Intelligence, and we encourage others to do the same.
* The term Digital Intelligence was proposed by Roy Schurr, my friend, colleague, and Taliaz’s VP Operations. I thank him for that.
References
[1] Cambridge Dictionary. Intelligence. 2024. https://dictionary.cambridge.org/dictionary/english/intelligence?q=INTELLIGENCE (accessed May 19, 2024).
[2] Tai MC-T. The impact of artificial intelligence on human society and bioethics. Tzu Chi Med J 2020;32:339–43. https://doi.org/10.4103/tcmj.tcmj_71_20
[3] CASIO Education. 14-A: World’s first compact electronic calculator. 2024. https://www.casioeducation.com/primary-calculators/14-a (accessed May 29, 2024).
[4] Freiberger PA, Swaine MR. Atanasoff-Berry Computer. Encyclopedia Britannica. 2023. https://www.britannica.com/technology/Atanasoff-Berry-Computer (accessed May 29, 2024).
[5] Colom R, Karama S, Jung RE, Haier RJ. Human intelligence and brain networks. Dialogues Clin Neurosci 2010;12:489–501. https://doi.org/10.31887/DCNS.2010.12.4/rcolom
[6] Mintz Y, Brodie R. Introduction to artificial intelligence in medicine. Minim Invasive Ther Allied Technol 2019;28:73–81. https://doi.org/10.1080/13645706.2019.1575882
[7] Stock JB, Zhang S. The biochemistry of memory. Curr Biol 2013;23:R741–R745. https://doi.org/10.1016/j.cub.2013.08.011
[8] Cousins JN, Fernández G. The impact of sleep deprivation on declarative memory. Prog Brain Res 2019;246:27–53. https://doi.org/10.1016/bs.pbr.2019.01.007