Information Processing, Information Networking, Cognitive Apparatuses, Sentient Software Systems and all that Jazz - Part 1
Rao Mikkilineni, Ph.D.
CTO at Opos.ai, Distinguished Adjunct Professor at Golden Gate University, California, and Adjunct Associate Professor at Dominican University of California.
Summary
Advances in our understanding of the nature of cognition in its myriad forms (Embodied, Embedded, Extended, and Enactive), displayed in all living beings (cellular organisms, animals, plants, and humans), together with new theories of information, info-computation, and knowledge, are throwing light on how we should build software systems in the digital universe that mimic and interact with intelligent, sentient, and resilient beings in the physical universe. Info-computational constructivism asserts that living organisms are cognizing agents that construct knowledge through interactions with their environment. They process information through their own cognitive apparatuses and exchange information with other cognitive agents. Computing processes, message communication networks, and cognitive apparatuses are, in essence, the building blocks of sentient beings.
Meanwhile, agent technology has progressed with mathematical models going beyond the boundaries of the Church–Turing thesis. Their utility is demonstrated in the novel distributed intelligent managed element (DIME) network architecture, which enables self-managing information processing structures and improves computational efficiency and resiliency well beyond the boundaries of stored-program implementations of the Universal Turing Machine. New theories of inductive Turing machines and agent architectures demonstrate that systems based on the new computing model are more autonomous, scalable, resilient, and efficient than systems implemented using the current state of the art (M. Burgin and R. Mikkilineni, International Journal of Grid and Utility Computing 9 (2), 193-204, and https://purkh.com/index.php/tocomp/article/view/459).
In addition, cognition as a learning mechanism is associated with the neural network computing model, and machine learning (which includes deep learning) has successfully been used to discern heretofore hidden correlations and insights from data sets, large or small.
How are these advances useful in transforming current information technologies to improve the scale, resiliency, and efficiency of software systems while reducing complexity? How are the two models of computation, one that is algorithmic, and another based on neural networks, related? In order to address these questions, we should go back to the origins of computer science as a discipline and even to the original papers on computation by Alan Turing and John von Neumann.
Convergence of Natural and Artificial Intelligence (Rao Mikkilineni)
Information Processing and Information Networking: A Primer
It is informative to go back to the origins of computer science as a discipline in the 1960s to understand the evolution of information technologies we enjoy today. According to Professor Wegner (Goldin, D., Wegner, P., The Interactive Nature of Computing: Refuting the Strong Church–Turing Thesis, Minds & Machines (2008) 18:17–38. DOI 10.1007/s11023-007-9083-1) "The 1960s saw a proliferation of undergraduate computer science (CS) programs; the number of CS programs in the USA increased from 11 in 1964 to 92 in 1968 (ACM Curriculum 1968). This increase was accompanied by intense activity towards establishing the legitimacy of this new discipline in the eyes of the academic community. The Association for Computing Machinery (ACM) played a central role in this activity. In 1965, it enunciated the justification and description of CS as a discipline (ACM Report 1965), which served as a basis of its 1968 recommendations for undergraduate CS programs (ACM Curriculum 1968); one of the authors (Wegner) was among the primary writers of the 1968 report. ACM’s description of CS (ACM Report 1965) identified effective transformation of information as a central concern:
"Computer science is concerned with information in much the same sense that physics is concerned with energy... The computer scientist is interested in discovering the pragmatic means by which information can be transformed."
By viewing algorithms as transformations of input to output, ACM adopted an algorithmic approach to computation; this is made explicit in the next sentence of the report:
"This interest leads to inquiry into effective ways to represent information, effective algorithms to transform information, effective languages with which to express algorithms...and effective ways to accomplish these at reasonable cost.""
Having a central algorithmic concern, analogous to the concern with energy in physics, helped to establish CS as a legitimate academic discipline on a par with physics.
What is Information and How is it Related to Data/Knowledge?
According to Mark Burgin, "there is no knowledge per se but we always have knowledge about something. In other words, knowledge always involves some object." (See his books on the Theory of Information and the Theory of Knowledge, and the papers: Mark Burgin, "Information in the Structure of the World," International Journal "Information Theories and Applications", Vol. 18, Number 1, 201; Mark Burgin, "Data, Information, and Knowledge," Information, v. 7, No. 1, 2004, pp. 47-57.)
Objects are distinguished by their "names." A name may be a label, number, idea, text, or even another object of a relevant nature. In addition to its name, an object has other properties. The simplest property is the existence of the object in question. There are two types of properties, namely intrinsic properties of objects and ascribed properties. Ascribed properties are obtained by measurement, calculation, or inference. An example of an intrinsic property is weight; measurement provides an approximation to its value, which is an ascribed property. Objects are related to each other, and the relationships can be described by algorithms or procedures. One type of relationship is provided by object recognition, construction, or acquisition algorithms. Another type is provided by algorithms or procedures of measurement, evaluation, or prediction.
Data are symbols that represent the properties of objects. There are several types of data. Raw data resemble the information component of knowledge. The difference is that raw data are not related to any definite property. Having raw data, a person or a computer system can transform them into knowledge by means of other knowledge that this person or computer system has. The second type is formally interpreted data consisting of a subset forming a knowledge unit. The third type is attributed data related to the values of intrinsic properties.
The long and short of the theory of knowledge is that objects, their attributes in the form of data, and the intrinsic and ascribed knowledge of these objects in the form of algorithms and processes make up the foundational blocks for information processing. Information processing structures utilize knowledge in the form of algorithms and processes that transform one state of the object (determined by a set of data) to another with a specific intent. Information structures and their evolution using knowledge and data determine the flow of information. Living organisms have found a way not only to define the knowledge about physical objects but also to create information processing structures that assist them in executing state changes.
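The roles of objects, data, and knowledge sketched above can be made concrete with a small illustration. The class and function names below are hypothetical, chosen only to show the distinctions: data are symbols representing properties, intrinsic properties belong to the object itself, ascribed properties are obtained by measurement, and knowledge is the procedure that transforms one state into another.

```python
from dataclasses import dataclass, field

@dataclass
class PhysicalObject:
    """An object identified by a name, with intrinsic and ascribed properties."""
    name: str
    intrinsic: dict = field(default_factory=dict)   # properties the object has in itself
    ascribed: dict = field(default_factory=dict)    # values obtained by measurement/inference

def measure_weight(obj: PhysicalObject, scale_error: float = 0.05) -> float:
    """Knowledge as a procedure: measurement approximates an intrinsic property
    and records the result as an ascribed property (data), changing the object's state."""
    true_weight = obj.intrinsic["weight_kg"]
    reading = round(true_weight * (1 + scale_error), 2)  # imperfect apparatus
    obj.ascribed["weight_kg"] = reading                  # state change: new data attached
    return reading

apple = PhysicalObject(name="apple", intrinsic={"weight_kg": 0.20})
measure_weight(apple)
print(apple.ascribed)  # the ascribed value only approximates the intrinsic one
```

The point of the sketch is the separation of concerns: the object and its data are passive structure, while the measurement procedure is the knowledge that an agent or apparatus must execute to produce information.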
Physical, Mental, Digital Worlds and Information Processing Structures:
Mark Burgin has discussed the existential triad (the physical world, the mental world, and the world of structures) extensively, beginning with the original ideas of Plato and Popper. He argues that the existential triad provides a model for discussing information structures, their relationship to objects, and the data attributes and knowledge algorithms and processes discussed above. Figure 1 shows the model.
Figure 1: The Existential Triad depicting the relationship between Matter & Energy and Knowledge Structures & Information. Structures contain information as matter contains energy
The physical world consists of matter, which is related to energy and evolves through the world of structures. The world of structures describes matter as objects with associated data and knowledge, and evolution occurs through state changes executed on various knowledge structures. In the physical world, self-organizing systems have figured out a way to design, execute, monitor, and control this evolution using various mechanisms and apparatuses, which themselves become parts of the physical world.
The mental world consists of the ability to define representations of the real world as well as conceptual thoughts, feelings, decisions, perceptions, observations, etc. The mental world assumes a cognitive apparatus that provides the ability to create mental-world objects and the associated data, knowledge processes, and algorithms. Mathematics is a discipline that falls within the mental world, creating representations of the physical world and reasoning about them with mental knowledge and data.
It is important to note that an agent or an apparatus is required to execute the knowledge algorithms and processes. It is the ability of humans to change state by executing knowledge structures that inspired Alan Turing to propose the Turing Machine and create the digital world. Figure 2 shows the addition of the digital world. John von Neumann’s stored program implementation provides a physical manifestation of the apparatus to execute the knowledge algorithms and processes to transform knowledge structures from either physical or mental worlds.
Figure 2: Adding the Digital World to the Existential Triad using the Turing Machine
The first successful implementation of mathematical-world structures was instrumental in the evolution of computer science as we know it today. The stored-program implementation of the Turing machine is a physical manifestation of a computing apparatus that executes state changes using knowledge algorithms and processes in the named-object phase space involving real numbers. The concept of the Universal Turing Machine enabled the information processing and networking structures as we know IT today.
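The stored-program idea can be made concrete with a minimal Turing machine simulator, offered here as an illustrative sketch rather than anything from the cited papers. Note that the transition table is ordinary data held in the same memory as the tape: the "knowledge" (the program) and the "data" share one representation, which is the essence of von Neumann's stored-program insight.

```python
def run_turing_machine(tape, rules, state="start", halt="halt", max_steps=1000):
    """Execute a single-tape Turing machine.
    rules: {(state, symbol): (new_state, write_symbol, move)} with move in {-1, +1}.
    The rule table is ordinary data, just like the tape: a stored program."""
    tape = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = tape.get(head, "_")            # '_' is the blank symbol
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# A machine that flips every bit of its input, then halts at the first blank.
flip = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt",  "_", +1),
}
print(run_turing_machine("1011", flip))  # -> 0100
```

Because the rule table is just a dictionary, the same simulator runs any machine whose rules are supplied as data, which is precisely what makes the machine "universal" in spirit.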
What is Cognition?
All living things exhibit sentience along with some form of intelligence and resilience. Sentience comes from the Latin sentient-, "feeling," and it describes things that are alive, able to feel and perceive, and show awareness or responsiveness. The degree of intelligence (the ability to acquire and apply knowledge and skills) and resilience (the capacity to recover quickly from non-deterministic difficulties without requiring a reboot) depend on the cognitive apparatus the organism has developed.
Cognition is the ability to process information, apply knowledge, and change the circumstance. Cognition is associated with intent and its accomplishment through various processes that monitor and control a system and its environment. Cognition is associated with a sense of “self” (the observer) and the systems with which it interacts (the environment or the “observed”). Cognition extensively uses time and history in executing and regulating tasks that constitute a cognitive process.
However, in the literature there is no consensus on what precisely cognition is, just as there is none on what information is and what knowledge is (Dodig-Crnkovic G. and Stuart S., (2007). eds. Computation, Information, Cognition – The Nexus and The Liminal, Cambridge Scholars Publishing, Cambridge; Burgin, M. (2003). Information: Problems, Paradoxes, and Solutions. Triple C, 1, 53–70; M. Burgin, R. Feistel, (2017) "Structural and symbolic information in the context of the general theory of information", Information, vol. 8, no. 4, pp. 139). According to Sara Shettleworth (Shettleworth, S. J. (2010). Cognition, Evolution, and Behavior. 2nd edition, New York: Oxford University Press, 5-6.), “Cognitive is often reserved for the manipulation of declarative rather than procedural knowledge (e.g., Dickinson, A. (2008). Why a rat is not a beast machine. In Frontiers of Consciousness, ed. M. Davies and L. Weiskrantz. Oxford: Oxford University Press, 275-288.). Declarative is “knowing that” whereas procedural knowledge is “knowing how,” or knowing what to do.”
On one hand, according to Ahmed Noor, “Cognitive computing refers to the development of computer systems modeled after the human brain, which has natural language processing capability, learn from experience, interact with humans in a natural way, and help in making decisions based on what it learns. All cognitive computing systems are learning systems.” This narrow view confines the cognitive apparatus to neural network simulations applied to data analytics and data processing, which uncover hidden correlations and yield information and knowledge in the form of insights. (Noor, Ahmed K., "Potential of Cognitive Computing and Cognitive Systems" (2015). Modeling, Simulation & Visualization Engineering Faculty Publications. 18. https://digitalcommons.odu.edu/msve_fac_pubs/18)
On the other hand, Gordana Dodig-Crnkovic asks the question: “Which levels of abstraction are appropriate in the synthetic modeling of life and cognition?” She looks for guidance to the framework of info-computational constructivism, treating natural phenomena as computational processes on informational structures. She notes that at present we lack a common understanding of the processes of life and cognition in living organisms, with the details of co-construction of informational structures and computational processes in embodied, embedded cognizing agents, both living and artefactual. She starts with the definition of an agent as an entity capable of acting on its own behalf, as an actor in the Hewitt Actor model of computation, and states that even such simple systems as molecules can be modelled as actors exchanging messages (information). She adopts Kauffman's view of a living agent as something that can reproduce and undergoes at least one thermodynamic work cycle. “This definition of living agents leads to Maturana and Varela's identification of life with cognition. Within the info-computational constructive approach to living beings as cognizing agents, from the simplest to the most complex living systems, mechanisms of cognition can be studied in order to construct synthetic model classes of artefactual cognizing agents on different levels of organization.”
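The actor view that Dodig-Crnkovic starts from can be sketched in a few lines. The code below is an illustrative toy, not an implementation from her work or from Hewitt's: each actor holds private state and acts only in response to messages, much as molecules "exchanging information" do in her account.

```python
from queue import Queue

class Actor:
    """A minimal actor: private state plus a mailbox of incoming messages."""
    def __init__(self, name):
        self.name = name
        self.mailbox = Queue()   # messages are the only way to reach an actor
        self.received = []

    def send(self, other, message):
        # Actors communicate exclusively by asynchronous message passing.
        other.mailbox.put((self.name, message))

    def process(self):
        # React to every queued message; an actor acts only on its own behalf.
        while not self.mailbox.empty():
            sender, message = self.mailbox.get()
            self.received.append((sender, message))

a, b = Actor("molecule_a"), Actor("molecule_b")
a.send(b, "bind")
b.process()
print(b.received)  # -> [('molecule_a', 'bind')]
```

The design choice worth noticing is that no actor can inspect or modify another's state directly; all interaction is mediated by messages, which is what makes the model a natural fit for describing interacting cognizing agents at any scale.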
The essence of these observations is that cognition and cognitive apparatus play a very important role in the evolution of intelligence, sentience and resilience in living beings and a deeper understanding of the true nature of cognition is a prerequisite for building artificially intelligent, sentient, and resilient software systems.
Cognition is the first step from computing and communication toward consciousness (which is the awareness of self, the environment, and their interactions) and culture (the ability of individuals to learn habits from one another resulting in behavioral diversity between groups) that provide a global information processing system. From here we can speculate on the equivalence of body, brain, and mind in these sentient software systems along with the evolution to create artificial consciousness and culture among artificial and natural cognizing agents.
Armed with this knowledge, we now can look at the evolution of autonomous information processing structures in the digital world that assist us in modeling, monitoring and managing the physical world.
Information Processing and Information Networking Structures in the Digital World:
Figure 3: Digital world (convergence of service execution and neural networks - software genes and software neurons implemented with a stored-program-control implementation of the Turing Machine). John von Neumann's implementation of the Turing Machine using the stored-program control mechanism provides a physical manifestation of a knowledge structure and an associated cognitive apparatus for information processing and information networking.
Thus, John von Neumann's stored-program implementation of the Turing Machine has enabled a physical manifestation of an information processing structure with i) symbolic computation for knowledge structures that can be algorithmically specified, and ii) neural network models, themselves implemented as algorithms using symbolic computation, for addressing knowledge structures that are not amenable to algorithmic description, such as image processing.
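The observation that neural network models are themselves symbolic algorithms on a stored-program machine can be seen directly in code. The perceptron below is a standard textbook example, not drawn from the cited works: its "learning" is nothing but arithmetic and branching, executable on any von Neumann machine.

```python
# A perceptron learning the logical AND function: a neural model
# expressed entirely as symbolic (arithmetic) computation.
def train_perceptron(samples, epochs=20, lr=0.1):
    w0, w1, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x0, x1), target in samples:
            out = 1 if w0 * x0 + w1 * x1 + b > 0 else 0
            err = target - out
            w0 += lr * err * x0       # the "learning rule" is ordinary arithmetic
            w1 += lr * err * x1
            b  += lr * err
    # Return the trained unit as a plain function of its inputs.
    return lambda x0, x1: 1 if w0 * x0 + w1 * x1 + b > 0 else 0

and_gate = train_perceptron([((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)])
print([and_gate(x0, x1) for x0, x1 in [(0, 0), (0, 1), (1, 0), (1, 1)]])
```

The two computing styles differ in how knowledge is represented (explicit rules versus learned weights), but both execute as the same kind of stored-program computation.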
However, as Prof. Mark Burgin points out, an information processing system (IPS) “has two structures—static and dynamic. The static structure reflects the mechanisms and devices that realize information processing, while the dynamic structure shows how this processing goes on and how these mechanisms and devices function and interact.” The software contains the algorithms (à la the Turing machine) that specify information processing tasks, while the hardware provides the resources required to execute the algorithms. The static structure is defined by the association of software and hardware devices, and the dynamic structure is defined by the execution of the algorithms. The meta-knowledge of the intent of the algorithm, the association of a specific algorithm execution with a specific device, and the temporal evolution of information processing and exception handling when the computation deviates from the intent (be it because of software behavior, hardware behavior, or their interaction with the environment) lies outside the software and hardware design and is expressed in non-functional requirements. Current Turing machine implementations therefore run into the "self-reflection" or "self-reference" problem, subject to Gödel’s theorems. As von Neumann put it, “It is a theorem of Gödel that the description of an object is one class type higher than the object.” An important implication of Gödel’s incompleteness theorem is that it is not possible to have a finite description with the description itself as a proper part. In other words, it is not possible to read yourself or process yourself as a process. This is consistent with the observation of Cockshott et al. (2012): “The key property of general-purpose computers is that they are general purpose. We can use them to deterministically model any physical system, of which they are not themselves a part, to an arbitrary degree of accuracy. Their logical limits arise when we try to get them to model a part of the world that includes themselves.”
As long as the information processing structures are in stable equilibrium with enough computational resources, and the non-functional requirements are not fluctuating wildly, computations run their course normally. The larger the fluctuations, the more complex the process of bringing computations back to equilibrium without interrupting information processing.
Another way to look at the issue of large fluctuations in non-functional requirements is to observe that all algorithms that are Turing computable fall within the boundaries of the Church–Turing thesis, which states that “a function on the natural numbers is computable by a human being following an algorithm, ignoring resource limitations, if and only if it is computable by a Turing machine.” Maintaining adequate resources for the computation in the face of large fluctuations poses a challenge when the requirements demand no interruption.
The solution to reducing the complexity of adjusting resources to meet large fluctuations in non-functional requirements is to infuse self-management into information processing structures. Inductive Turing machines and structural machines, introduced by Prof. Mark Burgin, allow managing the information processing structure with knowledge of both the available resources and the changing requirements for those resources in real time. These theoretical insights lead to implementations that not only reduce complexity but also provide features that the current state of the art lacks. Named service network configuration, monitoring, and reconfiguration on a distributed computing infrastructure is an example we will discuss.
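The self-management idea can be sketched as a control loop that watches both the observed demand (a non-functional requirement) and the available capacity, and reconfigures before the computation is interrupted. The function name, thresholds, and headroom factor below are hypothetical illustrations, not the DIME architecture itself:

```python
def manage(workload_demand, capacity, headroom=1.2):
    """One pass of a self-management loop: compare observed demand against
    available capacity and decide a reconfiguration action, instead of
    letting the computation fail mid-flight."""
    actions = []
    for step, demand in enumerate(workload_demand):
        required = demand * headroom                # keep a safety margin
        if required > capacity:
            capacity = required                     # scale up before failure
            actions.append((step, "scale_up", round(capacity, 2)))
        elif required < capacity / 2:
            capacity = max(required, 1.0)           # reclaim idle resources
            actions.append((step, "scale_down", round(capacity, 2)))
        # otherwise the computation proceeds undisturbed
    return actions

# A demand spike at step 2 triggers reconfiguration without interruption.
print(manage([10, 12, 40, 38, 8], capacity=20.0))
```

The essential point is that the manager observes and acts on the computation from outside it, which is exactly the meta-level role that, as argued above, cannot live inside the managed Turing computation itself.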
Structural Machines, Inductive Turing Machines and the Theory of Oracles: Enabling a Novel Edge Computing Solution with Low-Latency and High-Performance
TO BE CONTINUED IN PART 2
Acknowledgement:
I wish to express my gratitude to Late Prof. Peter Wegner, Prof. Mark Burgin, Prof. Gordana-Dodig-Crnkovic and Dr. Eugene Eberbach for many valuable discussions and leading me in this exciting journey into cognition, post-Turing Computing Models and Info-Computation. I am especially honored to have known Prof. Peter Wegner and I always cherish my travel with him to participate in a workshop on Architecting Self-Managing Distributed Systems in the ACM 2015 European Conference where he presented his last paper on computing models.
Further Reading
M. Burgin, Theory of Information, https://www.worldscientific.com/worldscibooks/10.1142/7048
M. Burgin, Theory of Knowledge. https://www.worldscientific.com/worldscibooks/10.1142/8893
Burgin, M. and Mikkilineni, R. (2018) 'Cloud computing based on agent technology, super-recursive algorithms and DNA', Int. J. Grid and Utility Computing, Vol. 9, No. 2, pp. 193–204.
Dodig-Crnkovic, Gordana (2016). Information, Computation, Cognition. Agency-Based Hierarchies of Levels. In Vincent Müller (ed.), Fundamental Issues of Artificial Intelligence. Zurich: Springer. pp. 139-159.
Rao Mikkilineni (2017). Lessons from Biology: Genes, Neurons, Neocortex and the New Computing Model for Cognitive Information Technologies. MDPI Proceedings 2017, 1(3), 213. https://doi.org/10.3390/IS4SI-2017-03989