System theories
"Well-done! Very clean and essentialized." "Accurate and easy-to-follow... highly recommended."
When is an entity not well-called a system? When we have applied no system theory to that entity; we have no description or model of it as a system, or no prospect of completing one. So, to speak of the entity as a system leaves the listener none the wiser.
The ISO 42010 standard on the architectural description of systems takes no position on what a system is. "Users of the Standard are free to employ whatever system theory they choose."
This article presents an overview of system theories. It outlines several ways to describe an entity, object or enterprise as a system. It is a mistake to assume one is the true way, or to think that different views can readily be integrated. Different schools apply different system theories to different kinds of scenario. However, there are considerable overlaps between system theories.
The structure of this article
This article is divided into four parts.
Part one: systems in general
This part introduces some common system theories and principles.
General system theory (after Ludwig von Bertalanffy) includes concepts found in many different sciences. Today, most systems thinkers are followers of one or more schools below.
There are a variety of more particular approaches to investigate or define how a dynamic system behaves. Below, in associating authors' names with different approaches I do not mean to imply any named person is the main or only authority.
Part two: closed system methods
People model a closed system to understand how outcomes (social, economic, biological or other) emerge from a set of interacting parts.
Cybernetics (after Wiener, Ashby and others) is about how state variables interact in regular patterns of behavior to advance the macro state of a system. The two varieties of systems thinking below may be classified as varieties of cybernetics.
System Dynamics (after Forrester, Meadows and others) is about how state variables interact by increasing and decreasing each other’s values in causal loops. A diagram can represent the loops, but to define the system fully, the mathematics of the interactions must be added.
For example, System Dynamics can be used to model a predator-prey system, as described in part two.
Part three: open system methods
People model an open system to understand how outcomes emerge from a system in which inputs or events drive an activity system to respond by changing state and/or producing outputs.
Soft systems thinking approaches model human activity systems that transform inputs into outputs required by customers or other external entities to meet their goals.
Event-driven activity system thinking (employed in enterprise and software architecture methods) is about how actors or components interact, in roles governed by rules, to advance the state of an open system and/or produce outputs from inputs.
People build entity event models to model information systems. Given the goal of monitoring the lives of individual wolves and sheep, we can model the predator-prey system above as an information system.
The information system receives an input message for each birth and death event, and maintains records of individual wolves and sheep in a database.
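As an illustration, such an entity event model might be sketched in code. This is a minimal sketch with hypothetical names (`Animal`, `PopulationRegister`), not a method prescribed by any particular school:

```python
# Minimal sketch of an entity-event information system (hypothetical names).
# Each input message reports a birth or death event; the system keeps a
# record of each individual wolf or sheep and updates it per event.

class Animal:
    def __init__(self, animal_id, species):
        self.animal_id = animal_id
        self.species = species   # "wolf" or "sheep"
        self.alive = True

class PopulationRegister:
    def __init__(self):
        self.records = {}        # the "database": animal_id -> Animal

    def handle(self, event):
        """Apply one input message (a birth or death event)."""
        if event["type"] == "birth":
            self.records[event["id"]] = Animal(event["id"], event["species"])
        elif event["type"] == "death":
            self.records[event["id"]].alive = False

    def living(self, species):
        return sum(1 for a in self.records.values()
                   if a.species == species and a.alive)

# Feed the system a stream of input events.
register = PopulationRegister()
for e in [{"type": "birth", "id": 1, "species": "wolf"},
          {"type": "birth", "id": 2, "species": "sheep"},
          {"type": "birth", "id": 3, "species": "sheep"},
          {"type": "death", "id": 3, "species": "sheep"}]:
    register.handle(e)

print(register.living("sheep"))  # one sheep left alive
```

Each input message is an event that advances the recorded state of one entity; the database holds the system's memory of the world it monitors.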
Part four: other approaches
This part touches on game theory (a closed system approach), second order cybernetics, and classification systems, which are passive rather than dynamic.
PART ONE: Systems in general
This part introduces some common system theories and principles.
Ideas that prefigured modern “system science” emerged centuries ago. For example:
Isaac Newton (1642-1726) described the universe as a collection of objects interacting through forces, governed by the laws of motion.
Adam Smith (1723-1790) described a nation's economy as a collection of interacting businesses (autonomous agents), governed by the law of supply and demand.
Charles Darwin (1809-1882) wrote on the evolution of a species as a collection of individuals who interact to reproduce modified versions of themselves.
Claude Bernard (1813-1878) wrote on how an organism maintains its state by homeostatic feedback loops. Other 19th century biologists described other organic systems.
One might glibly say a system is a collection of related elements, but that definition is so generalized it is of little help to people teaching and applying particular system theories.
The systems of interest here are dynamic. They change state over time, as a hurricane or football match does.
The systems that systems thinkers observe and envisage today did not exist at the start of the universe, and will not exist at the end. They are transient islands of describably organized behavior in the ever-unfolding process of the universe.
Natural systems emerge from aimless evolution. By contrast, designed systems emerge from the purposeful intent of humans to regulate the behavior of some thing(s) to produce some desired effect(s).
In short, a system is a transient island of orderly or regular behavior, in which two or more entities interact, carved by a system describer out of the infinite complexity and ever-unfolding process of the universe.
Looking for systems in the world
We should distinguish abstract systems (models or descriptions) not only from real systems (manifestations of models or descriptions) but also from entities (actors or agents) that act to manifest real systems.
There are systems everywhere we look. A poker game is a human activity system with win/lose outcomes. It features roles played by active entities (a dealer and some players) which change the state of passive entities (the deck of cards, and each hand, with its current cards and bet placed so far).
Consider Tinbergen's stickleback mating ritual as an abstract system. It features roles played by two active entities (two sticklebacks) with reference to the state of each other, and a passive entity (the nest).
The whole ritual is a (progressive rather than cyclical) process. It is a system of interacting parts. The macro state of the ritual is decomposable into the micro state of each entity involved.
As a physical entity, a stickleback has countless describable variables and ways of behaving (aka abilities). But few are relevant to the systems thinker interested in the roles that sticklebacks play in the ritual. Each role is defined by how a stickleback reacts, in a regular and repeatable way, to detecting information about the states of other entities. (A role corresponds to what Ashby called a "machine with input").
We might see each stickleback's role as a system of interacting parts, and/or see each activity a stickleback performs as a system of interacting parts. But in a system of interest, we don't look inside what we regard as atomic entities.
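A role of this kind, what Ashby called a "machine with input", can be sketched as a lookup from (current state, detected input) to next state. The states and inputs below are a hypothetical simplification of the male stickleback's role, not Tinbergen's actual findings:

```python
# A role as a "machine with input" (after Ashby): the next state is a
# function of the current state and a detected input. The states and
# inputs are a hypothetical simplification of the male stickleback's role.

MALE_ROLE = {
    ("patrolling", "female_appears"):   "zigzag_dance",
    ("zigzag_dance", "female_follows"): "leading_to_nest",
    ("leading_to_nest", "female_enters"): "quivering",
    ("quivering", "eggs_laid"):         "fertilizing",
}

def react(state, detected_input):
    """Advance the role's state; unrecognized inputs leave it unchanged."""
    return MALE_ROLE.get((state, detected_input), state)

state = "patrolling"
for event in ["female_appears", "female_follows", "female_enters", "eggs_laid"]:
    state = react(state, event)
print(state)  # the ritual has progressed to fertilizing
```

Note that the table captures only the regular, repeatable reactions that define the role; everything else the physical stickleback can do is outside the system of interest.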
Before looking at particular system theories, let me briefly define some concepts generally applicable to the description of systems.
System boundary concepts
The granularity of a system, and its parts, is entirely in the gift of the describer.
Encapsulation: the enclosure of things inside a boundary. Hiding the internal structures and behaviors of a system inside a "black box".
In examples discussed by systems thinkers, the boundary is often physical. For example, the skin of an animal, the membrane of a cell, the wall of a shipyard, or the user interface of a software system. But in general, the boundary of a system is logical, and drawn by an observer wherever they choose.
Environment: the wider situation or context in which a thing exists. The state of things (outside its boundary) that it interacts with - sometimes called external entities.
System boundaries can be nested and overlapping. In the graphic below, the wider system encapsulates three parts or subsystems, and interacts with two external entities outside its boundary. Notice how parts inside that wider system become external entities in the environment, situation or context of the narrower system below.
Generally, in defining a system of interest, a systems thinker does not look inside external entities or atomic parts. And therefore, any attempt to measure the "complexity" of the system excludes the internal complexity of its atomic parts.
System: an encapsulation, by an observer or describer, of related structures and processes. A set of elements, related to a given interest, that interact in a pattern of behavior to change or advance the system's state, and produce effects one element cannot produce on its own.
Most business systems thinking is about open systems that consume and produce materials and/or information, which are often associated in human endeavours.
Input: a message or thing detected by a system, typically causing an internal event.
Output: a message or thing produced by a system, typically causing an external event.
Outcome: an effect or result that output leads to.
Feedback: the way that outputs of an encapsulated system influence its future inputs.
Emergence: the appearance of an effect from an interaction between things, or the emergence of a property in a thing from the mutation or evolution of its parts.
PART TWO: closed system methods
People model a closed system to understand how outcomes (social, economic, biological or other) emerge from a set of interacting parts.
Cybernetics
The first cyberneticians included Norbert Wiener (1894-1964), W. Ross Ashby (1903-1972) and Alan Turing (1912-1954).
When Norbert Wiener coined the term "cybernetics", he was paying homage to James Clerk Maxwell's 1868 paper on "governors". The word derives via Latin from the Greek kybernetes, meaning "steersman".
Cybernetics is about systems in which a regulator (observer and controller) subsystem monitors and directs a regulated (or target) subsystem. And how the state and behavior of an animal or machine can be monitored and steered by communicating and storing information.
In the mid 20th century, cybernetics was discussed and promoted by two influential groups.
1941-1960: The Macy Conferences were cross-disciplinary meetings in New York, with a mandate to aid medical research. Topics included connective tissues, metabolism, the blood, the liver, renal function, infancy, childhood, aging, nerve impulses, and consciousness.
1949-1958: The Ratio club was a cross-disciplinary group in the UK, founded by the neurologist John Bates to discuss cybernetics. Members included psychologists, neurobiologists, engineers, physicists, and mathematicians. Many went on to become prominent scientists.
W. Ross Ashby was a psychologist and a prominent member of the Ratio Club, who popularized using the term “cybernetics“ with reference to self-regulating systems. His work influenced many scientists.
Ashby said that cybernetics depends in no essential way on the laws of physics or on the properties of matter. It deals with all forms of regular, or determinate, or reproducible behavior. It is concerned with those aspects of systems that are determinate - follow regular and reproducible courses.
Ashby told us a dynamic system is a way of behaving, not a material entity. It is the behavior we study, not the material substance.
Ashby's real world system changes state over time, either continuously (cf. analogue) or in discrete steps (cf. digital). However, in his abstract systems, change occurs by a measurable jump from one state to another (or perhaps one way of behaving to another). To model a continuous system, the passage of time is represented by discrete time events that are so short that they appear continuous, and lead to well-nigh the same outcomes.
In teaching, Ashby's initial focus was on closed systems that advance under their own internal drive. The essential elements of the system are a set of state variables, and a set of transformations that determine how variables advance from one given value to the next.
However, even when modeling a closed system, Ashby assumed that the trajectory of quantitative variable values over time can be projected on a graph, in which the x axis shows time and the y axis shows variable values going up and down. Consider a pendulum, for example.
Notice that given a system with two or three state variables, its line of behavior may be represented as a two or three-dimensional geometric shape.
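A minimal sketch of this discrete-step view, using a pendulum with two state variables (angle and angular velocity). The transformation advances the state in small time steps; collecting the states traces the line of behavior. The constants and step size are illustrative:

```python
import math

# A pendulum as two state variables (angle, angular velocity) advanced by
# a transformation in small discrete time steps: a sketch of the idea that
# short-enough steps approximate continuous change.

def transform(state, dt=0.01, g=9.81, length=1.0):
    angle, velocity = state
    velocity += -(g / length) * math.sin(angle) * dt
    angle += velocity * dt
    return (angle, velocity)

state = (0.5, 0.0)              # initial state: displaced, at rest
trajectory = [state]            # the "line of behavior"
for _ in range(1000):           # 10 simulated seconds
    state = transform(state)
    trajectory.append(state)

# The angle oscillates: it swings past zero and back.
angles = [a for a, _ in trajectory]
print(min(angles), max(angles))
```

Plotting angle against angular velocity, rather than against time, would give the two-dimensional geometric shape mentioned above.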
Abstracting systems from entities
We often speak of a physically bounded material entity and a real system as if they were the same thing. For example, we see a clock as an entity or a system whose state advances under its own internal drive. And we see an organism as an entity or a system that is stimulated to act in response to input events.
However, Ashby's system is independent of any physical form, or any medium for memory or messages. It is abstracted from the infinite complexity of any physical entity or situation in which the system may be observed.
At first, I read Ashby as distinguishing an abstract system on paper from a real system - a physical organism or machine. Eventually, I concluded that he was writing of three things, which I distinguish as follows.
An abstract system - a description, a set of variables selected by an observer as being of interest to them, and transformations that explain how variables change value over time.
A real system - a manifestation of an abstract system by one or more entities.
An entity - an entity or collection of entities, such as an organism or a card school, in which a real system can be observed and described.
The triad below is a graphical way of saying that systems thinkers create and use abstract systems to represent (generalize, classify, typify) real systems that they observe or envisage in real-world entities.
In other words, an abstract system is a type, which can be manifested in any number (0, 1 or more) of real system instances.
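In programming terms, the abstract system is a class and each real system is an instance of that class. A minimal sketch, with a deliberately crude and purely illustrative transformation:

```python
# An abstract system as a type (a class defining variables and
# transformations), and real systems as instances of that type.

class PredatorPreySystem:
    """Abstract system: two state variables and one transformation."""
    def __init__(self, wolves, sheep):
        self.wolves = wolves
        self.sheep = sheep

    def step(self):
        # A deliberately crude transformation, for illustration only.
        self.sheep = max(0, self.sheep + self.sheep // 10 - self.wolves)
        self.wolves = max(0, self.wolves + self.sheep // 50 - 1)

# Two real systems manifesting the same abstract system.
island_a = PredatorPreySystem(wolves=5, sheep=100)
island_b = PredatorPreySystem(wolves=20, sheep=100)
island_a.step()
island_b.step()
print(island_a.sheep, island_b.sheep)  # same rules, different trajectories
```

The class can have zero, one or many instances, just as an abstract system can be manifested in any number of real systems.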
Ashby's famous theorem and law
A regulator (observer and controller) must know the current state of whatever it regulates. For example, a thermostat must know the current temperature of the air in the room it regulates, and an animal's nervous system must know the current salinity of its body.
Ashby is known for two principles followed by the regulator of an entity whose state is represented in a model by state variable values.
The good regulator theorem: "every regulator of a system must be a model of that system". Or more accurately, every good regulator must contain or have access to a model of the system it regulates.
Given the idea that regulators create and use models to represent the state of things they monitor and direct, Ashby suggested the brain evolved to help animals monitor and direct real-world phenomena.
The law of requisite variety: the model of an entity's state must have sufficient variety to enable a regulator to monitor the performance of the entity and steer it in the right direction.
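Both principles can be illustrated with a hypothetical thermostat: its internal model is simply the last measured temperature of the room, and its small repertoire of actions (heat on or off) must offer enough variety to counter the disturbances it faces. A minimal sketch, with all names and values assumed:

```python
# A minimal regulator sketch: the thermostat holds a model of the
# regulated entity's state (the last measured temperature) and uses it
# to choose a corrective action. Its repertoire (heat on/off) supplies
# the variety needed to steer the room toward the target.

class Thermostat:
    def __init__(self, target):
        self.target = target
        self.model = None            # internal model of the room's state

    def regulate(self, measured_temp):
        self.model = measured_temp   # update the model from observation
        return "heat_on" if self.model < self.target else "heat_off"

room_temp = 15.0
thermostat = Thermostat(target=20.0)
for _ in range(30):
    action = thermostat.regulate(room_temp)
    room_temp += 0.5 if action == "heat_on" else -0.1  # room responds
print(round(room_temp, 1))  # settles near the 20-degree target
```

If the room could also be disturbed in ways the thermostat cannot counter (say, an open window in winter stronger than the heater), its variety would be insufficient and regulation would fail.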
Ashby's view of change
An adaptation is a change that helps an entity survive or thrive. What kinds of change are possible?
In "Design for a Brain" 5/7, Ashby wrote of an animal shivering, that “adaptation is commonly used in two senses,
In "Introduction to Cybernetics" 4/2, Ashby wrote “the word “change” if applied to such a machine can refer to two very different things.
Consider biological, social and software entities that recognize and react to several input types, in whatever order they arrive. The entity reacts to each input type by applying a subset of its rules to a subset of its state variables. I can regard these entities as exhibiting the kind of change classified below as a switch.
Would Ashby say that entity is one machine, manifesting one system? Or say the entity is one machine, manifesting a different system for each input type? Or say the entity acts as different machine/system for each input type?
My view of change
State change: a change to a variable value (such as cooler or hotter), which may happen under a system's internal drive, or else in response to an input.
Switch: a change from one predefined "way of behaving" to another, which happens in response to an input.
Evolution or mutation: a change that adds or removes variables or behavior types, whether organically or by design.
Arguably, a switch or mutation changes the “system” that is realized by the entity.
Adaptation: any kind of change above that helps a thing survive in a changing environment, or meet some aims. Note that adaptation by homeostatic state change differs from adaptation by evolution or mutation.
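The difference between a state change and a switch can be sketched as follows; mutation would correspond to editing the code itself to add variables or behaviors. The machine and its events are hypothetical:

```python
# Sketching two kinds of change. A machine with two predefined ways of
# behaving: one input changes a state variable (a state change), another
# selects the other way of behaving (a switch). Mutation would mean
# editing this code to add variables or behavior types.

class Machine:
    def __init__(self):
        self.temperature = 20.0          # a state variable
        self.mode = "heating"            # the current way of behaving

    def handle(self, event):
        if event == "toggle_mode":       # a switch between behaviors
            self.mode = "cooling" if self.mode == "heating" else "heating"
        elif event == "tick":            # a state change under the current behavior
            self.temperature += 1.0 if self.mode == "heating" else -1.0

m = Machine()
for e in ["tick", "tick", "toggle_mode", "tick"]:
    m.handle(e)
print(m.temperature, m.mode)  # 21.0 cooling
```

Arguably, only the "tick" events leave the machine manifesting the same system; the toggle switches it to a different way of behaving.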
System Dynamics
System Dynamics is a well known variant of cybernetics. It is named here with capitals because it is the name for a particular way of modeling a system. Notable figures here include:
Jay Forrester (1918 to 2016) developed the approach.
Donella Meadows (1941 to 2001) popularized the approach with reference to human social activity.
Peter Senge (born 1947) developed a range of archetypal models, each representing a common behavior pattern in human society.
People build System Dynamics models to understand how outcomes emerge from a closed system of quantitative variables (social, economic, biological or other) that interact and increase and decrease over time.
The interacting elements are quantities (stocks) that interact in cause-effect relationships (flows). The pattern in how they interact can be represented in a causal loop diagram like the one below, which shows how every change in one stock changes another stock, in the same or opposite direction.
Method
Consider using System Dynamics to model a predator-prey system. The stocks represent wolf and sheep population sizes. The inter-stock flows represent birth and death rates that affect the population sizes.
To oversimplify, we might proceed along these lines.
The model is a purely mathematical abstraction. It is used to understand how problems arise and/or forecast possible outcomes. We can animate the model to watch how, from an initial state, the populations grow and shrink. In running the model, an event is a time unit, say a week, in which a quantity of birth and death events occur.
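Along those lines, here is a minimal stock-and-flow sketch of the predator-prey model. The rate constants are illustrative assumptions, not field data:

```python
# A minimal System Dynamics sketch of the predator-prey model: two stocks
# (wolf and sheep populations) and flows (births and deaths) computed
# each time step. All rate constants are illustrative assumptions.

def step(sheep, wolves, dt=0.1):
    sheep_births = 0.5 * sheep               # flow into the sheep stock
    sheep_deaths = 0.02 * sheep * wolves     # predation flow out
    wolf_births = 0.005 * sheep * wolves     # flow into the wolf stock
    wolf_deaths = 0.3 * wolves               # flow out
    sheep += (sheep_births - sheep_deaths) * dt
    wolves += (wolf_births - wolf_deaths) * dt
    return sheep, wolves

sheep, wolves = 100.0, 20.0
history = [(sheep, wolves)]
for week in range(500):                      # each step is one time unit
    sheep, wolves = step(sheep, wolves)
    history.append((sheep, wolves))

# Both populations oscillate rather than exploding or dying out.
sheep_values = [s for s, _ in history]
print(min(sheep_values), max(sheep_values))
```

Running the model animates how the populations grow and shrink from the initial state; the diagram alone, without the mathematics of the flows, could not produce this behavior.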
In practice issues
People often draw a CLD to tell a story, or present a political position. And (as Donella Meadows pointed out) they rarely verify the CLD by completing a System Dynamics model, quantifying how the system behaves over time, and comparing that with reality.
Challenges include how to
Other issues include the impact of
Moreover, chaos theory tells us outcomes are unpredictable, because tiny variations in the initial state can lead to dramatically different outcomes in the long term.
Having said all that, let me point to some practical applications: https://systemdynamics.org/resources/successful-applications/
Social system archetypes
Peter Senge and others have described several social system archetypes. Each is representable as a CLD that relates a few (ten or so) state variables by increase/decrease functions. Each represents a problematic behavior, which should be addressed or stopped by managers and/or others who make "interventions" to solve such problems.
One of the archetypes is called "The tragedy of the commons". On the lookout for this archetype, we might observe that a fishing community (a social entity) behaves as an instance (a real system) that manifests the variables and rules of that archetype (an abstract system).
Note: until the fishing community is described as manifesting the features of a given system type, to call it a system has no particular meaning. And if the fishing community manifests several different system archetypes, then to call it a system is ambiguous - until we say which system we are talking about.
Whereas Senge's system archetypes represent problems to be solved, an event-driven activity system may be designed to solve problems and meet requirements.
I suppose problems revealed by the System Dynamics model could appear in a "request for work" for the information system, but I have never seen such a case.
PART THREE: open system methods
People model an open system to analyse or design how outcomes emerge from a system in which inputs or events drive an activity system to respond by changing state and/or producing outputs.
Soft systems thinking
Churchman, Checkland and Ackoff are sometimes referred to as soft systems thinkers. Their interest was the application of system theory to human organizations, businesses or institutions. They address human activity systems that transform inputs into outputs required by customers or other external entities to meet their goals.
Checkland's soft systems thinking method starts from the idea that observers with different perspectives (or world views) may abstract different systems from the same business. This idea is not unique to soft systems thinking. It might be said that all systems are soft systems.
From cybernetics to entity-event modeling
How to define a deterministic system? Ashby set out a three step method for cybernetics.
Structured systems analysis and design methods for event-driven information system design follow much the same course.
Methods developed in the 1980s include SSADM in the UK, and MERODE in Belgium. Both contained entity event modeling methods that very loosely correspond to Ashby's method.
For more details of how such a method works, read the article on entity event modeling.
Event-driven activity systems
People build event-driven activity system models to model an open activity system of actors or components that interact with each other to meet some goals.
The external events that drive an open system may be listed in an interface or role definition that defines what the system has the capability to do (the operations, methods or transactions it can perform).
An event-driven system incrementally changes state in response to events. Its internal elements interact in rule-bound ways to advance the state of the system, and may act to consume inputs and produce outputs to meet defined aims.
Software systems are a special kind of activity system. They typically maintain records of entities in memory, and respond to input events. They use digital twins to represent entities and events in the world that they simulate, monitor or direct.
Business activity systems depend on human and computer actors interacting in regular activities to create and use data that represents entities and events (customers, products, orders etc.) the business wants to monitor and direct.
The concept graph below is an informal representation of the systems of interest to enterprise architects.
The activities in such a system may be modeled as processes (operations, methods or transactions) that are
The activities requestable by an observer can be specified in an interface or role definition that publicizes what the system has the capability to do as discrete “services”.
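To illustrate, here is a sketch of a system whose requestable operations are published as discrete services that advance the system's state and produce outputs. The order-processing names are hypothetical, not drawn from any particular method:

```python
# Sketch of an event-driven activity system behind an interface. The
# operations it can perform are published as discrete "services"; each
# request advances the system's hidden internal state and produces an
# output. All names are illustrative.

class OrderSystem:
    """Interface: place_order and ship_order are the requestable services."""
    def __init__(self):
        self.orders = {}                 # internal state, hidden from callers

    def place_order(self, order_id, product):
        self.orders[order_id] = {"product": product, "status": "placed"}
        return "order_confirmation"      # output produced by the event

    def ship_order(self, order_id):
        self.orders[order_id]["status"] = "shipped"
        return "dispatch_note"

system = OrderSystem()
outputs = [system.place_order(1, "widget"), system.ship_order(1)]
print(outputs, system.orders[1]["status"])
```

The class definition plays the role of the interface or role definition: it publicizes what the system has the capability to do, while encapsulating how it does it.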
In-practice issues
Designing human and computer activity systems is difficult for many political, social and technical reasons. Much human activity is ad hoc or creative rather than systemic. People bypass rules given to them and perform actions that nobody could predict or model - as will be discussed in related articles.
PART FOUR: Other approaches
This part touches on game theory (a closed system approach), second order cybernetics, and classification systems, which are passive rather than dynamic.
Game theory
When actors interact, they may cooperate, as in a football team, or compete, as in a game of chess or a market place, or hurt each other, as in a boxing match or a war.
Axelrod and others applied game theory to understand and explain cooperation and conflict in society, and to offer guidance on conflict resolution. Game theory has also been used to explain the evolution of cooperation in animal societies.
Anatol Rapoport, whose tit-for-tat strategy won Axelrod's tournament, was recognized for his contribution to world peace through nuclear conflict restraint. He also developed social network analysis, showing how to measure flows through large networks, study how material and information are distributed through a system or society, and identify what speeds or impedes these flows.
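The cooperation result can be illustrated with an iterated prisoner's dilemma, in which the tit-for-tat strategy cooperates first and then mirrors the opponent's previous move. The payoff values below are those Axelrod used in his tournaments:

```python
# Iterated prisoner's dilemma sketch. Payoffs (row player, column player):
# both cooperate 3/3, both defect 1/1, lone defector 5 vs sucker 0.

PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(my_history, their_history):
    """Cooperate first, then mirror the opponent's last move."""
    return "C" if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # mutual cooperation pays 30 each
print(play(tit_for_tat, always_defect))  # tit-for-tat loses only the first round
```

Against itself, tit-for-tat sustains cooperation; against a pure defector it is exploited only once, which is the heart of the explanation for how cooperation can evolve.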
Second order cybernetics
In the 1970s, Heinz von Foerster coined the term “second order cybernetics”.
If von Foerster's concern was to address the case where an observer of a system also acts in that system, then Ashby's cybernetics handles that by distinguishing roles from the actors who play them. The motivation and management of actors, the separation of duties in distinct roles, and other topics discussed under the heading of second order cybernetics might better be called organization theory than systems thinking.
Classification systems
Most systems of interest to systems thinkers are dynamic, meaning they change state over time. Finally, I want to mention a kind of system that is passive or inactive, like the librarian's Dewey Decimal System, the chemist's periodic table, or the biologist's classification of species.
To describe a thing to others, using a verbal message, we must either liken it to something else (draw a simile, such as Mars is like the Earth) or classify it (as in Mars is a “red” “planet”).
As A J Ayer wrote in "Language, Truth and Logic"
To put it another way, as soon as you have described one thing, you can envisage other things of the same type. For example, having described one unicorn, you can envisage others of the same type. And having described one universe, some physicists envision parallel universes.
To avoid ambiguity in discussing one domain of knowledge, we use a controlled vocabulary and relate types in a knowledge structure.
Classification hierarchy: a multi-level structure or taxonomy that divides a supertype into successively more specific subtypes, down to the elementary types needed to differentiate things of interest.
Ontology: a network structure that relates types; each type is related to one or more other types in predicate statements of the kind: subjects <are related to> objects.
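Both structures can be sketched directly as data structures: the taxonomy as a hierarchy of supertype-to-subtypes entries, and the ontology as a set of subject-predicate-object statements. The content below is illustrative only:

```python
# A classification hierarchy and a tiny ontology as data structures.
# The taxonomy divides supertypes into subtypes; the ontology relates
# types in subject-predicate-object statements. Content is illustrative.

TAXONOMY = {
    "animal": ["vertebrate", "invertebrate"],
    "vertebrate": ["fish", "mammal"],
    "fish": ["stickleback"],
}

def is_subtype(subtype, supertype):
    """Walk the hierarchy top-down to test the subtype-of relation."""
    children = TAXONOMY.get(supertype, [])
    return subtype in children or any(is_subtype(subtype, c) for c in children)

# An ontology: a set of (subject, predicate, object) statements.
ONTOLOGY = {
    ("wolf", "preys_on", "sheep"),
    ("stickleback", "builds", "nest"),
}

print(is_subtype("stickleback", "animal"))
print(("wolf", "preys_on", "sheep") in ONTOLOGY)
```

Note that these structures are passive: they change no state over time, which is what distinguishes them from the dynamic systems discussed earlier.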
Inevitably, when a discussion spans more than one domain or system of interest, ambiguities arise, and we have difficulties with "semantic interoperation".
Taxonomic hierarchies and ontological networks are discussed in a later article.
Remarks and related articles
These system theories do integrate some concepts used in different domains of knowledge, such as biology, sociology and engineering. However, it is important to recognise what does not translate well from one domain of knowledge to another.
This article is included with 12 others in “A Systems Thinker’s View of Enterprise Architecture”, a book made available to community-tier members of Cyb3rSyn Labs. You can subscribe here: www.cyb3rsyn.com.