System theories


The ISO 42010 standard on the architectural description of systems takes no position on what a system is. "Users of the Standard are free to employ whatever system theory they choose." It turns out there are several theories, and so several ways to describe an entity, object or enterprise as a system. It is a mistake to assume one is the true way, or to think that different views can readily be integrated.

This article discusses

  • closed systems in which interrelated state variable values interact and advance according to a set of rules, and
  • open systems in which interrelated actors or components respond to events and interact to produce effects or results of interest.

Contents: Classification systems. Dynamic systems. Cybernetics, System Dynamics, Event-driven activity systems. Scaling up to social systems thinking.

By the way, though this article is relevant to my day job (teaching classes in enterprise, solution and software architecture) and might be of passing interest to enterprise and software architects, it won't help you get any certificate in those topics.

Systems in general

System: a collection of elements related in a pattern or by rules.

This definition is so generalized it is of little help to people teaching and applying particular system theories. Different schools of systems thinking bring different elements to the fore.

  • In the USA, many focus on the use of System Dynamics to model how inter-related variables (in social, economic and biological systems) increase and decrease over time, often in problematic ways.
  • INCOSE focus on the engineering of components to interoperate and produce desired effects or results (state changes or outputs).

Given that the "system of interest" is a pattern that relates elements, everything outside that is outside the system. Moreover, even elements inside its boundary are only parts of the system in so far as they are related in the system's pattern.

Further reading

One way to introduce the broad scope of systems thinking is to skim through the vocabulary used in discussion of systems, which you can find in this other article.

Before discussing the dynamic systems of interest to most systems thinkers, I want to mention a kind of system that is passive or inactive.

Classification systems

  • "Knowledge is a biological phenomenon" Humberto Maturana

In other words, description is a biological tool. The ability to describe reality evolved in organisms because having or forming accurate models of the world proved useful.

No material, physical things were described until biological entities evolved bio-chemical ways to detect, remember and recognise them. The social brain hypothesis ties the evolution of human intelligence to the demands of living in social groups, and to the origin of language. Eventually, humankind evolved ways to describe things in words and graphical models, and we devised classification systems.

A classification system categorizes things as instances of types and relates those types in some kind of knowledge structure, including but not only taxonomical hierarchies and ontological networks.

  • "To describe something is to classify it" A J Ayer.

To put it another way, as soon as you have described one thing, you can envisage other things of the same type. For example, having described one unicorn, you can envisage others of the same type. And having described one universe, some physicists envision parallel universes.

To avoid ambiguity in discussing one domain of knowledge, we use a controlled vocabulary and relate types in a taxonomical hierarchy or an ontological network. Inevitably, when a discussion spans more than one domain or system of interest, ambiguities arise, and we have difficulties with "semantic interoperation".
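
To make the idea concrete, here is a minimal sketch in Python (my illustration, with hypothetical types such as "mammal" and "unicorn"): types are related in a taxonomical hierarchy, and things are classified as instances of those types.

    # A controlled vocabulary for one small domain: each type may have a parent type.
    class Type:
        def __init__(self, name, parent=None):
            self.name = name
            self.parent = parent              # the taxonomical "is a kind of" relation

        def is_a(self, other):
            t = self
            while t is not None:              # walk up the hierarchy
                if t is other:
                    return True
                t = t.parent
            return False

    animal  = Type("animal")
    mammal  = Type("mammal", parent=animal)
    unicorn = Type("unicorn", parent=mammal)  # described once, so others can be envisaged

    # Classification: a thing is recorded as an instance of a type.
    described_thing = ("the unicorn in the tapestry", unicorn)
    print(described_thing[1].is_a(animal))    # True: the hierarchy supports inference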

Time as state change

Classification systems are passive structures. In practice, most of the systems of interest to systems thinkers are dynamic, meaning they change state over time.

A cosmologist's view

To measure time is to measure state changes in things we observe in space. If there was a big bang, there was no time before it, because time requires space. Things cannot move or change unless there is space for them to be in and change in.

The second law of thermodynamics (entropy increases) tells us that to arrange things in a pattern (or in a system of interest) requires the input of energy. Ashby wrote that in the cybernetic view of systems (see below), a sufficient supply of energy is normally taken for granted.

Albert Einstein said time is an illusion that moves relative to an observer. The physicist Julian Barbour wrote a book on the illusion of time, saying change is real, but time is not; it is only a reflection of change.

A psychologist's view

State changes happened in space, over time, before and regardless of our observation of them. To pick up Einstein's point - to observe state changes in the world (such as the hands of a clock moving) is to experience parallel state changes in the brain. As time flows forward, our brains consume energy to lay down memories.

Our psychological impression of how fast time passes is a different thing. In a period when changes happen in close succession, time appears to pass quickly. In a period when no change is happening, time appears to pass slowly. But paradoxically, in retrospect, the former period is remembered as substantial, and the latter is not remembered.

A quantum physicist's view

In 2022, a team of physicists published a paper suggesting quantum systems (or single particles) can move both forward and backward in time. Read Time Can Actually Flow Backward, Physicists Say. But if time were to flow backward at a higher level, we humans could not detect or describe it, since our thinking would be reversed and our memories erased.

Dynamic systems - overview

A dynamic system is a pattern of behavior in which elements interact in rule-bound ways to advance the state of the system and produce effects one element cannot produce on its own.

Below, in associating names with different approaches to thinking about dynamic systems, I do not mean to imply the named person is the only authority.

In cybernetics (after Wiener, Ashby and others), state variables interact in rule-bound ways to advance the macro state of a system. Two varieties of cybernetics can be identified.

  • In Forrester's System Dynamics, quantitative state variables interact in a rule-bound way to advance the macro state of a closed system, by increasing and decreasing each other's values. A causal loop diagram may be drawn to show an overview of the system, but to define the system fully, rules must be added in the form of the mathematics that govern the interactions.
  • In von Foerster’s second-order cybernetics, elements interact to advance the state of a system. However, the focus is on the observers of a system, and their interactions with it.

In event-driven activity systems, actors/components interact, in ways governed by rules, to advance the state of an open system and/or produce outputs.

After talking to different observers with different perspectives, users of Checkland's method abstract different soft systems from a business (or other purposeful social entity), each being a human activity system that transforms inputs into required outputs.

Organic systems (Bertalanffy, Maturana)

Ludwig von Bertalanffy was a biologist who promoted the idea of a general system theory that includes the concepts of system state (which advances over time), inputs and outputs (from and to the wider environment), feedback from output to input, and the hierarchical composition of smaller parts into larger wholes.

Since the cells of an organism share the same DNA, interact within one body, and follow the rules of their roles to maintain and advance the state of the entity, that entity is reasonably viewed as a system. And the concept of hierarchical composition is readily seen in how biologists see the human body.

Smaller is different

If you successively decompose the material universe into ever smaller components of different kinds (from the universe to galaxies to solar systems, all the way down to atomic particles) you will several times cross the boundary from one domain of knowledge to another.

Similarly, if you successively decompose an organism into ever smaller components of different kinds (from organs to cells to organelles to bio-chemicals, all the way down to atomic particles) you will several times cross the boundary from one domain of knowledge to another.

There is no hope of integrating a discussion of coarse-grained components (say, organs of the human body) with a discussion of atomic particles. Even though the systems defined at each level might be seen as homomorphic - descriptions of the same structure - they are incompatible (a point to be picked up later).

See the discussion of Anderson's hierarchy in System theories and EA

Aside: When people want to understand or manage a large number of elements that are connected in a complex network, they often impose a hierarchy on it. Note that the successive decomposition of an organism into ever smaller parts of different kinds is one thing; and the successive decomposition of an organization into ever smaller elements of the same kind is another.

Emergence, complexity and consciousness

Consciousness may be explained as a side effect of biological evolution.

Emergence occurs in the simplest of systems, even in one with only two elements. For example, forward motion emerges from the interaction between a sail and a wind. After billions of years of evolution, an intelligent animal is now an extremely large and complex system, containing billions of interacting elements. The human brain is said to be the most complex organ of all.

If consciousness is the ability to compare memories of the past with perceptions of the present and envisagings of the future, and to position the self in all of them, then it seems reasonable to assume consciousness emerges from, and requires, a thinking machine of the extraordinary complexity we see in those few animal species that demonstrate forethought and self-awareness.

Does consciousness imply human decision making is not a deterministic process? Perhaps, but I see no reason to assume that. And how complex systems are built from simpler ones is a topic addressed in cybernetics.

Cybernetics (Wiener, Ashby)

In cybernetics (the science of steering how a system behaves) the elements of a system are state variables that interact in a pattern of behavior to advance the macro state of the system.

In my reading of Ashby’s books ("Design for a Brain" and "Introduction to Cybernetics"), his system is a set of interrelated variables, along with the rules that determine how variable values change, that are abstracted from observing the behavior of a real world entity.

Ashby referred to a physical entity (regardless of any observer, and with countless definable variables) as a real machine. Be it mechanical, organic or social, I call that an entity.

Ashby referred to the variables selected by an observer as being of interest to them as a system. He went on to include all the variables needed to render the whole system "regular". I call a set of related state variables that advance in a rule-bound way an abstract system.

Ashby sometimes referred to the realization of selected variables by a physical entity as a machine without quotes. I call that a real system, and relate the concepts in this triad.

A systems thinking triad

We often speak of an entity and a real system as if they were the same thing. A clock is an entity or system whose state advances under its own internal drive. An organism is an entity or system that is stimulated to act in response to input events, and is a manifestation of its DNA.

However, Ashby's system is an abstraction from whatever physical entity we observe it in.

Abstracting variables of interest from a physical entity

DfB 2/3. "The first step is to record the behaviours of the machine's individual parts. To do this we identify any number of suitable variables. A grandfather clock, for instance, might provide the following variables

  • the angular deviation of the pendulum from the vertical, the angular velocity with which the pendulum is moving, the angular position of a particular cog-wheel, the height of a driving weight, the reading of the minute-hand on the scale, the length of the pendulum."

DfB 3/11 "At this point we must be clear about how a “system” is to be defined. The real pendulum, for instance, has not only length and position; it has also

  • mass, temperature, electric conductivity, crystalline structure, chemical impurities, some radio-activity, velocity, reflecting power, tensile strength, a surface film of moisture, bacterial contamination, an optical absorption, elasticity, shape, specific gravity, and so on and on."

"Any suggestion that we should study “all” the facts is unrealistic, and actually the attempt is never made. What is try [true?] is that we should pick out and study the facts that are relevant to some main interest that is already given.”

DfB 7/1 "one ceases to think of the real physical object with its manifold properties, and selects that variable in which one happens to be interested."

In other words, we abstract an observed system from the physical entity in which it is observed. If one orchestra (a well-nigh infinitely complex thing) were to be replaced on a concert platform by another orchestra, as long as it follows the same symphony score, the system of interest to me will be unchanged.
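
As a minimal sketch of that abstraction (my illustration, with made-up values), imagine the entity as a long list of measurable properties, from which the observer selects just the variables of interest:

    # The "real machine": a physical entity with countless definable properties.
    pendulum_entity = {
        "angular_deviation": 0.10,   # radians
        "angular_velocity": 0.00,    # radians per second
        "mass": 1.2,
        "temperature": 291.0,
        "length": 0.99,
        "surface_moisture": 0.02,
        # ... and so on and on
    }

    # The observer's system: only the variables relevant to their interest.
    variables_of_interest = ["angular_deviation", "angular_velocity"]
    system_state = {v: pendulum_entity[v] for v in variables_of_interest}
    print(system_state)   # {'angular_deviation': 0.1, 'angular_velocity': 0.0}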

Abstracting rules that govern changes to variable values

ItC 2/1 The basic terms and concepts of cybernetics include rule-bound state changes. Let me distil them as follows.

  • State variable: an entity’s property, whose value can be changed.
  • Transition: a change from one state variable value to another.
  • Operator: a factor that causes or triggers a transition (be it a continuous force, or a discrete input, event or condition).
  • Operand: an entity’s state variable value prior to a transition.
  • Transform: an entity’s state variable value after a transition.
  • Transformation: a set of transitions caused by an operator acting on a set of operands.

ItC 2/2 "a transformation defines/determines for each state variable value (each operand) in the system of interest what the next value will be (its transform)."

Ashby's transformation defines the rules that determine a state change, akin to how a composer specifies how one note leads to the next on a sheet of music.
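
A minimal sketch of that idea in Python (my illustration, not Ashby's notation): the transformation is a table mapping each operand to its transform, and repeatedly applying it produces the machine's behaviour.

    # A single-valued, closed transformation: every operand has exactly one transform,
    # and every transform is itself a possible operand.
    transformation = {"a": "b", "b": "c", "c": "a"}

    state = "a"                          # the current operand
    trajectory = [state]
    for _ in range(5):
        state = transformation[state]    # the operator acts: operand -> transform
        trajectory.append(state)
    print(trajectory)                    # ['a', 'b', 'c', 'a', 'b', 'c']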

System change

ItC 4/1 "the word "change" if applied to such a machine can refer to two very different things. There is

  • the change from state to state, which is the machine’s behaviour, and which occurs under its own internal drive, and there is
  • the change from transformation to transformation, which is a change of its way of behaving, and which occurs at the whim of the experimenter or some other outside factor."

"The distinction is fundamental and must on no account be slighted.”

In my words, Ashby distinguished:

  • a state change: a transformation, which may occur under the system's internal drive, or be triggered by an input.
  • a mutation: a change to a transformation, a change from one way of behaving to another, which occurs in response to some external factor, or intervention.

For example, contrast the note-by-note state changes in the performance of a symphony with a change to a symphony score made by the composer. Or else, contrast progress in the life history of an animal, with the creation of a different animal via sexual reproduction.

As I see it, three kinds of mutation may be distinguished:

  • changes to values of invariants used in rules (such as a desired temperature),
  • changes to rules for state transitions (the flows in a CLD), and
  • changes to state variables (the stocks in a CLD).

For some, all three are changes to a given system. For me, a mutation produces a new system. When the change is small, we call it a new system version or generation. And when the mutation is sufficiently large, we rename the system.
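
The distinction can be sketched in code (my illustration, continuing the toy transformation above): applying the transformation is a state change; replacing the transformation is a mutation that, in my terms, produces a new system version.

    def run(transformation, state, steps):
        # State changes: behaviour under one fixed way of behaving.
        trajectory = [state]
        for _ in range(steps):
            state = transformation[state]
            trajectory.append(state)
        return trajectory

    version_1 = {"a": "b", "b": "c", "c": "a"}
    print(run(version_1, "a", 4))        # ['a', 'b', 'c', 'a', 'b']

    # A mutation: an outside factor changes the rules, so the system behaves differently.
    version_2 = {"a": "c", "b": "a", "c": "b"}
    print(run(version_2, "a", 4))        # ['a', 'c', 'b', 'a', 'c']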

In my view, not being clear about the distinctions above bedevils the use of the term "system" in social systems thinking discussion.

Representing system state changes on a graph

An abstract system (a description) generalizes (classifies, typifies) the elements and workings of a real system. In cybernetics, the essential elements are a set of state variables, and a set of rules that determine how variables advance from any given state to the next.

The trajectory of quantitative variables can be projected on a graph with the y axis showing variable values going up and down over time (as in System Dynamics).

Graphs showing the behavior of a pendulum (Wikipedia)

The trajectory might be cyclical or chaotic. The trajectory of 2 or 3 variables can be drawn as a 2- or 3-dimensional shape, and this structure may be fractal (an avenue I have not explored).

Aside: Ashby uses the term behavior for a state change trajectory; some (instead or as well) use the term for the rules of a system.

Ashby's famous theorem and law

A regulator must know the current state of any target entity it regulates, be it a machine or an animal, just as a thermostat must know the current temperature of the air in the room it regulates, and a brain must know the current salinity of the body.

Ashby is known for two principles followed by the controller or regulator of a system - whose desirable state is represented by a set of state variable values.

  • The law of requisite variety: the regulator (and the model it uses) must have sufficient variety to monitor the performance of the system and steer it in the right direction.

Note for enterprise and software architects: only the “requisite” variety is needed. Capturing more variable values than you need, in the hope they might help you in future, will likely turn your data lake into a septic lagoon.

  • The good regulator theorem: every regulator of a system must have access to a model of that system.

In a simple feedback loop, a regulator responds to feedback by adjusting the state of what it regulates. But note the earlier discussion of system change.
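
A minimal sketch of such a loop (my illustration, with a hypothetical thermostat and made-up numbers): the regulator holds a model of the desired state and, on each item of feedback, steers the regulated thing back towards it.

    desired_temperature = 20.0              # the regulator's model of the good state

    def regulate(current_temperature):
        # Compare feedback with the model and steer in the right direction.
        if current_temperature < desired_temperature - 0.5:
            return "heating on"
        if current_temperature > desired_temperature + 0.5:
            return "heating off"
        return "no action"

    for reading in [18.0, 19.8, 21.2, 20.1]:   # feedback: sensed room temperatures
        print(reading, "->", regulate(reading))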

The general idea is that regulators <use> models to <represent the state of> things they <monitor and direct>. This can be represented in a triadic graphic.

The good regulator theorem

In "Design for a Brain" Ashby presented the brain as a regulator that monitors and maintains the state of the body, and seeks to change a pattern of behavior when it is not working.

Second-order cybernetics is discussed in a later article. A more well known variant of cybernetics is System Dynamics, named here with capitals because it is the name given to it by its leading authority, Forrester.

System Dynamics (Forrester, Meadows)

The interacting elements of a system in System Dynamics are quantities (stocks) that interact in cause-effect relationships (flows). The pattern of behavior can be represented in a Causal Loop Diagram (CLD) like the one below, which shows how every change in one stock changes another stock, in the same or opposite direction.

A causal loop diagram

When such a model is animated, over time, the quantities of the stocks may increase or decrease in surprising or problematic ways.
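
Here is a minimal sketch of such an animation (my illustration, with hypothetical stocks, flows and rates): at each time step the flows change the stocks, and the stocks in turn change the size of the flows.

    population = 1000.0                     # stock
    resources  = 5000.0                     # stock
    birth_rate = 0.05                       # assumed rate governing one flow
    consumption_per_head = 0.2              # assumed rate governing the other flow

    for year in range(1, 11):
        births   = population * birth_rate * (resources / 5000.0)   # reinforcing loop
        consumed = population * consumption_per_head                # balancing loop
        population += births
        resources = max(resources - consumed, 0.0)
        print(f"year {year}: population {population:.0f}, resources {resources:.0f}")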

People often draw a causal loop diagram to tell a story, or present a political position. But as Donella Meadows pointed out, they rarely verify the model by completing a System Dynamics model, quantifying how the system behaves over time, and comparing that with reality. Challenges include how to

  • verify the truth of same and opposite labels,
  • quantify same and opposite effects,
  • account for how effects change as quantities approach zero, or an upper limit,
  • quantify initial values, and
  • account for the impact of chaos theory on predicting outcomes (in a deterministic yet chaotic system, in the long term, state changes are unpredictable).

Perhaps the biggest issue for a systems thinker wanting to use a model to explain or predict real-world behaviour is the impact on its usefulness of a) variables not included in the model and b) actors changing how they respond to events when conditions change.

Event-driven activity systems

The interacting elements in an event-driven activity system are actors/components that act in rule-bound ways to advance the state of the system and/or produce outputs. The system is

  • dynamic (changes state in response to input and/or time events)
  • open (consumes inputs and produces outputs).

Encapsulation

An open system is encapsulated within a wider environment, and interacts with it by consuming inputs and producing outputs.

Some systems, like the organs of a human body, are naturally and physically bounded. Other systems, like a business, are bounded only when an observer has decided what to declare as inputs and outputs of interest. Since the boundary is purely logical, different observers may take different views of what counts as inputs and outputs, and draw different boundaries.

Inside an open system's boundary, actors/components interact to produce effects that one element cannot produce on its own. They interact in regular activities, using resources, to meet aims, by advancing the state of the system and transforming inputs into required outputs.

The atomic activities the system can perform may be defined as discrete events that are

  • triggered by discrete inputs or time events, and
  • produce outputs and/or change the internal state of the system.

The granularity of events

A discrete event is an atomic and indivisible process, definable by the states of the affected system (pre and post conditions) before and after the event, regardless of what happens in between. Several such discrete acts (aka operations, methods or transactions) may be listed in an interface or role definition that publicises what a system has the capability to do.

The granularity of a discrete event depends on the granularity of the system of interest. A large system can be decomposed into smaller subsystems that are coupled to each other. Similarly, a long process can be decomposed into shorter processes that are connected in sequence. When defining a business activity system, we may define both finer-grained events (such as room cleaning), and coarser-grained events (such as hotel room occupancy).

Simulating continuous systems

An event-driven activity system can simulate a continuous (or analogue) system by recording and reacting to events frequently enough to ensure any invariant condition of interest is always true. For example, if you look frequently enough at your child, and react swiftly enough, you can ensure they never pick their nose. (Will Harwood tells me this is the domain of "sampling theory".)
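
A minimal sketch of that idea (my illustration, with made-up numbers): sample the continuous quantity at every tick, and act while there is still enough headroom to keep the invariant true.

    import random

    MAX_LEVEL = 100.0                       # invariant: the level must never exceed this

    level = 90.0
    for tick in range(20):                  # each tick is one sampling interval
        level += random.uniform(0.0, 2.0)   # the continuous process drifts between samples
        if level > MAX_LEVEL - 5.0:         # react while there is still headroom
            level -= 10.0                   # corrective action restores a safe margin
        assert level <= MAX_LEVEL           # the invariant holds at every sample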

Software and business activity systems

Both software and human activity systems can be defined as open systems - as black boxes that consume inputs and produce outputs. Where we choose to draw the boundary of the system is a choice we make. It does not always coincide with a physical boundary, like a wall or a skin.

To encapsulate an open system of this kind, to define its interface, we define the inputs, and for each input we define the rules (pre and post conditions) that govern changes to state variable values inside the system, and any output that is produced in response to the input.
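
A minimal sketch of such an interface (my illustration, with a hypothetical booking operation): the internal state is hidden, and each operation states its pre condition, its post condition on the state, and the output it produces.

    class BookingSystem:
        # Black box: the state variable is internal; only the operations are public.
        def __init__(self, rooms):
            self._free_rooms = rooms

        def book_room(self, guest):
            # Pre condition: at least one room is free.
            if self._free_rooms == 0:
                return f"refused: no room for {guest}"    # output when the pre condition fails
            # Post condition: free rooms decreased by one; a confirmation is output.
            self._free_rooms -= 1
            return f"confirmed: room booked for {guest}"

    system = BookingSystem(rooms=1)
    print(system.book_room("Ann"))   # confirmed
    print(system.book_room("Bob"))   # refused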

Software activity systems

Software systems are a special kind of event-driven activity system; they maintain records of entities and events of interest in databases. Some use digital twins to represent structures and behaviors that they simulate.

Databases and digital twins as models

Business activity systems

Businesses depend on human and computer actors interacting in regular activities to create and use data that represents entities and events (customers, products, orders etc.) the business wants to monitor and direct.

Such a business activity system is event-driven. The events trigger actors to perform activities, using resources, to advance the state of the system and transform inputs into required outputs, to meet given aims. The concept graph below is an informal representation of the systems of interest to enterprise architects.

An event-driven activity system

The activities in such a system:

  • can be modelled as processes (operations, methods or transactions),
  • are triggered by discrete input or time events,
  • produce outputs and/or change the internal state of the system,
  • may process materials,
  • may process information (in data flows/messages and data stores/memories), and
  • may be specified in an interface or role definition that publicises what the system has the capability to do.

However, a big issue for those who see a business as a system is that human actors are not limited to playing roles and performing activities in any definable system. They have lives outside of the business, and within it, they can make decisions, ignore rules given to them and perform actions that nobody could predict or model - as will be discussed in related articles.

Scaling up to social systems thinking

Consider the rules of a card game. They describe the structures (dealer, players, cards in the pack, in hands, on the table, stakes) and behaviors (holding, dealing, picking and playing cards, betting) of a particular card game.

We can distinguish types and instances of card games from the entities that perform them (see the sketch after this list).

  • Type: abstract system: the structures and behaviors of a universal card game - the rule book.
  • Instance: real system: the structures and behaviors of a particular card game – in progress.
  • Thing: social entity: a card school, a group of actors that may interact not only in many card game types, but also in eating a pizza.
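
The triad can be sketched in code (my illustration): the class plays the role of the abstract system (the rule book), each object is a real system (a game in progress), and the social entity exists independently of any particular game.

    class PokerGame:
        # Abstract system: the structures and behaviours every game of poker follows.
        def __init__(self, players):
            self.players = players           # roles the rules require
            self.pot = 0                     # a state variable of the game

        def bet(self, player, amount):
            assert player in self.players    # a rule of the game
            self.pot += amount

    card_school = ["Ann", "Bob", "Cas"]      # social entity: it also eats pizza together

    friday_game = PokerGame(card_school)     # real system: one instance of the abstract system
    friday_game.bet("Ann", 10)
    print(friday_game.pot)                   # 10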

Ashby envisaged cybernetics could be scaled up from his small examples to large and complex organic and social entities. From

  1. a machine that performs a “single-value closed transformation” (chapter 3) to
  2. a machine configurable by an input parameter to behave this way or that (chapter 4), to
  3. a black box machine with a continuous input stream (chapter 6), to
  4. a machine coupled by input and output to another machine in a wider system, to
  5. a living organism that responds to a variety of discrete input events, to
  6. a social entity.

It is true that very large and complex systems may be designed using Ashby's principles. However, system theories have their limits, especially when it comes to human social systems.

The slow evolution of biological organisms (changing the phenotype of a species, in tiny ways, randomly, and discarding most changes as more harmful than beneficial) is not a good model for the rapid evolution of business organizations. For that, a regulator/controller needs a) awareness of what may change in the environment, b) the ability to detect changes, and c) the know-how to change the regulated/controlled system accordingly.

For a more sociological perspective on systems thinking, read the two articles on social systems below.

Related articles

When is an entity not well-called a system? When we have applied no system theory to it. When we have no description or model of it as a system, and no prospect of completing one. So, to speak of it as a system leaves the listener none the wiser.

If you want to read this article in the context of a book, watch this space. Related articles include:

Note for those who want to research further by finding Ashby's books on the internet:

In "Design for a Brain" (chapters 2 and 14) and "Introduction to Cybernetics" (chapters 2, 3, 4 and 6), Ashby's use of words was not consistent. You might interpret his words differently from how I do.

Ashby referred to a physical entity (regardless of any observer, and with countless definable variables) as a real machine, or real system, and sometimes a 'machine' with quotes. Be it mechanical, organic or social, I call that an entity.

Ashby referred to particular variables, descriptive of an entity, and selected by an observer as being of interest, as a system. He went on to speak of a system as a regular system that includes both the variables of interest, and any other variables needed to render the whole system “regular”. By regular, I believe he meant rule-bound. I call a set of state variables that advance in a rule-bound way, an abstract system.

Ashby sometimes referred to the realization of variables by a physical entity as a machine without quotes. I call the realization of an abstract system a real system.

