Social systems thinking - part I


This first article on social systems thinking picks up where my earlier article on "System Theories" left off.

Contents: The emergence of social systems thinking. The weakness of the social-biological analogy. Rapoport's game theory. Senge's system archetypes. Ashby's cybernetics. Ackoff's social system. Checkland's soft system. Beer's POSIWID. Modeling techniques.

Different thinkers use different vocabularies. The vocabulary used in this article is partly a compromise between the vocabularies of Ashby's cybernetics and Ackoff's social systems.

A later article continues the discussion of social systems with reference to "self-organizing systems".

The emergence of social systems thinking

In the 19th century, following developments in the understanding of physical and biological systems, people began looking for patterns in the structures and behaviors (communication, cooperation and conflict) of human society. The first sociologists included:

  • Herbert Spencer (1820-1903) wrote on societies as organic systems.
  • Emile Durkheim (1858-1917) wrote on collective consciousness and culture.
  • Gabriel Tarde (1843-1904) wrote on social systems as emerging from the individual actors, acting as autonomous agents.
  • Max Weber (1864-1920) wrote on bureaucracy, hierarchy, roles and rules.
  • Kurt Lewin (1890-1947) wrote on group dynamics.
  • Lawrence Joseph Henderson (1878-1942) wrote on meaning in communication.

Some drew an analogy between social organizations and biological organisms - discussed below.

Some used the term "system" as a label for a function of government, a human institution, organization or other social entity. That tradition continues today in what are sometimes called management science and organization theory.

In the 1950s, more scientific approaches to systems thinking were developed. One was general system theory, after Ludwig von Bertalanffy and others. (Since Bertalanffy was a biologist, he looked at systems from that viewpoint.) Another was cybernetics (after Norbert Wiener, Ross Ashby and others), of which Forrester's System Dynamics might be seen as a branch.

Most schools of systems thinking that emerged address one or both of two kinds of system.

  • Open activity systems in which interrelated actors or components respond to events and interact to produce effects or results of interest.
  • Closed systems in which interrelated state variable values interact and advance according to a set of rules (as is illustrated in the drawing of causal loop diagrams).

Kenneth Boulding was perhaps the first to wrestle with how to apply general system theory to what he knew as "management science" and some call "organization theory".

Boulding, Bertalanffy and Rapoport founded what became the Society for General Systems Research. However, two decades on, they grew disenchanted with its diversion into management science.

  • “Rapoport’s resignation in 1977 corroborates this interpretation. He took his decision with regard to the predominance [in their Society of] “managerial” applications… Rapoport and Bertalanffy tried to restrain this evolution… and Boulding also actively joined them.” David Pouvreau

Russell Ackoff started his 1971 paper with ten cybernetic principles for systems thinking. Over three decades, he shifted ground, from taking a cybernetic view of systems, to being a popular author and speaker on the failings of public sector institutions, due to human factors such as the self-interest and ignorance of managers. Others have followed a similar trajectory.

This article outlines some approaches developed in the 20th century (Rapoport's game theory. Senge's system archetypes. Ashby's cybernetics. Ackoff's social system. Checkland's soft system. Beer's POSIWID). But first, let us examine a common analogy.

Beware the sociology / biology analogy

In this talk, Heinz von Foerster said: “an emergent new paradigm in which we look at both biological systems, living organization, and… at the social organization. And these are profound changes in our appreciation, and all the superficial analogies collapse. Not only that they are misleading; they are even dangerous.”

Yes indeed - except that analogy goes back to the 19th century.

Terms used in general system theory include identity, hierarchy, organization and system boundary. Biologists use additional terms, including autopoiesis, informational closure and operational closure. For discussion of these terms, read The sociology – biology analogy.

Social systems thinkers have used some terms with different meanings from biologists. Doing this creates two special system theories rather than branches of one general system theory.

There are general patterns of behavior a social entity may conform to, and general structures for organized institutions such as businesses and government agencies. The next two sections look at patterns of behavior.

Rapoport's game theory

The social concepts of cooperation, competition and conflict feature strongly in the concept of a game. Team games feature both cooperation within a team and competition between them. In a market economy, competition is a kind of cooperation between buyers and sellers.

Anatol Rapoport developed game theory to address conflict resolution between individuals and between social entities.
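
Rapoport is also remembered for the "tit for tat" strategy he submitted to Axelrod's iterated prisoner's dilemma tournaments: cooperate on the first move, then mirror the other player's last move. A minimal sketch of that idea follows; the payoff numbers are the conventional textbook ones, assumed here for illustration.

```python
# A minimal sketch of an iterated prisoner's dilemma, in which Rapoport's
# "tit for tat" (cooperate first, then copy the opponent's last move)
# plays against an always-defect strategy. Payoff values are the
# conventional ones (3,3 / 0,5 / 5,0 / 1,1), assumed for illustration.

PAYOFF = {  # (my move, their move) -> (my score, their score)
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def tit_for_tat(history):
    """Cooperate on the first round, then mirror the opponent's last move."""
    return "C" if not history else history[-1][1]

def always_defect(history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    history_a, history_b = [], []   # each entry: (my move, their move)
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a)
        move_b = strategy_b(history_b)
        gain_a, gain_b = PAYOFF[(move_a, move_b)]
        score_a += gain_a
        score_b += gain_b
        history_a.append((move_a, move_b))
        history_b.append((move_b, move_a))
    return score_a, score_b

print(play(tit_for_tat, always_defect))   # (9, 14) over 10 rounds
print(play(tit_for_tat, tit_for_tat))     # (30, 30): sustained cooperation
```

Run against an unconditional defector, tit for tat loses a little on the first round and then settles into mutual defection; run against itself, it sustains cooperation throughout.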

Aside: Wittgenstein claimed we cannot define the concept or type "game", since it spans games ranging from solitaire and poker to archery and baseball. Yet in all these games, there is a winning position to be reached by following the rules of the game. So, to "play the game" is to perform particular activities, following particular rules, to reach what participant(s) see as a winning outcome, or as near to that as they find satisfying.

There is another way to look at games. For example: consider the rules of a card game. They describe the structures (dealer, players, cards in the pack, in hands, on the table, stakes) and behaviors (holding, dealing, picking and playing cards, betting) of a particular card game.

We can distinguish the types and instances of card games from the entities that perform them, as sketched in code after the list below.

  • Type: abstract system: the structures and behaviors of a universal card game - the rule book.
  • Instance: real system: the structures and behaviors of a particular card game – in progress.
  • Thing: social entity: a card school, a group of actors that may interact not only in many card game types, but also in eating a pizza.
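
The type/instance/thing distinction above can be sketched loosely in code. The class and attribute names below are mine, invented for illustration:

```python
# A loose sketch of the type/instance/entity distinction above.
# All names here are illustrative, not drawn from any formal theory.
from dataclasses import dataclass, field

@dataclass
class CardGameRules:            # Type / abstract system: the rule book
    name: str
    deal_size: int
    winning_condition: str      # described informally here

@dataclass
class CardGameInProgress:       # Instance / real system: a game being played
    rules: CardGameRules
    players: list               # roles filled by actors from the card school
    hands: dict = field(default_factory=dict)
    table: list = field(default_factory=list)

@dataclass
class CardSchool:               # Thing / social entity: the group of actors
    members: list               # the same members may also share a pizza

poker = CardGameRules("poker", deal_size=5, winning_condition="best hand")
school = CardSchool(members=["Ann", "Bob", "Cas"])
tonight = CardGameInProgress(rules=poker, players=school.members)
```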

Senge's system archetypes

To carve a social system out of a social entity is to identify a regular pattern of communication or behavior in how actors interact. In "The Fifth Discipline: The Art and Practice of the Learning Organization" (Doubleday, 1990), Peter Senge identified and modeled several social system archetypes and discussed the need for "interventions" to stop problematic behavior.


For example, in the system archetype called "the tragedy of the commons", each actor pursues actions which are individually beneficial, but which can eventually result in exhaustion of a common resource. A causal loop diagram tells a simplified version of the story.

Increasing the number of fishers can make the net increment of the fish stock negative. And whereas grass on common land can grow again, once a fish stock is exhausted, it is gone forever.

Note that two quantities can be related, directly or indirectly, in an amplifying/reinforcing feedback loop, or a dampening/balancing feedback loop.
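
As a rough illustration of those loops in the fishery story, here is a minimal stock-and-flow simulation: regrowth of the stock acts as a balancing loop, while the individually rational entry of new fishers reinforces the pressure on it. The growth rule, catch rule and numbers are invented for illustration, not taken from any published model.

```python
# A rough stock-and-flow sketch of the "tragedy of the commons" story above.
# Growth rate, catch rate and carrying capacity are invented for illustration.

def simulate(years=30, stock=1000.0, fishers=5, new_fishers_per_year=2,
             growth_rate=0.3, capacity=2000.0, catch_per_fisher=40.0):
    history = []
    for year in range(years):
        growth = growth_rate * stock * (1 - stock / capacity)  # balancing loop
        catch = min(stock, fishers * catch_per_fisher)         # reinforcing pressure
        stock = max(0.0, stock + growth - catch)
        fishers += new_fishers_per_year      # individually rational entry
        history.append((year, fishers, round(stock)))
    return history

for year, fishers, stock in simulate():
    print(f"year {year:2d}  fishers {fishers:3d}  stock {stock}")
# With these numbers the stock collapses towards zero within a decade or two,
# and (unlike grass on a common) it does not recover.
```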

You might draw a CLD of a virus outbreak, or an ant colony. However, it is difficult to verify the truth of such models, for reasons listed in the earlier article. And chaos theory tells us forecasts of how a system will change state can be wide of the mark.

You can draw a CLD to express your opinion – graphically – of some ways a human social entity might change state. But it is difficult to verify the model; and given the staggering complexity of a human society, the CLD you draw is only one of several possible views, and it might soon be out of date.

Ashby's cybernetics

In modeling a cybernetic system, regulator/controller and regulated/controlled entities are bounded subsystems, connected by monitoring and control feedback loops, in a wider system.

Ashby supposed that in any situation where we see regulation it will be of a cybernetic form. So, wherever a mechanical, biological or social entity shows regulatory and/or adaptive behavior, it will conform to principles discussed in the earlier article, notably:

  • the law of requisite variety (illustrated in a sketch after this list)
  • the good regulator theorem
  • Ashby's structure for an adaptive system.
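
The law of requisite variety says, roughly, that only variety in the regulator's responses can absorb variety in the disturbances: with R possible responses against D possible disturbances, even the best regulator leaves at least D/R distinct outcomes. The toy sketch below is my own construction, not Ashby's notation.

```python
# A toy illustration of the law of requisite variety (my own construction,
# not Ashby's notation). Each disturbance d combined with a regulator's
# response r yields an outcome; here outcome = (d - r) % D. The best a
# regulator with R distinct responses can do is leave ceil(D / R) outcomes.
import math

D, R = 9, 3                       # 9 possible disturbances, 3 possible responses
disturbances = range(D)
responses = range(R)

def outcome(d, r):
    return (d - r) % D            # an arbitrary, fixed outcome table

# The regulator picks, for each disturbance, the response that maps it into
# the smallest set of outcomes (here: respond with d % R).
best_outcomes = {outcome(d, d % R) for d in disturbances}
print(len(best_outcomes), math.ceil(D / R))   # 3 3 -> outcome variety >= D/R
```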

In “Design for a Brain”

Ashby set out a minimal theory of regulation, sufficient to explain adaptive behavior, by describing how an entity responds to state changes that deviate from the norm. In his development of the “Homeostat” he demonstrated a structure for an adaptive system that might be called self-organizing.

A role of the nervous system is to regulate the state variables of the body in response to changing conditions. Ashby posited the brain is an organ concerned with adaptation - finding new control strategies when current strategies fail to deliver the desired result - keeping state variables within the limits suitable for life.

Ashby can be misconstrued as implying the brain is the chief executive in control of everything. Cybernetics allows that an organism or other entity may have several distinct regulators that operate more or less independently (even several brains, like an octopus).

You may assume the conscious brain sits at the top of a control hierarchy. Experiments have shown that, as well as directing where the body goes, the higher brain constructs rationalizations of autonomic responses.

A living organism must respond to a variety of discrete input events. If it responds to N different kinds of event with reference to different state variables, I wonder: might Ashby see that as N distinct systems – one for each event type?

A social entity is dynamic in a variety of ways. An ant colony is dynamic in the way it advances its state in rule-bound ways. A human social entity is dynamic in other ways. Not only does it continually change state, but actors come and go, and individual actors adapt their behavior to events and conditions they experience.

In “Introduction to Cybernetics” and later

Ashby set out a more general framework. Regulation is either predictive regulation or feedback regulation, and may involve dampening and/or amplifying state change.

  • In chapter 3, Ashby wrote of a machine that performs a “single-value closed transformation”, changing incrementally from one state to the next in a fixed pattern of behavior.
  • In chapter 4, he wrote of a machine with input that can be configured by an input parameter to behave in one way or another.
  • In chapter 6, he wrote of a black box machine with a continuous input stream, then a machine coupled by input and output to another machine in a wider system.

Ashby envisaged his cybernetic principles would scale up from a mechanical entity such as his Homeostat to a larger and more complex entity. He talked about applying the same principles to a living organism, and to a society - here called a social entity.

Earlier, I discussed the weakness of the social-biological analogy, but that is not to say cybernetic principles cannot be seen in human society.

Example

Your regular morning ritual is this: 1 wake, 2 dress for work, 3 cook breakfast, 4 eat it, 5 walk to the station, 6 catch the train to work. The current state of this process is between 2 and 3.

Throughout, you also monitor the behavior of things in your environment. You listen to the local radio, which broadcasts local travel updates.

  • Input: you hear on the radio that the train service is suspended.
  • Response: you creatively think of two other options: catch a bus (increasing the bus company's revenue), or else call a colleague to ask for a lift (increasing her journey time and petrol costs), consider the pros and cons of each option, and then choose the latter.
  • Input: your colleague agrees to pick you up, provided you are ready in 5 minutes.
  • Response: you have a bowl of cereal instead of cooking breakfast.

In Ashby's words, you are a “machine with input”. You follow the path of your regular routine until you receive the radio message. Then, you switch to a second path of weighing the pros and cons of two options, and choosing one of them. On receiving a new input (“be ready in 5 minutes") you switch to a third path (“eat cereal”). The two exception paths emerge on the fly; you did not give any forethought to them.
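
Read as code, the routine is a small state machine whose remaining path is rewritten by input events. The sketch below is my own paraphrase of the example, not anything Ashby wrote:

```python
# The morning routine above, read as a "machine with input": a default path
# of steps, switched onto other paths by input events. This is my own
# paraphrase of the example, not Ashby's notation.

DEFAULT_PATH = ["wake", "dress for work", "cook breakfast", "eat breakfast",
                "walk to station", "catch train"]

def run_morning(events):
    path = list(DEFAULT_PATH)
    step = 0
    while step < len(path):
        action = path[step]
        print("do:", action)
        event = events.get(action)               # monitor the environment
        if event == "train service suspended":   # input switches the path
            print("input:", event)
            path[step + 1:] = ["weigh bus vs lift", "call colleague",
                               "cook breakfast", "eat breakfast", "get lift to work"]
        elif event == "be ready in 5 minutes":   # a further input, a further switch
            print("input:", event)
            path[step + 1:] = ["eat cereal", "get lift to work"]
        step += 1

run_morning({"dress for work": "train service suspended",
             "call colleague": "be ready in 5 minutes"})
```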

Abstracting social systems from social entities

As noted above, Ashby envisaged his cybernetic principles scaling up from a mechanical entity such as his "Homeostat" to living organisms and societies - here called social entities.

In some species (like pandas) individuals mostly live alone. At the other extreme, the actors in insect colonies are each dedicated to playing a fixed social role. In many species, animals live in a herd, flock or shoal for some or all of their lives. A shoal of fish coordinates its movements in a mechanical way that might be modelled as a system. And a school of whales cooperates in sophisticated patterns of hunting behavior.

Where actors interact using quantifiable rules, techniques like System Dynamics and agent-based modeling might be used to represent the system.

Ashby wrote also of qualitative feedback. Perhaps the most universal social system in animal species is a mating ritual. Ashby discussed a mating ritual described by Tinbergen. The sticklebacks communicate by sending and receiving information in visual signals that inform and direct activity.

A progressive dialog

Progressive dialogues are commonplace in human activity systems, in formal and legal protocols, and in the user interfaces (or use cases) of software applications.
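
Written down, a progressive dialog is just an ordered exchange of signals in which each signal is the output of one party and the input that triggers the next step by the other. The sketch below paraphrases Tinbergen's stickleback courtship sequence in simplified form; the wording of the steps is mine.

```python
# A progressive dialog written as an ordered exchange of signals, each step
# enabling the next. The sequence is a simplified paraphrase of Tinbergen's
# stickleback courtship description; the wording of each step is mine.

DIALOG = [
    ("female", "appears, showing swollen abdomen"),
    ("male",   "performs zigzag dance"),
    ("female", "responds with head-up posture"),
    ("male",   "leads the way to the nest"),
    ("female", "follows"),
    ("male",   "shows the nest entrance"),
    ("female", "enters the nest"),
    ("male",   "trembles, prodding her tail"),
    ("female", "spawns"),
    ("male",   "fertilises the eggs"),
]

def run_dialog(dialog):
    # Each signal is both an output of one party and an input that triggers
    # the next step by the other party; if any step fails, the dialog halts.
    for sender, signal in dialog:
        print(f"{sender}: {signal}")

run_dialog(DIALOG)
```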

Mapping my terms to Ashby's terms

In “Design for a Brain” (DfB) chapters 2 and 14, and “Introduction to Cybernetics” (ItC) chapters 2, 3, 4 and 6, Ashby used words inconsistently, but in my reading:

  • My systems thinker plays the role of Ashby’s observer.
  • My social entity (describable using countless variables) is what Ashby typically called a real machine, or real system.
  • My abstract social system (a set of state variables and state transition rules) is what Ashby called a system, or regular system.
  • My real social system (the realization of an abstract system by a social entity) is what Ashby sometimes called a machine.

Ackoff's social system

Russell Ackoff (1919-2009) set out to bring the sociological and scientific views of a "system" together, especially with regard to a human enterprise or business.

In “Towards a System of Systems Concepts” (1971), Ackoff started some distance from human society, with ten terms and concepts that a cybernetician like Ashby would recognize. In the first three points Ackoff distinguished abstract systems from the physical entities that realize them. And like Checkland and Senge, he spoke of systems as perspectives we can model.

  • System: two or more interrelated objects (parts or elements).
  • Abstract system: a system in which the elements are concepts. “Different observers of the same phenomena may conceptualize them into different systems”.
  • Concrete system: a system that has two or more objects.

Later, Ackoff spoke almost entirely of concrete systems. In the next three points, Ackoff spoke of properties that can be represented as state variable values.

  • System state: the values of system properties (state variables) at a particular time.
  • System environment: those elements and their properties (outside the system) that can change the state of the system, or be changed by the system.
  • System environment state: the values of environment properties at a particular time.

Later, Ackoff made little reference to state variables. In the next four points, Ackoff spoke of systems being open or closed, and static or dynamic.

  • Closed system: one that does not interact with its environment.
  • System/environment event: a change to the system's state variable values.
  • Static (one state) system: a system to which no events occur (it does not change).
  • Dynamic (multi state) system: a system to which events occur (its state changes).

Ackoff’s main concern was with dynamic, event-driven systems that change state. His “system” embraced both mechanical devices and human organizations.
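
Ackoff's opening concepts can be paraphrased in code as a concrete system whose state variables are changed by events arising in its environment. The sketch below is my paraphrase, with invented names, not Ackoff's formalism.

```python
# A paraphrase in code of Ackoff's opening concepts: a concrete system with
# state variables, an environment that raises events, and state changes in
# response. The class and variable names are mine, not Ackoff's.

class ConcreteSystem:
    def __init__(self, **state_variables):
        self.state = dict(state_variables)       # system state at a point in time

    def on_event(self, event):
        """An event changes one or more state variable values."""
        self.state.update(event)

class Environment:
    """Elements outside the system that can change its state, or be changed by it."""
    def __init__(self, **state_variables):
        self.state = dict(state_variables)

# A dynamic (multi-state) system: events occur and its state changes.
shop = ConcreteSystem(stock=100, cash=0)
market = Environment(demand=20)

shop.on_event({"stock": shop.state["stock"] - market.state["demand"],
               "cash": shop.state["cash"] + market.state["demand"] * 5})
print(shop.state)   # {'stock': 80, 'cash': 100}

# A static (one-state) system would be one to which no events occur.
```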

The systems thinking journey made by Russell Ackoff over four decades took him from taking a cybernetic view of social entities to making observations on the workings of human organizations. He became a popular author and speaker on the failures of organizations, especially public sector institutions.

He built elaborate hierarchies of aims and activities and distinguished four kinds of system. His classification of these four kinds changed over the years. By 1999 it was based on purposefulness, and by 2003 it had evolved into a form that was based on choice.

Ackoff's final classification of system types

By 2003, Ackoff was talking about concrete social entities rather than abstract systems, or even realizations of abstract systems. If the actors in a social entity choose their own actions and aims, if there is no regularity or repetition, no pattern of behavior we can model, then what is systematic or systemic about that social entity? And does the fact that we make choices prove we have free will? These questions are discussed in later articles.

Further reading: Ackoff's system thinking journey.

Checkland's soft system

Peter Checkland developed his Soft Systems Methodology to model the behavior of business activity systems.

Checkland said “an observer engaged in systems research will give an account of the world, or part of it, in systems terms; his purpose in so doing; his definition of his system or systems; the principle which makes them coherent entities; the means and mechanisms by which they tend to maintain their integrity; their boundaries, inputs, outputs, and components; their structure.” Checkland (1981, p. 102.)

To paraphrase, Checkland’s system (aka transformation) might be defined as a structure of components that interact to consume inputs from, and deliver outputs to, each other and external entities, and maintain the integrity of the entity's internal state.

Checkland complained that his students struggled to understand what he meant by a system: "they get it one day and lose it the next". His soft system is not a comprehensive description of a named business or other social entity. It is an observer’s view of what that entity is and does; it is only a perspective (a Weltanschauung, Checkland called it) of that entity, and its place in the wider world.

For example, to speak of a cruise ship as a system has no useful meaning. Observers with different interests might abstract

  • a transport system,
  • an oil-fired combustion system,
  • a command and control system,
  • an entertainment system,
  • the biological system of each person on board,
  • the social activity system you observe in the ship's casino, and
  • a system for moving money from the bank accounts of passengers into the accounts of the ship owners and employees.

Since we can abstract many different activity systems from the extraordinarily complex behavior of one social entity, and since we model systems with particular interests in mind (to solve a given problem, or meet a given requirement), every abstract system might be called a "soft system".
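
The point that one entity yields many soft systems can be illustrated loosely in code: the same entity description, filtered by different observer interests, yields different "systems". The names below are invented for illustration.

```python
# A loose illustration of soft systems as observer-dependent perspectives:
# one entity description, filtered by different interests, yields different
# "systems". All names here are invented for illustration.

CRUISE_SHIP = {
    "position":        "transport",
    "fuel level":      "combustion",
    "bridge orders":   "command and control",
    "show schedule":   "entertainment",
    "casino takings":  "money movement",
    "passenger bills": "money movement",
}

def abstract_system(entity, interest):
    """Select just those aspects of the entity relevant to one observer's interest."""
    return sorted(aspect for aspect, concern in entity.items() if concern == interest)

print(abstract_system(CRUISE_SHIP, "transport"))        # ['position']
print(abstract_system(CRUISE_SHIP, "money movement"))   # ['casino takings', 'passenger bills']
```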

Beer's POSIWID

Stafford Beer applied Ashby's cybernetic ideas to business management.

  • "Purpose Of a System Is What It Does (POSIWID) is a systems thinking heuristic coined by Stafford Beer, who observed there is "no point in claiming that the purpose of a system is to do what it constantly fails to do." Wikipedia 20/9/2024

To talk about the inexorable outcome of an organic system (a leaf, or a heart beat) as its purpose feels OK. But to talk about the inexorable outcome of an inorganic, but natural system (like a hurricane) as its purpose feels wrong.

For sure, a designed system may fail, or result in contrary "unintended consequences". But it makes no sense to equate its purpose (an intention or aim) with its outcome (or result).

Also:

  • The purposes of directors, workers and system designers may differ, and be compatible or in conflict with each other.
  • The purposes of any system we study are only a fraction of the purposes of the wider business in which that system is realized.
  • In cybernetics and soft systems, any two observers of one business may see different systems, with different purposes; some systems are designed to meet purposes, others evolve without them.

What do we mean by "system"?

Again, broadly speaking, two kinds of system may be defined.

An open activity system in which interrelated actors or components respond to events and interact to produce effects or results of interest.

A business is a social entity whose business operations may maintain tens of thousands of (qualitative and quantitative) state variables, and realize many distinct activity systems of interest. To model the whole as one coherent event-driven activity system is simply beyond us.

A closed system in which interrelated state variable values interact and advance according to a set of rules (as is illustrated in the drawing of causal loop diagrams).

Managers may be taught to selectively define a few quantitative goals/variables ("key performance indicators") to assess the performance of the organization unit and/or logical function or capability they are accountable for, and to cascade these goals down the hierarchical organization structure (as in a "balanced scorecard").
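
Such a cascade can be pictured as a goal tree in which a parent unit's target is split among its children in fixed proportions. The sketch below is a generic illustration, not a description of any real scorecard.

```python
# A generic sketch of cascading a quantitative goal down an organization
# hierarchy, as in a balanced-scorecard style exercise. The structure and
# weights are invented for illustration.

ORG = {
    "company":    {"children": {"sales": 0.6, "service": 0.4}},
    "sales":      {"children": {"north team": 0.5, "south team": 0.5}},
    "service":    {"children": {}},
    "north team": {"children": {}},
    "south team": {"children": {}},
}

def cascade(unit, target, org, results=None):
    """Split a parent unit's target among its children in proportion to weights."""
    if results is None:
        results = {}
    results[unit] = target
    for child, share in org[unit]["children"].items():
        cascade(child, target * share, org, results)
    return results

print(cascade("company", 1_000_000, ORG))
# {'company': 1000000, 'sales': 600000.0, 'north team': 300000.0,
#  'south team': 300000.0, 'service': 400000.0}
```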

The track record of this approach is questionable, for reasons not discussed here. And that hierarchy of measurable variables is not a "system" of variables in the sense of cybernetics or System Dynamics.

What do we mean by "purpose"?

The term "purpose" may be used in various ways. We should distinguish the aims of individual actors, and of a whole social entity or business, from the outcomes of each system that they (more or less) realize.

  • Purpose (psycho-biological): an aim of a living actor, an intent that steers its behavior. It may be entirely personal, or else correspond to a purpose of the next kind.
  • Purpose (psycho-sociological): an aim of a social entity in which actors play roles. It may be given as a direction to the entity and/or agreed by members. The entity may fail to meet the aim, or else succeed because it corresponds to a purpose of the next kind.
  • Purpose (Beer's cybernetics): an outcome of a particular pattern of behavior - regardless of whether the entity that performs it is organic or inorganic.

Surely, POSIWID is best interpreted as saying these three kinds of "purpose" are sometimes aligned, and sometimes not?

Modeling techniques

Being animate and more or less autonomous, the actors in a social entity are sometimes called “agents”. Given a particular interest in a social entity, and using your chosen system theory, you might abstract and model social systems of various kinds. Some of the viewpoints below were mentioned above.

  • To model an I/O transformation, you might draw a material/data flow diagram, which is verifiable, but cannot predict how a system will change state.
  • To model how competing or conflicting social entities can reach an optimal outcome, you might use game theory.
  • To model a multitude of interacting agents, you might build and run an agent-based model (ABM) to see how the system evolves (a minimal sketch follows this list).
  • To see what could happen over time to macro state variables of interest, you might build and run a quantified System Dynamics (SD) model.
  • You might integrate ABM and SD models.
  • You might draw a CLD (an informal abstraction of an SD model) and compare it with a social system archetype of the kind defined by Senge and others.
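
As a minimal example of the agent-based approach mentioned in the list above, the sketch below gives each agent one simple local rule (adopt a behavior once a neighbour has adopted it) and watches an aggregate pattern emerge. The rule and numbers are invented for illustration.

```python
# A minimal agent-based sketch: each agent follows a simple local rule
# (adopt a behavior once at least one neighbour has adopted it), and a
# macro pattern emerges from the interactions. Rule and numbers are
# invented for illustration.
import random

random.seed(1)
N = 40
adopted = [random.random() < 0.2 for _ in range(N)]   # a few early adopters

def step(adopted):
    nxt = list(adopted)
    for i in range(N):
        neighbours = [adopted[(i - 1) % N], adopted[(i + 1) % N]]
        if sum(neighbours) >= 1:        # local rule: copy an adopting neighbour
            nxt[i] = True
    return nxt

for t in range(10):
    print(f"t={t:2d}  adopters={sum(adopted):2d}")
    adopted = step(adopted)
```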

Remarks

Some of today’s social system thinking authors, such as Michael (Christopher) Jackson, have brilliant minds, and have much to say that people find valuable. But some can mislead readers into well-nigh equating system theory (about described systems) with management science (about managing and motivating people).

I suggest we should be cautious about referring to any human social entity as a social system. It is a way of talking that is lazy and can be misleading. If you do it, what do you mean?

Is it an open system that transforms inputs into outputs? Only one such system can be found in the social entity? The transformation is rule-bound, regardless of who the actors are? The transformation is determined by actors within the system?

Or is it a closed system of inter-related variables? Only one such system can be found in the social entity? Its state changes are rule-bound, regardless of who the actors are? Its state changes are determined by actors within the system? It is stable, maintaining its state within a given range? It is progressive, and may exhaust a resource and crash? In short, systems thinkers <create and use> abstract social systems to <represent> real social systems they <observe and envisage> in real world entities.

Social systems in politics

Capitalism is a self-organizing and distributed system that relates demand, supply, and the prices of products and services.

Marxism is a system that divides society into two social entities, oppressor and oppressed, and pitches the latter into an endless struggle against the former.

Among those who impose a hierarchical order on human society, fascists and Marxists are remarkably similar.

Both believe in a social hierarchy, authoritarian government, suppression of opposition, suppression of free speech, and subordination of individual interests to the perceived common good. This description fitted the “far right” movements of the 1930s; today, it fits the “far left” (and theocracies) as well if not better.

"Identity politics" groups people in ways that don't even amount to a social entity, let alone a social system. It arrange groups in neo-post Marxist ay in a “hierarchy of victimhood” as though all white people, all men, all in the West (etc.) are to be blamed, entirely, for any measure by which any other named group perceives itself to be less well off. This divisive view of society might equally be called fascist or racist.

Laws imposed on human society must be kept under review, because a) unforeseen consequences follow, b) people change their behavior to take advantage of or circumvent whatever laws are devised, and c) “Power tends to corrupt, and absolute power corrupts absolutely” (Lord Acton).

The people must have free speech, and the power to kick out the established government. And the establishment must keep changing the rules, incrementally rather than radically.

Remarks on effectiveness

You can find reports of:

  • Game theory used in conflict resolution.
  • A system archetype used to explain why a pattern of behavior is counter-productive, and prompt an intervention.
  • Cybernetics used to design a control system that will adapt (at some level) to changing circumstances.
  • Soft systems methodology used in stakeholder management to uncover different perspectives, and to design a business activity system in which activities are monitored and directed.
  • Beer's ideas used by business management consultants.

I have heard anecdotal accounts of their use. I don’t know whether they are widely used, and I have no measure of their effectiveness, either in use or relative to each other.

We have some indication of success from anecdotal examples, but no objective measure as far as I know. The difficulties in measuring effectiveness are the difficulties facing much social science - viz. how to disentangle the effectiveness of a method from the ability and experience of the people who apply it? And how to measure what would have happened if a different method (or no method) had been followed?

Related articles

If you want to read this article in the context of a book, watch this space. Related articles include:


