navigating invented boundaries

==

“For wildlife to engage in the business of living, species need to be able to move across the landscape.”
Invented Boundaries, Anthropology News
==
“I don’t really care about stuff, but it isn’t really about stuff is it? No. It generally isn’t about the stuff. Even when it is. And even when the motivator is greed. Really just about having. Marking out your territory, making it bigger and bigger, and giving yourself more corners to put your stake in the ground as though more and bigger will protect you from, well, the sense that you can’t really control anything. So maybe more is what protects you from the lack of control.”

SJ Rozan

==
“Organizations are goal directed, boundary maintaining, and socially constructed systems of human activity.”
Howard Aldrich, sociologist

I almost called this ‘machine and the machine in a technology world,’ but let me begin with ‘invented boundaries.’ The first time I stumbled into this thought was in Donella Meadows’ ‘Thinking in Systems.’ Dumbing down her brilliance: she suggests a complex system has no real boundaries, and we, people, create them in order to make sense of the system (or the situation). Conceptually, this is a really important thought. It means most boundaries are imaginary, not concrete solid structures. We invent them to create controlled spaces that protect humans from “uncontrollable factors” (or give us enough certainty about something to ‘do’ something), or as a weapon against humans’ natural imagination and freedom to explore. Either way, it is an attempt to create some definition for a human world within a world which actually defies simple definition. We are surrounded by boundaries. Rules, process, hierarchy, class/caste, playgrounds, fences; pick your environment and I can find you a constraint or boundary. Maybe I am talking about what Robert Ardrey suggested in ‘The Territorial Imperative’ – territoriality as the power of things to impose their own assumption of time and space by means of our sensory involvement in them. All of this gets compounded by whether the boundaries are set by, and for, humans or technology (the machines). I would argue business, in particular, is crafting boundaries and constraints as dictated by machines/technology rather than humans. From there we are off to the “so what do we expect once we have established the boundaries/constraints” races. That said. If there was ever an example of the territorial imperative, I struggle to find a more relevant boundary than growth.

Which leads me to growth (and scale).

Growth is one of business’s territorial imperatives. Yeah. If you buy what I just said, invented boundaries limit growth. That doesn’t mean growth cannot occur, just that because it is bounded, you will never really know how high is up. Interestingly, this is also true of power. The growth of power is less about the machine or technology itself than about how it is used and wielded by the will of the institution (and those in power). That said. ‘The machine’ represents a kind of warped ideal that institutions use to guide tangible output objectives. This has consequences. The main one is that the machine has created a non-human operating context in an environment populated by humans. People live in bounded conditions that, while maybe not less-than-human, are certainly not at optimal humanity. This ultimately creates a disaffection from our senses, leaving a general sense of being unsatisfied, as a human, all the time. And this is mainly because machines and technology craft invented boundaries around the social aspects of people – the networks and connections. What I mean by that is machines and technology have augmented our ability to ‘distribute’ thoughts, ideas, social connections and a variety of things that add value in the marketplace of people and business. In fact, Metcalfe’s Law makes the point: as the number of people on a communications network increases, the number of possible communication paths grows with roughly the square of that number (quadratically, not literally exponentially, but fast enough to feel that way). This is known as the network effect, and it has both good and bad properties. It is fairly clear that in the not-so-distant future AI will help us make all kinds of decisions. Despite technology’s potential for good, it also has an equal potential to affect people and society in not-as-good ways. Maybe worse, technology and machines can be wielded by those in power to manipulate all those things (and constrain them as they see fit). It may sound simplistic, but if we can invent some boundaries and navigate the technologies between the benefits they can bring and the challenges they can create, society will benefit. Often the challenges get showcased in books, movies and media in ways that can make the negatives seem exponentially stronger than any technology positives brought to bear. And then we look at business in general: business, focusing on ‘maximize the return on investment’ as it does, will implement things not with the good of society in mind, even as a sub-objective, but rather to meet some specific business objectives. Add on to that how business views humans and connectivity as ‘social machines’ to be built and optimized and you begin seeing how humans get screwed in this bounded world crafted for them. I bring this up because the technology of the future will allow powerful networks of people to coordinate efforts for effective decision-making, matching people to those who can help solve the most challenging problems or exploit the exponential opportunities, and even crafting governance (constraints) for complex competitive environments. What I mean by this is we are quickly moving towards a time when it will be possible to use technology to create and use people as social machines – passive recipients with aggressive intent – created by institutions (the owners of the technology) to produce specific results. Uhm. That’s directed behavior within a bounded environment. This shapes ‘humans to technology’ rather than ‘technology/machine shaped to humans,’ ultimately creating a fundamental incapacity to organize people. Yeah.
I just said that (despite all the claims technology makes about increasing human potential). I say that because technical/technology imperfections demand technical/technology solutions in this scenario, and while humans are required to operate the ‘machine,’ they do so even when the natural sense/rhythm of humans doesn’t really align with the rules of the automation/technology. The human/machine becomes a weird, less-than-optimal interaction between machine and human within some invented boundary.
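As an aside, the Metcalfe’s Law arithmetic above is easy to check for yourself. Here is a minimal sketch (plain Python, names mine) of how communication paths pile up as people are added to a network:

```python
# Communication paths in a fully connected network of n people:
# each pair can talk, so paths = n * (n - 1) / 2, which grows
# quadratically -- the arithmetic behind the "network effect."

def communication_paths(n: int) -> int:
    """Number of distinct pairwise communication paths among n people."""
    return n * (n - 1) // 2

if __name__ == "__main__":
    for n in [2, 10, 100, 1000]:
        print(f"{n:>5} people -> {communication_paths(n):>7} paths")
    # 2 -> 1, 10 -> 45, 100 -> 4950, 1000 -> 499500
```

Ten people have 45 paths; a thousand people have nearly half a million. That curve is why the social aspects of people are such tempting territory to fence in.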

This gets a bit worse. When things go wrong in an invented boundary space, instead of un-inventing the boundary we try and fix what is within. How? Through machine-like processes. We want to restore meaning and unity and patch together that which machines have stripped. Yet. The only way we seem to do it is through technical means – process, initiatives, tracking, etc. To fix it, we seek to make behaviors and attitudes an object of ‘human processes.’ Once again, the process imposes a technical solution. Oddly, the solution can indeed restore some semblance of meaning and purpose and unity, but only by virtue of total integration of the people INTO the process/system which originally produced the issues. We see the ‘human issues’ as symptoms of people being incomplete parts of the machine system. Problems are symptoms of people just not being in sync (or ‘complete’) with the system, rather than of the system not being well designed for optimal human performance. What this means is that to achieve the goals we actually mean to complete the process (make the system whole). Yes. We seek to modify humans (behaviors, attitudes) to ‘fix them’ so the system (process) will work ‘the way it is supposed to.’ Ponder that as you think through how many times you have heard (or even said) “if they would only do this.” I believe it was Jacques Ellul who said “such persons may exist, but it is probable that the ‘joyous robot’ has not yet been born.” In this ‘man versus machine’ world all processes are related to everything. This machine mentality – the machine’s processes are always the right way things should be done – means we continuously ask, and assess, people to compensate for the disagreeable components, and consequences, of the system. Typically, businesses do this by isolating different components (specific activity) to ‘fix’ those consequences. Once again, in this scenario humans, and their natural instincts for work and craftsmanship, don’t really have a presence except to the degree the individual is subject to economic conditions and to the degree that mechanical conditions permit processes to be exercised UPON the humans. The truth is a focus on efficiency (which every business has whether they accept it or not) creates a conflict between human operational acceptance and operational application on humans. Every human process has some version of a designed, circumscribed sphere of action, and none of them ever really match either the human’s actual skills or their potential. So even though a business may be ‘human-centric,’ it is actually a relatively impersonal technical operation when viewed this way. How do I know? Well. Pick a person off the street and the process will never work on them. From a meta-narrative perspective, this is determinism, not freedom. It is a psychological war in which the individual must be engaged because the environment/context does not create conditions for the natural will of the individual. Process has devolved to technical procedure, not traditional thinking; its evolution (exponential) is too rapid for new traditions of thinking (which undergird attitudes and behaviors) to take hold, therefore thinking – and norms/principles/etc. – simply holds on for dear life to a technological world running wild. I would be remiss if I didn’t point out that when process is embedded into every aspect of life, including society and community itself, it ceases to be an external stimulus and actually becomes part of each individual’s substance (way of being).
The machine is no longer ‘used’ by humans; it has been absorbed by humans and, in some ways, has absorbed them. This gets even worse. The machine represents the ideal toward which ‘techno social’ (or techno productivity) strives, so that ideal becomes, well, how people measure life. Machines may have begun the war on humans, but technology is going to win that war.

“Social growth was formerly reflective or instinctive, that is to say, unconscious. But new circumstances (the machine) now compel us to recognize a kind of social development that is rational, intelligent and conscious. We may ask ourselves whether this is the beginning not only of the era of a spatially finite world, but also of the era of a conscious world.”
Henri Guitton

“Machines sanctioned social inefficiency.”
Lewis Mumford

==

“Humans are not adapted to the world of steel; process adapts them to it.”
Jacques Ellul

==

Which leads me to people & humanness.

Today’s work demands different qualities in a person, none of them really beneficial to the person’s mental health. It subjects everyone to a very similar way of life, tends to put everyone in the same ‘working bucket,’ puts everyone in similar invented boundaries, and actually threatens/rewards everyone the same way. Working people now work, generally speaking, in a constant state of tension, mental pressure, spiritual absence (meaning, not religion) and a system which almost demands a level of submission (to the institution as well as to “the way things are done” in a machine/technology growth system). By being involved in and committed to the machine system, most people simply then seek ways to contrive support for the system. Worse? Exceptional becomes less of a human quality and more a matter of ‘working the system exceptionally well.’ How boring is that shit? So, people go home and into the community emptied of humanity by the machine system and, uhm, what do they talk about? In a business world in which technology makes almost everything work well, or at least well enough, substantive human conversations about life, home, family, etc. diminish, and the focus shifts to the subconscious hollowness where technology, tinged with fear, anguish and despair, seeps into anything that could be positively meaningful. Well. That was a painful sentence to write.

Which leads me to Marshall Durbin on cognitive anthropology in 1973: “Culture is best seen as a set of control mechanisms – plans, recipes, rules, instructions – which are the principal bases for the specificity of behavior and are essential conditions for governing it.”

I buy that there are some organizing principles underlying cultural behavior, and I buy that those principles offer some boundaries or constraints for ‘normal accepted behavior,’ but I chafe on ‘control mechanisms.’ Well. Let me say this. I do not believe culture should be viewed as a control mechanism, but businesses, and institutions in general, are infamous for using culture to attain power through authority and defining what ‘best’ is, so ‘worst’ can be isolated. Or, circling back to an earlier point, the institution wants to manipulate the malleableness of culture. Institutional power seeks to do so because culture is an open system, not a closed system (and those in power hate that). What I mean by that is people are consistently ambiguous and vague, and it is extremely difficult to understand what people are thinking when they make up their minds to do something. Therefore, if humans are in a constant mental flux, the culture itself is a bit malleable. Uhm. Malleable demands control, or boundaries, in the eyes of any business institution; i.e., time to invent a boundary.

Which leads me to speed.

Speed is violent. Therefore, speed is an enemy of boundaries – invented or not. Maybe I could add that speed is also at war with “humanness,” i.e., the boundaries of what humans can handle. Maybe think about this like car racing – either on a closed track (an oval) or an open track (road race). Each has boundaries, limits, constraints, and effective speed exists within the limits and balancing of technology, humans and nature. Beyond that, the easiest example of this is ubiquitous information. The immediacy of information creates a crisis in reasoning power, squeezing and reducing the size of the action so that the action matches the squeezing of time as the field of action. The small bounded space increases the power of almost anyone. An individual with a computer, or even a single bot, can set off a speedy catastrophic chain of events. In addition, speed encourages us to ignore the threat of information proliferation, and the speed of that proliferation allows any irresponsible individual to borrow or own a piece of that information, at speed, and elevate it beyond any boundaries that everyone else may desire to have in place. This, in turn, creates social conflict with those who seek to preserve a principled ecosystem (some do so to protect their smaller community domain and some do so to protect larger society). This also becomes a conflict of rivalries between information. What I mean by that is a social group can establish a boundary within which it contains certain information, and as the speed increases around that boundary, conflict arises as information bombards it. The speed of all this creates a sense of futility. In fact, within the violence of speed, humanity actually stops being diverse in its sense of futility and will, as a consequence, fragment into hopeful communities and despairing communities (and everything in between). That thought is a derivative of a Paul Virilio thought. Anyway. The other way to think about this is using Stewart Brand’s Pace Layering. The outer layer is speed – violently buffeting life, minds, and attention. The inner layer, nature, moves at the slower cadence of behaviors and habits. Speed is constantly attacking culture and the psychological and behavioral insights that give us our natural rhythm. The speed is unnatural, and therein lies the conflict between the boundaries.

Which leads me to “in the box” effective navigation.

Of course we need to slow down a bit, maybe festina lente, but there are aspects of complexity, even within boundaries, which are important. The truth is there are linear, causal aspects within complex systems. Maybe call them fractals, or maybe call them constraints (gates to be opened). Goldratt explained constraints decades ago in his Theory of Constraints, and business and people still ignore what he outlined. I say this to suggest probabilities, predictions and scaling are often dependent upon how successfully one can identify constraints and seek out the fractals, the linear gateways, the connections, that unlock potential and opportunities (a toy sketch of the constraint idea follows the note below).

  • note: the structure of the networks tied to the systems within an organization matters toward meeting productive-end objectives
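To make the Goldratt point concrete, here is a minimal, hedged sketch (plain Python; the pipeline stages and capacities are invented for illustration, not from Goldratt): in any linear flow, throughput is capped by the single slowest stage, so finding the constraint is the first gate to open.

```python
# Toy Theory-of-Constraints illustration (illustrative numbers only):
# in a linear pipeline, throughput is capped by the slowest stage,
# so improving anything BUT the constraint changes nothing.

stages = {            # hypothetical stage -> units it can process per day
    "intake": 120,
    "build": 45,      # <- the bottleneck in this made-up example
    "review": 80,
    "ship": 100,
}

def throughput(capacities: dict[str, int]) -> int:
    """A linear system moves only as fast as its slowest stage."""
    return min(capacities.values())

def constraint(capacities: dict[str, int]) -> str:
    """The stage to elevate first, per Goldratt's logic."""
    return min(capacities, key=capacities.get)

print(throughput(stages))   # 45 -- set by 'build'
print(constraint(stages))   # 'build'

stages["review"] = 200      # improving a non-constraint...
print(throughput(stages))   # ...still 45; nothing changed
```

Note how doubling a non-constraint stage moves nothing; that is the whole point about identifying the right gate before scaling.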


Yeah. Highfalutin theory or not, everything is about connections. Connections between parts of a system as well as connections between, and of, people. The boundaries help us define the connections, and clarity of connections leads to effective navigation.

==

“AI (algorithms) doesn't know that human beings exist at all. From the algorithm's point of view, each person is simply a click history.”
Professor Stuart Russell

==

The truth is the technology is destabilizing to many connections because it stretches existing systems and ways of thinking. As we’ve seen time and time again, technology has a noticeable effect on our social, economic and societal institutions. This creates a dramatic shift in the locus of power which affects what happens within the boundaries of the system. What I mean by that is it all may feel like chaos, but it is simply power applied against the less powerful within a rigid invented boundary (within which the powerless have nowhere to go). That said. Without the power brokers who attempt to control outcomes, especially within boundaries, humans naturally converge – not because of processes, but as natural dynamics of a system and human nature. People naturally pursue progress that is visible, which begets, well, infinite energy and intrinsic power. This is called operational freedom. If done through process, it is operational totalitarianism. To be clear, some boundaries are important, because the more vague the outlines, the more we tend to design processes, and the less freedom we will actually have (even in an open system). That may sound a bit counterintuitive, but systems have both function and purpose – not Purpose, but purpose. It can simply be growth with the ‘how’ left up to the system and the people operating the system.



Which leads me to gamification.

“How” is rarely left up to the people operating the system. In fact, “the system,” more often than not, will manipulate people to operate the system the way the system’s power brokers want them to operate it. In comes gamification. Gamification, in and of itself, isn’t a bad thing. I actually created an online early-education learning platform grounded in gamification. Most online games are grounded in gamification, i.e., fail just often enough to learn how to navigate to the next level (progress). My friend Dr Jason Fox wrote a fabulous book on business gamification called “The Game Changer.” He offers a wonderful section on something called contextual momentum: an outline of how people and organizations can gain some balance between specific goals and open possibilities while still making progress. Gamification at its best helps someone manage for the right things at the right level, where ultimately the motivation is progress more than some simple short-term race to some specific milestone. Alas. To our mutual dismay both Jason and I watched as the power brokers of the system began to use gamification to invent boundaries to, well, game the operating system. Oh. And game people. Therein lie several problems (outcome-wise as well as moral-wise). Suffice it to say gamification focuses on parameters to change things and, yet, there is not a lot of leverage in parameters. They are constraints, not expanders. That is not to say parameters aren’t important as things which can set the stage for all the things to come, but they are not ‘the thing.’ The truth about gamification is that the structure of the system influences behavior and the structure of the situation influences decisions. It is all related. Parameters only provide some constraints or boundaries for how we view situations (either limiting them or offering expansion opportunities).
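To ground the “fail just often enough” idea, here is a minimal sketch of the kind of difficulty tuning gamified systems lean on (plain Python; the target failure rate and step size are invented for illustration, not anything from Jason’s book):

```python
# Toy difficulty-tuning loop behind "fail just often enough":
# nudge difficulty so the player's recent failure rate hovers
# near a target -- enough friction to learn, not enough to quit.

TARGET_FAIL_RATE = 0.3   # hypothetical sweet spot for learning
STEP = 0.05              # size of each nudge (invented value)

def adjust_difficulty(difficulty: float, recent_failures: list[bool]) -> float:
    """Raise difficulty when play is too easy, lower it when too punishing."""
    if not recent_failures:
        return difficulty
    fail_rate = sum(recent_failures) / len(recent_failures)
    if fail_rate < TARGET_FAIL_RATE:        # too easy -> nudge up
        difficulty += STEP
    elif fail_rate > TARGET_FAIL_RATE:      # too punishing -> nudge down
        difficulty -= STEP
    return max(0.0, min(1.0, difficulty))   # stay inside the 0..1 boundary

# Example: a player who cleared the last ten attempts gets a harder level.
print(adjust_difficulty(0.5, [False] * 10))  # -> 0.55
```

Notice it is all parameters: the moment the tuned number becomes the goal itself, you have the “game the people” failure mode the paragraph above laments.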

And maybe that is where I will end.

All boundaries, invented or otherwise, influence how we view situations. We are what we see. We do because of what we think we see. Choose your boundaries wisely. Some may actually be cages.

Ponder.

==

Dr. Jason Fox (comment, 6 months ago):
Brilliant musings, as ever, btw! This always has me thinking in wonderful new ways.

Dr. Jason Fox (comment, 6 months ago):
“Alas. To our mutual dismay both Jason and I watched as the power brokers of the system began to use gamification to invent boundaries to, well, game the operating system. Oh. And game people.” <~ aye, it is tragic—but as you illuminate, it has expanded our boundaries now, too. And all this talk of boundaries has me thinking of Hermes, the god of boundaries (amongst other things). And of how—in the ladder of abstraction from behaviour to habit to parameters to system to paradigm and beyond—much of what we need to tackle probably exists at the level of incentive (and what backs that incentive). In many senses, the current financial system is our cage; and in picking the lock to this (I’m still working on it) I wonder: what is on the other side? You’ve probably mused upon this much already. It seems to be where I’m at right now.
