PLM & OCM Ch. 8: Decomposition Creates Dysfunction

“Prof, I really want to thank you for all the time we spend on this.”

“No worries. I have a lot of fun with it. I have a lot to cover in class and can’t get too deep in any topic, so this gives me a chance to explore some themes.”

“I appreciate it.”

“Careful what you wish for. I want to talk about predictive and adaptive, mass and lean production, and tie it into the context of Christensen’s disruptive innovation. Do you have a coffee? Are you ready?”

“Sure.”

“Ok. Here we go.”

Disruption

According to Disruptive Innovations - Christensen Institute, there are three tenets to disruption:

  • An enabling technology,
  • Which reaches an underserved market,
  • In which everyone benefits because they and others have joined the market.

This last tenet gets to network effects. Think of the early days of Facebook: every time someone new joined, every existing member benefited, and the new members benefited if they could get other friends to join. A disruption requires some critical mass of early adopters, but think about the first 10,000 users each signing up two of their friends. Suddenly there were 30,000 users, then 3 million, and now I think it’s around 3 billion. Another phrase for this is ‘systemic reinforcement’.
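To make that compounding concrete, here is a minimal Python sketch of the arithmetic; the function, the recruitment rate, and the round count are illustrative assumptions, not actual adoption data.

```python
# Toy model of network-effect growth: each cohort of new users recruits,
# on average, two friends in the next round. All figures are illustrative.

def simulate_growth(seed_users: int, recruits_per_user: float, rounds: int) -> list[int]:
    """Return the cumulative user count after each recruiting round."""
    totals = [seed_users]
    new_users = seed_users
    for _ in range(rounds):
        new_users = int(new_users * recruits_per_user)  # this round's recruits
        totals.append(totals[-1] + new_users)
    return totals

# 10,000 early adopters who each sign up two friends -> 30,000 total users,
# and the compounding continues from there.
for round_number, total in enumerate(simulate_growth(10_000, 2.0, 8)):
    print(f"after round {round_number}: {total:,} users")
```

The exact numbers don’t matter; the point is the systemic reinforcement, where each round of adoption makes the next round larger.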

Mass Production

There was a nascent automobile industry prior to Henry Ford, but each vehicle was handmade for its owner, and the primary job of a chauffeur was to keep the car serviced. Before Ford, every part had to be hammered into place because no two pieces of metal were ever shaped the same. Cars were expensive to produce, expensive to maintain, and only the very wealthy could afford them, meaning the automobile couldn’t reach an underserved market and create critical mass.

Ford’s enabling technology was in a new class of machine tools which could work on hardened steel, meaning the metal wouldn’t change its shape from one vehicle to the next. This allowed for interchangeable parts, and suddenly any engine would fit into any Model-T, and any wheel would fit onto any hub.

Wheels on Assembly Line

Manufacturing became easier, meaning that prices could fall to the point that farmers could purchase cars and tractors. Further, farmers could repair the vehicles themselves, using tools they already had, and a replacement part ordered from a catalog and delivered by mail would fit the broken tractor sitting in the barn. The expense of the first car or tractor was more than offset by the additional revenue gained through growing and delivering more food, which allowed the farm to expand and purchase even more cars and tractors. The market for automobiles increased from about two thousand per year in 1910 to two million in 1920. This in turn led to an agricultural revolution, meaning less human labor was needed per quantity of food delivered, and people could live further from their sources of food, in the cities and eventually the suburbs. Mass production became a manufacturing force adopted in nearly every industry, not least the US defense industry in World War II.

Quality Lapses

But in the post-war period, US automakers could not maintain quality as production volume increased. During final assembly, every component must be available at the time it is needed, and if any item needed to be scrapped, the entire assembly line had to wait until a replacement was found. Suppliers manufactured excess inventories and assumed that some parts would be scrapped, as this seemed simpler than fixing the root cause of thousands of occasional manufacturing problems. But the overall quality of the final product suffered.

In a Ford plant in the 1920s, as many as 50 languages might be spoken, and not all workers spoke English. Assembly line work was decomposed into small tasks whose instructions could be easily translated, and each worker specialized in the 60 seconds of work required at his station and was told not to worry about whatever was happening to his left or right. (Nearly all workers were male.) Shortly, I will discuss the importance of collective learning, which was available to Toyota but not to Ford, due to these issues of language and differing cultural norms. And there was no strong desire to change, as mass production seemed to be working very well, provided the process accepted a certain amount of scrap. But of course, this scrap became more and more problematic as cars added new features, meaning even more parts and more possibilities for the assembly line to stop. In Fordist plants, the overwhelming focus was to keep the line running, and an employee could be fired for intentionally stopping it.


Model T Assembly Line (1934)

The mass production paradigm of narrowly focused, highly specialized work with little cross-collaboration expanded beyond the manufacturing floor and into the engineering offices, where an engineer who designed a door latch might never meet the engineer who designed the post to which the door latches. Broadly speaking, this continues today as product designs and project implementations are decomposed into components, with individuals or small teams expected to fulfill individual requirements with little understanding of how the nuances of their decisions impact others.

Compartmentalization further expanded under Alfred Sloan of General Motors. While Henry Ford built a single business from the ground up and kept many of its day-to-day workings in his head, Sloan managed a conglomerate of existing auto companies, hence the name “General” Motors. Sloan created divisions and departments, assigned heads to each, and measured their effectiveness on a few financial metrics, such as revenue, earnings, and market share. The heads were incentivized to make their numbers look good, but in managing a highly complex and rapidly growing organization, it became possible for the numbers to look better than they objectively were.

Decomposition vs. Quality

There have been efforts to address the failings of mass production, but in class we talk about the GM Ignition Switch recall of 2014 and the Boeing 737 Max crashes of 2018 and 2019, where cultural inhibitions against sharing information and a lack of collective learning led to scandals, hundreds of deaths, and billions of dollars in losses. You can find a lecture on both here: Systemic Complexity (patrickhillberg.com). Scandalous product failures, and many failures which do not reach the level of scandal, originate with a top-down distribution of requirements recursively decomposed into divisions and departments, suppliers, and specialists, who are not encouraged to collaborate cross-functionally and learn collectively from each other. Look at the words that companies use to describe themselves: “division”, “department”, “component”. These are all words that imply separation.

This brings us back to the DARPA slide from our last meeting:

And here is another slide, which reflects my own feelings on the topic. We begin in the top-left with our expectation of what will be valuable in the top-right, but it’s a big project, and no one can be expected to hold all of it in their heads at once, so the project is decomposed into pieces. But this then leads to an increased need for communication, and a seemingly infinite number of interminable meetings.


Imagine a meeting filled with people who each have individual goals and feel pressed to meet a schedule. They pay little attention to others, as they are focused on their own objectives and have little incentive to collaborate. A top-down decomposed environment prioritizing individual performance metrics is antithetical to cross-functional collaboration. As I’ll get to shortly, the Ford-Sloan mass production paradigm is antithetical to strategic PLM.

A lack of authentic collaboration inhibits quality, and as problems become worse, individuals and decomposed teams will argue over who is ‘right’. Since each team is tasked with achieving a different goal, each may accomplish what it set out to do, and yet the product still fails to perform. This leads to further recrimination and cultural walls, where individuals are more concerned about social acceptance within their small team than with achieving a larger systemic whole.

Of course, in all products there are requirements which must be met, such as those regarding government regulations and personal safety. But as seen in scandalous product recalls, and many lesser product failings, decomposition creates the dysfunctions which lead to quality issues, and the organizational pressure to meet predicted goals eventually leads to unethical decisions within siloed groups. The Boeing and GM scandals are examples of the ethical conflicts that product developers face.

Lean, and the Toyota Production System

In the post-war period, US and European automakers operated dozens of assembly plants, each specializing in a particular model manufactured hundreds of times per day. Product runs would last for months or years.

In contrast, Japan had very little manufacturing capacity remaining at the end of the War, but a strong cultural desire to rebuild the nation, and governmental mandates which restricted foreign competitors. Toyota’s Eiji Toyoda and his head of production, Taiichi Ohno, recognized a market for a few thousand vehicles per year, but had little by way of manufacturing resources. They had only a small number of manufacturing lines and needed them to make short runs of a product, then switch quickly to run a different model.

Ohno developed an enabling technology in Toyota’s ability to change stamping dies in just a few minutes, as opposed to the weeks it might take in a Fordist plant. This is documented in the seminal book on Lean Manufacturing, The Machine That Changed the World:

Ohno perfected his technique for quick changes. By the late 1950s, he had reduced the time required to change dies from a day to an astonishing three minutes and eliminated the need for die-change specialists. In the process, he made an unexpected discovery—it cost less per part to make small batches of stampings than to run enormous lots. There were two reasons for this phenomenon. Making small batches eliminated the carrying cost of the huge inventories of finished parts that mass-production systems required. Even more important, making only a few parts before assembling them into a car caused stamping mistakes to show up almost instantly.
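Ohno’s discovery can be sketched as a toy cost model. Every figure below (unit cost, carrying rate, consumption rate, defect probability) is an invented assumption chosen only to show the direction of the effect: carrying cost grows with batch size, and a bad die scraps the whole batch before assembly catches it.

```python
# Toy model of Ohno's observation: small batches reduce both the carrying cost
# of finished-part inventory and the number of bad parts stamped before a die
# problem is noticed at assembly. All figures are illustrative, not Toyota data.

def cost_per_part(batch_size, unit_cost=10.0, carrying_rate=0.02,
                  parts_used_per_day=500, defect_probability=0.01):
    """Average cost per part for one stamping batch."""
    days_in_inventory = batch_size / parts_used_per_day
    carrying_cost = carrying_rate * days_in_inventory * batch_size / 2  # average stock level
    # If the die is bad, the whole batch is stamped before assembly flags the problem.
    expected_scrap_cost = defect_probability * batch_size * unit_cost
    total_cost = batch_size * unit_cost + carrying_cost + expected_scrap_cost
    return total_cost / batch_size

for size in (10_000, 1_000, 100):
    print(f"batch of {size:>6}: ~${cost_per_part(size):.2f} per part")
```

Under these assumptions the per-part cost falls as batches shrink, which is the counterintuitive result the book describes.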

In Fordist plants (a term used in the book), poor-quality components were considered an acceptable loss in the goal of never stopping the assembly line. In Toyota plants, poor quality was considered a reason to stop the line and look for the root causes of problems.

Unlike in Ford’s plants, the homogeneous nature of the Japanese workforce proved beneficial for Toyota: workers spoke a single language and were intrinsically motivated to help Japan recover after the war. This provided an environment for groups of cross-functional workers (known as ‘quality circles’) who collectively learned to solve systemic problems. Imagine two stamping machines which formed mating parts, and one machine had developed an occasional wobble. In Ohno’s plants, parts would be assembled shortly after being stamped, and if a worker noticed that the parts weren’t mating well, it was his responsibility to shut down the line and collect the people needed to find the machine and correct the wobble.

Toyota Production System

The key point in the early development of Lean methods is that lowering work-in-process inventories forced problems to be exposed quickly, and Toyota developed a cultural imperative of collective learning through short-cycle feedback loops.

In the Fordist method, the stamping machine operators, as well as the engineers who designed the machine’s dies, worked in silos; they couldn’t see the impact of their actions, nor could they quickly and collectively learn from each other. In the Ohno method, everyone was responsible to, and communicated with, everyone else. Toyota workers were encouraged to stop the line to address problems, while stopping the line in a Fordist plant was grounds for termination. Yet in practice, Toyota’s assembly lines stopped far less frequently than the Fordist plants’.

Using Agile to Enable Collective Learning

Adaptive methods like Agile and Scrum are the Lean response to predictive, decomposed project management approaches like the Vee and Waterfall. The purpose of Agile is to lower the inventory of up-front requirements and to develop a cultural imperative of learning collectively from each other in real time.

The following two slides express this point.


In the first, the grey Vee shows how the entirety of high-level requirements is defined at the top-left and decomposed as the project moves toward the vertex, with components tested in ever-larger groupings as the project moves up toward the top-right. But the downward arrow questions whether systemic quality can be maintained as the work is decomposed, and the horizontal arrow questions whether the client will still value, at the top-right, what was initially required at the top-left. The point is that the project’s ability to achieve the vision will suffer through decomposition, and that its perceived value will degrade over time as the customer learns to want new things. This is especially true if the project is innovative, meaning the client was not in a position, early on, to understand what they would find valuable later.

The next slide is a different perspective on the same concept, and originates from the book Essential Scrum, by Kenneth Rubin:

Again, the predictive approach is to first gather requirements and then develop functionality, as represented by the solid blue lines. But as in mass production, this creates a problem of an excess inventory of poor-quality requirements. By the end of the project, teams will face both a decomposition failure, in which the actual solution fails to meet what was required (indicated by the blue target), and a scope failure, in which the customer no longer wants what had been agreed upon at the beginning (the green target). And at that point, there is frequently an argument over what was agreed upon.
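Here is a minimal simulation of that scope failure, assuming customer needs drift a little each month and that feedback means re-baselining the plan on current needs; the drift model and all parameters are invented for illustration.

```python
# Toy illustration of scope failure: customer needs drift while the team builds
# against a frozen requirements snapshot. A predictive project gets feedback only
# at delivery; an adaptive one re-baselines every iteration. Numbers are invented.
import random

def scope_gap(months: int, realign_every: int, drift_per_month: float = 1.0) -> float:
    """Accumulated mismatch between what is wanted and what is being built."""
    rng = random.Random(7)          # same drift path for both strategies
    need, plan, gap = 0.0, 0.0, 0.0
    for month in range(1, months + 1):
        need += rng.uniform(-drift_per_month, drift_per_month)  # the customer keeps learning
        gap += abs(need - plan)     # mismatch being built into the product this month
        if month % realign_every == 0:
            plan = need             # feedback: re-baseline the plan on current needs
    return gap

print("predictive (feedback at month 24):", round(scope_gap(24, realign_every=24), 1))
print("adaptive (feedback every month):  ", round(scope_gap(24, realign_every=1), 1))
```

The toy model typically shows the accumulated mismatch growing the longer feedback is deferred, which is the green-target scope failure in Rubin’s slide.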

PLM Strategy as the Antithesis to Mass Production

We’ve discussed PLM strategy as being distinct from PLM technology. If we strategically wish to take a holistic approach to managing the product lifecycle (Ideation, Creation, Realization, Support, and Disposal), we cannot rely on the mass production paradigm of decomposing small tasks onto specialists and expect the puzzle pieces to simply come together. The whole will be less than the sum of the parts.

Strategically, PLM is an extension of the Lean approach of shutting down the line to find the root problem, and PLM technology enables the strategy through visualization, simulation, and cross-functional collaboration.

Or at least it could. Unfortunately, the management methods of Ford and Sloan continue to permeate business thinking. If your current business methods emphasize and promote divisiveness through individualized metrics, PLM technology may provide value by removing some inefficiencies. But truly leveraging a Product Lifecycle Strategy, as part of a Digital Transformation, requires a systemic rethinking of how your client’s business will, in the future, bring value to its customers, its stakeholders, and society at large.

Take, for example, the Boeing 737 Max crashes. There was an established design paradigm, adopted by all the aircraft makers and approved by the FAA, that a pilot was expected to identify any problem within 1 second and solve it within the following 3 seconds, leading to an expectation that any problem could be resolved in 4 seconds. However, ergonomic studies performed in the decades prior to the crashes showed that the assumption was not accurate: it could take pilots much longer than 4 seconds to resolve a problem.

A PLM technology approach has been used for decades by the aircraft manufacturers to make existing processes more efficient. But to strategically manage product lifecycles - to avoid scandalous failures - airlines and manufacturers would have needed to record every pilot action in real-life situations and test the validity of the 4-second assumption. This would have challenged business norms across the industry, but it could also have avoided a scandalous product failure.
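As a sketch of what testing that assumption against recorded behavior might look like, here is a hypothetical analysis of pilot response times; the data values are invented for illustration, not taken from any flight-data study.

```python
# Hypothetical check of the "any problem is resolved within 4 seconds" design
# assumption against recorded pilot response times. The sample data is invented;
# in practice it would come from flight-data recorders or simulator sessions.
from statistics import quantiles

recorded_response_seconds = [2.8, 3.5, 4.1, 3.9, 6.2, 3.2, 9.8, 4.4, 3.7, 12.5,
                             2.9, 5.1, 3.3, 7.4, 4.0, 3.6, 8.9, 3.1, 4.8, 10.2]

ASSUMED_LIMIT = 4.0  # 1 second to identify + 3 seconds to resolve

exceeded = [t for t in recorded_response_seconds if t > ASSUMED_LIMIT]
p95 = quantiles(recorded_response_seconds, n=20)[-1]  # roughly the 95th percentile

print(f"{len(exceeded)} of {len(recorded_response_seconds)} recorded events "
      f"took longer than the {ASSUMED_LIMIT:.0f}-second assumption")
print(f"95th percentile response time: {p95:.1f} s")
```

The analysis itself is trivial; the strategic difficulty is getting the cross-functional agreement to collect the data and act on what it shows.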

We’ll pick this up later, in a discussion of Digital Twins.

Use PLM and Agile to Correct the Errors of Mass Production

This discussion has conflated two seemingly different topics:

  • How to strategically manage product lifecycles, and
  • How to lead a PLM technology implementation team

I posit that these concepts overlap, and that the purpose of PLM technology is to enable a PLM strategy whose aim is to increase cross-functional collaboration and collective learning across the many stakeholders. PLM technology provides tools for visualization, simulation, managing part and configuration changes, procurement releases, and other functions related to the bill of materials. These are all important PLM technology functions, but a PLM strategy is about using those tools to overcome organizational dysfunction.

Complicated products are subject to organizational complexity, meaning that we can’t understand the product without also understanding the organization which created it. Will PLM technology be used strategically? A test of a company’s cross-functional capabilities can be found (or not) in its change management practices. A low-cost change in one area may lead to high-cost changes in others. Is everyone informed? Are they surprised? Do they willingly promote the need for change, even if that change conflicts with departmental goals?
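As a minimal sketch of that cross-functional test, here is a toy ‘where used’ traversal over a bill of materials that lists who should be informed when a part changes; the part names, owners, and structure are invented, not taken from any PLM system.

```python
# Toy bill-of-materials: when a part changes, walk the "where used" relationships
# upward and list every assembly and owning team that should be informed.
# Part names, owners, and structure are invented for illustration.

where_used = {                      # child part -> assemblies that use it
    "door-latch": ["door-assembly"],
    "latch-post": ["body-side-panel"],
    "door-assembly": ["vehicle"],
    "body-side-panel": ["vehicle"],
}
owner = {
    "door-latch": "Closures Engineering",
    "latch-post": "Body Structures",
    "door-assembly": "Closures Engineering",
    "body-side-panel": "Body Structures",
    "vehicle": "Vehicle Integration",
}

def affected_by(part: str) -> set[str]:
    """Every higher-level item impacted by a change to `part`."""
    impacted, frontier = set(), [part]
    while frontier:
        current = frontier.pop()
        for parent in where_used.get(current, []):
            if parent not in impacted:
                impacted.add(parent)
                frontier.append(parent)
    return impacted

changed_part = "door-latch"
impacted = affected_by(changed_part)
teams_to_inform = {owner[p] for p in impacted} | {owner[changed_part]}
print(f"Changing '{changed_part}' impacts: {sorted(impacted)}")
print(f"Teams to inform: {sorted(teams_to_inform)}")
```

A real PLM system runs this traversal against the managed product structure, but the question it answers is organizational: who needs to know, and will they act on it even when it conflicts with their departmental goals?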

If goals are compartmentalized within divisions, is it possible to manage a cohesive product lifecycle?

Strategic PLM is intended to overcome a fundamental failing of mass production: organizational separation and its resulting dysfunction. If the organization wants to provide better products, it needs to create a better culture.

And finally, it’s important that you manage your PLM technology implementation in a manner consistent with the promise of PLM strategy. You need to walk the talk of cross-functional collaboration and collective learning. Predictive and decomposed approaches worked well a century ago for Henry Ford and Alfred Sloan, but they have reached their limits, and realistically your client will need to change.

Wrapping up

Prof asked, “Are you still with me?”

“Uh, what was the middle thing, again?”

“Ha! So, you’ve seen A Fish Called Wanda? I love that movie.”

“Yes, ages ago, but that’s my go-to line when my brain is overwhelmed. I really appreciate this… the whole thing is too much for my audience on Friday, but I’d like to use some of your slides, if that’s okay.”

“Of course.”

“I see your point, and the large-scale concepts are beginning to come together. It gives me a long-term vision to work from, and I’m sure I’ll introduce these topics as the project moves on. It provides a backing for why we’ll use Agile to implement the technology.”

“Good! My work is done.”

“You have me thinking about my own performance reviews within Spacely. Every year I get a set of objectives, which are based on my boss’ objectives, which are based on her boss’ objectives, and on up the line. But evaluating the objectives seems pretty subjective, and they seem unrelated to the work that I know needs to be done.”

“Let’s come back to that sometime, but the consulting firm Deloitte found something similar, and gave up on the recursive cascade of performance objectives. In that far distant future where you have ten minutes, take a look at this article: Reinventing Performance Management (hbr.org).”

“Thanks much. Talk soon?”

“Set up a meeting with me anytime. Good luck!”

“Thanks!”

Learning Goals

  • Mass Production, as developed by Ford and Sloan, requires decomposing work into small components and creating individualized incentives to accomplish a large quantity of small tasks.
  • Lean Manufacturing recognizes that decomposition creates dysfunction and overcomes this by eliminating the inventories which hide problems so that the problems can be solved through collective learning.
  • Predictive approaches, like the Vee and Waterfall, are analogous to mass production, while adaptive approaches, like Agile and Scrum, are analogous to Lean.
  • Lean, Agile, and Strategic PLM can be used to overcome organizational dysfunction, but the adoption of PLM technology faces headwinds, as the strategy challenges cultural norms.


Next: PLM & OCM Ch.9: Running an Agile Project via Collective Learning

Patrick Hillberg Ph.D.


I received this question in a DM: "is all decomposition bad or is only too much decomposition too bad?" FWIW, I originally used the aphorism "Decomposition is a Necessary Evil", but we discuss this in my first lecture of the semester (before I understand any religious sensitivities in the class), so I deprecated 'Evil' in favor of 'Dysfunction'. Decomposition is necessary, and it will necessarily create dysfunction. No single person can understand the complexity of our modern products (even bananas may go through a 5-layer supply chain), but managers should presume that decomposing work onto specialists will create hidden dysfunction, the antidote to which is Collective Learning. (Will discuss soon.) The second video in the link below is fun (and not mine), about a pedestrian bridge in London that swayed uncomfortably when it first opened to the public. At about 18:30, math professor Steven Strogatz discusses how 'reductionism' (aka decomposition) has been fundamental to the scientific method but fails to address systemic complexity. Systems are more complex than the summation of their parts, ergo 'Decomposition Creates Dysfunction'. https://www.patrickhillberg.com/post/systemic-complexity
