The H.G. Wells school of Process Mapping

Anybody reading this has probably been exposed many times to the concept that time is the fourth dimension. But when Wells first introduced the idea in The Time Machine, back in 1895, it was novel. No longer were three coordinates in space enough to describe a place or object. The look and feel of the very same place would be markedly different at different times.

And so, dear engineers, is it with process maps. They are static. Look at our blindfolded friend in the picture above. He thoroughly knows the process of crossing the street, but at the implementation stage he takes his life in his hands.

That's because a process map doesn't properly consider variability, stochasticity, and exogeny.

Wait, did I lose you? Here's a tip. You can always pick out the simulation modelers within any project because they refer to "Exogenous Stochastic" events. Whereas the less techy people will say something like "external random" events. (And the takeaway at the C-suite will be "shit happens", with a shrug.) But they all mean the same thing. And from the process map point of view, it means that things don't always work out as they were supposed to.

Still, everybody wants to start with how-it's-supposed-to-work (HIStoW), immortalized in a process map (swimlanes, color-coding, etc. are optional). The map shows you the rules, but not the exceptions. The intent, not the results.

Process can be like grammar; there are more exceptions than rules.

Now the statistical process control guys (they're easy to spot in their colored, incongruous Six Sigma belts and matching clipboards) are always worried about variability: different operators have varying skill levels and speeds, and so on. The Theory of Constraints crowd are all fussed up about bottlenecks, focusing on the single most critical step so they can eliminate it and simultaneously give birth to a new, subsequent single most critical step. And the Lean guys are all a-hunting for Muda: excess travel, excess inventory, overproduction, and so forth.

Funny thing though, a process map doesn't provide any of these key diagnostics. It is a graphic depiction, without scale or volume, of all the pathways a product or activity might use to move through a process. Might is right. Without gathering performance data you wouldn't know if 99% pass a certain inspection point or if the number is more like 50%, suggesting massive rework. You wouldn't know something as basic as how long it takes to move product through the process. Or what happens if the raw-material arrival rate varies. Or if machine #3 breaks down more frequently. Or how close machine #3 is to machine #4. Or whether machine #4 is idle half the time because it does its task faster than machine #3 can feed it.
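To make that concrete, here's a back-of-envelope sketch in Python of the kind of figures a map alone can't give you but a little data plus arithmetic can. Every number below is an illustrative assumption I made up, not data from any real line:

```python
# Back-of-envelope numbers a process map alone cannot give you.
# All figures below are illustrative assumptions, not real data.

first_pass_yield = 0.50      # fraction passing the inspection point on the first try
inspect_and_rework_min = 12  # minutes per trip through inspect-and-rework

# With a rework loop, the expected number of trips through the step is 1 / yield.
expected_passes = 1 / first_pass_yield
extra_minutes_per_unit = (expected_passes - 1) * inspect_and_rework_min
print(f"Expected trips through inspection: {expected_passes:.1f}")
print(f"Hidden rework time per unit: {extra_minutes_per_unit:.0f} min")

# Machine #4 idles whenever machine #3 can't feed it fast enough.
machine3_cycle_min = 8.0     # minutes per part coming off machine #3
machine4_cycle_min = 4.0     # minutes machine #4 needs per part
machine4_utilization = machine4_cycle_min / machine3_cycle_min
print(f"Machine #4 utilization: {machine4_utilization:.0%} (idle {1 - machine4_utilization:.0%} of the time)")
```

Two lines of arithmetic and the map suddenly has numbers attached. That gap, between the picture and the performance, is what the rest of this piece is about.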

Before you rush at me waving your VSMs and Minitabs, the process engineering version of pitchforks and torches, let me state that I know these features are not the intent of process mapping. But my thesis is that they should be. When process maps were first introduced at the 1921 ASME conference (by Frank Gilbreth, a personal hero and the subject of the book "Cheaper by the Dozen"), the only tool we had was paper.

In that era, an engineer or manager would also write longhand and give it to a secretary to 'process' on her elaborate typewriting machine. You don't do that anymore, right? We've been doing our own typing since the late '80s (IBM got rid of its typewriter business in 1991). Now your words are 'dynamic' -- you can reformat the whole page, spell/grammar check, cut/paste, add hyperlinks, use markdown -- all features that were impossible with static typed pages.

So why are you still drawing static flowcharts? You're not, you say? Sure, Visio and about a hundred other tools can be used to rapidly create and edit a flowchart. It's still static. We've paved the cowpath and we get to the wrong tool, faster! Isn't progress marvelous?

It's time to add H.G. Wells to Gilbreth and bring time into flowcharts. Watch items flow while generating accurate distances, times, and throughputs. Allow for variability by adding statistical error rates and production times at each step. Add exogeny by sampling probabilistic product arrival rates or quality at the front end.
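If you want to see what "adding time" means in practice, here's a minimal sketch using the open-source SimPy discrete-event simulation library (pip install simpy). It's the same three flowchart boxes -- arrive, machine, inspect -- but with exponential arrivals, a noisy processing time, and a first-pass yield. Every rate and distribution below is an assumption invented for illustration, not anybody's real data:

```python
# A toy "flowchart with time added": arrive -> machine -> inspect (with rework).
# Uses the open-source SimPy library; all parameters are illustrative assumptions.
import random
import simpy

RANDOM_SEED = 42
MEAN_ARRIVAL_MIN = 6.0               # exogenous arrivals: exponential inter-arrival times
PROCESS_MIN, PROCESS_SD = 5.0, 1.0   # machine time: normal, truncated at a small minimum
FIRST_PASS_YIELD = 0.9               # probability a part passes inspection (else rework)

completed_times = []                 # time-in-system for every finished part

def part(env, machine):
    """One part flowing through the line, looping back on a failed inspection."""
    start = env.now
    passed = False
    while not passed:
        with machine.request() as req:       # queue for the machine
            yield req
            yield env.timeout(max(0.1, random.normalvariate(PROCESS_MIN, PROCESS_SD)))
        passed = random.random() < FIRST_PASS_YIELD   # inspection; fail -> rework loop
    completed_times.append(env.now - start)

def source(env, machine):
    """Exogenous, stochastic arrivals feeding the line."""
    while True:
        yield env.timeout(random.expovariate(1.0 / MEAN_ARRIVAL_MIN))
        env.process(part(env, machine))

random.seed(RANDOM_SEED)
env = simpy.Environment()
machine = simpy.Resource(env, capacity=1)
env.process(source(env, machine))
env.run(until=8 * 60)                # one eight-hour shift, in minutes

print(f"Parts completed: {len(completed_times)}")
print(f"Mean time in system: {sum(completed_times) / len(completed_times):.1f} min")
```

Run it a few times with different seeds and you'll see the point about stochasticity immediately: the "same" process gives you a different shift every time.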

Now, you've got something. And what you've got is called a simulator. You can not only quantify the current state, you can run experiments and test multiple parameters. Every young process engineer knows that, at least in many companies, the process map is an icon -- it's developed, placed on the wall, and saluted. But all the grizzled veterans on the floor ALREADY knew all that stuff - you just spent $50k to take a strange photograph of the rules they're familiar with. It's the what-if experimentation, the optimization potential, the analysis of future scenarios that really adds value for them. (Trainees and MBAs can really use the maps, though.)

There are many quality tools, but let me just mention three candidates, sampled from very different feature levels.

Apprentice Level Tool - BPS Simulator   This is a newcomer, and it's one of the very few simulation tools offered as SaaS; it's too soon to know if this approach will catch on or not, but it certainly makes upgrades easy! This tool is really inexpensive: the basic plan is free and the Plus Plan (Object library, Import, no ads) is just $5/mo. That said, it's lacking a LOT of features. It's a 2D tree structure with AND and XOR logic that's more akin to Boolean mapping or FMEA than discrete event simulation. But I think there's serious potential for growth and, at least, it's a reasonably priced introduction. https://www.bpsimulator.com/

Journeyman Level Tool - Process Simulator  Process Simulator, by ProModel, can be thought of as a super-plug-in for Visio that brings existing static flowcharts to life. Right-click on process boxes and add simulation parameters. Then press play and watch the animation as items move along your process lines, generating statistics and control panels. One use case, and what initially attracted me to the product years ago, was that process improvement consultants could use a client's existing Visio flowcharts as a starting point and merely add the simulation properties, processing times, initial conditions, etc. What a massive time saver. But then you have to take into account that very few of these clients' process diagrams are going to be good enough to base your work on. Still, a moving, calculating Visio is better than one which just stares back at you. Process Simulator used to be a little north of $3k, but now seems to be available in a no-risk evaluation version; I don't know for how long. (An installed 32-bit version of Visio is required, which will set you back $600 and keeps you from selecting the $15/mo Visio Pro for Office subscription, which seems like the better deal. But you may already have Visio.) https://www.promodel.com/Products/ProcessSimulator

Master Level Tool - FlexSim   What the two previous tools require from you is a good deal of abstraction. A true 3D DES tool like FlexSim allows you to build scale models of your facility, with the right equipment and environment, directly on an imported floor plan. Thus, distances traveled in your animations will be actual distances, and material moved by conveyor will look like material moving on a conveyor (and not a line connecting two boxes on a flowchart). FlexSim can simulate manufacturing, warehousing, material handling, robots, Automated Guided Vehicles, hospitals, logistics -- basically, anywhere stuff moves. FlexSim has a lot built in but also allows you to import almost any kind of 3D graphic as well as CAD output. Let's say you have a meeting with Acme Manufacturing to sell them your Improverator 3000. Fine. Show up with a demo model that looks like the Acme plant, with the Acme equipment and Acme's business rules. Then show them an "after" version with your Improverator 3000 running and enabling a faster throughput or reduced cycle time. Have a dashboard showing the short ROI time. That's just one application. If you have a major redesign or expansion, you should consider testing things in silico. (The glorious thing is that nobody will see your mistakes, just the beautiful final design.) FlexSim is variously priced but, interestingly, they're one of the few companies that honor a competitor's license for an upgrade. https://www.flexsim.com/

Now, please don't move to simulation if you just want to make a representation of the Current State. A flowchart is fine for that. Even if you make a true scaled, photorealistic, 3D version in FlexSim complete with an Oculus Rift interface, it STILL won't be as nice as the real thing. And the real thing is RIGHT THERE! Just outside your client's door. No, you want to make simulation models for one reason only: experiments. (Now say that again, to yourself, only this time in your best Peter Lorre impression while rubbing your hands together.) Experiments. (Good!) Because you already know what your facility or factory is currently doing. What you really need to know is what happens if you replace Machine X with a more expensive new model, or if the MTBF of Machine Y increases, or if orders pick up 20%, or if you move to double shifts, or if you can squeeze 10% out of opex... A correct answer to any of these, and many more, questions might be the difference between business failure and success. And eventually, one of them will be. After all, you're using computer software to balance your cashflow (it's probably called Quicken or SAP, or one of those). Now you can use simulation software to balance your production line.
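And since experiments are the whole point, here's a hedged sketch of what that looks like in code: the same toy single-machine line from the earlier SimPy example, wrapped in a function so that scenarios like "orders up 20%" or "buy the faster machine" can be run side by side. Again, every parameter is an illustrative assumption, and a real study would use far more replications, a warm-up period, and proper confidence intervals:

```python
# Scenario experiments on a toy single-machine line with rework.
# Uses the open-source SimPy library; all parameter values are illustrative assumptions.
import random
import simpy

def run_shift(mean_arrival_min, process_min, first_pass_yield, seed=1, shift_min=480):
    """Simulate one shift; return the number of parts completed."""
    random.seed(seed)
    env = simpy.Environment()
    machine = simpy.Resource(env, capacity=1)
    done = []

    def part(env):
        passed = False
        while not passed:
            with machine.request() as req:        # queue for the machine
                yield req
                yield env.timeout(max(0.1, random.normalvariate(process_min, 1.0)))
            passed = random.random() < first_pass_yield   # inspection; fail -> rework
        done.append(env.now)

    def source(env):
        while True:
            yield env.timeout(random.expovariate(1.0 / mean_arrival_min))
            env.process(part(env))

    env.process(source(env))
    env.run(until=shift_min)
    return len(done)

scenarios = {
    "baseline":          dict(mean_arrival_min=6.0, process_min=5.0, first_pass_yield=0.90),
    "orders up 20%":     dict(mean_arrival_min=5.0, process_min=5.0, first_pass_yield=0.90),
    "faster machine":    dict(mean_arrival_min=6.0, process_min=4.0, first_pass_yield=0.90),
    "better first-pass": dict(mean_arrival_min=6.0, process_min=5.0, first_pass_yield=0.99),
}

for name, params in scenarios.items():
    # Average over a few replications so one lucky seed doesn't decide the investment.
    runs = [run_shift(**params, seed=s) for s in range(5)]
    print(f"{name:>18}: {sum(runs) / len(runs):.1f} parts per shift")
```

A dedicated tool gives you all of this plus animation, dashboards, and optimization on top, but the experiment-first mindset is exactly the same.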
