The main reason for solving the wrong problem is confusion between symptom and disease. It’s often difficult to tell the difference, even for professionals like us. The key is a rigorous diagnosis. However, in the heat of battle, it’s almost impossible to make time for that. With our fires constantly burning, not-enough-time is situation normal and our diagnoses suffer.
On the surface, symptom and disease can be indistinguishable. But it’s worse than that. Because symptoms are superficial, they’re easier to solve; diseases are deep and fundamental. Given two candidate diagnoses, we’re drawn to the easier one: the misdiagnosed symptom.
When I worked at a multinational that made appliances and other products, we confused the symptom with the disease. As demand for our products increased, final test at a refrigerator assembly plant was regularly shut down. Refrigerators stacked up in front of a long downstream process, and final test ground to a halt.
To keep the line running, the manufacturing manager proposed enlarging the building. On the surface, if production volume is skyrocketing, a larger building may be the right answer. (Creativity before capital, of course.) But capital demanded a diagnosis, and the plant manager scratched at the disease.
It turned out refrigerators that failed final test were routed to rework (the long process). When space in front of rework filled up, final test had to stop. In the heat of the moment, a building addition made sense—a straightforward cure to a straightforward disease. But cooler heads prevailed, and we embarked on a long, courageous battle with low-yield disease.
Another important problem with our diagnoses is bias, or our preconceived notions. If you’re a pulmonary specialist, the diagnosis is lung disease. If you’re a statistical process control specialist, it’s standard deviation disease.
Once, when I worked at a startup developing a novel powdered-metal manufacturing process, we broke an intricate punch. (We did that a lot.) I diagnosed the disease and told my boss it was a stress problem: the tool steel could not handle the process. He said, “You’re a mechanical engineer with a materials background. How else would you see it?” He was right.
Measurement problems also compromise our diagnoses. We measure what’s easy, not what’s important. Due to workload, we don’t make time to step back and ask, “What’s important to measure?” Instead, we diagnose with the data we have, not the data we want—a recipe for inaccurate diagnoses.
Even when we have the right data, its quality can trip us up. Gauge R&R analyses have helped, but they’re not always used. Too often, we diagnose, fix the wrong problem, and only then perform gauge R&R and rediagnose. We’ve got to use gauge R&R thinking ritualistically and apply it more broadly.
At the start of our yield improvement journey at the refrigerator plant, the knee-jerk prescription was an injection of wider acceptance limits. Instead, I suggested we apply gauge R&R to the final test process and retest the failed refrigerators. This was unnatural thinking for the Black Belts at the plant, but they retested the failed units. Some refrigerators still failed, but some passed. That kicked off a project to improve the measurement capability of the final test process, which gave us the important, high-quality data we needed to battle low-yield disease.
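For readers who want to see the arithmetic behind a study like that, here is a minimal sketch, assuming a standard crossed gauge R&R design (several parts, several operators, repeated trials) and purely hypothetical data; the function name gauge_rr and the numbers are illustrative, not from the refrigerator project.

```python
# Minimal sketch of an ANOVA-based crossed gauge R&R study.
# Hypothetical data: p parts x o operators x r repeat measurements.
import numpy as np

def gauge_rr(x):
    """x has shape (parts, operators, replicates); returns gauge R&R as % of total variance."""
    p, o, r = x.shape
    grand = x.mean()
    part_means = x.mean(axis=(1, 2))
    op_means = x.mean(axis=(0, 2))
    cell_means = x.mean(axis=2)

    # Sums of squares for the crossed ANOVA model
    ss_total = ((x - grand) ** 2).sum()
    ss_part = o * r * ((part_means - grand) ** 2).sum()
    ss_op = p * r * ((op_means - grand) ** 2).sum()
    ss_po = r * ((cell_means
                  - part_means[:, None]
                  - op_means[None, :]
                  + grand) ** 2).sum()
    ss_err = ss_total - ss_part - ss_op - ss_po

    # Mean squares
    ms_part = ss_part / (p - 1)
    ms_op = ss_op / (o - 1)
    ms_po = ss_po / ((p - 1) * (o - 1))
    ms_err = ss_err / (p * o * (r - 1))

    # Variance components (negative estimates clipped to zero)
    var_repeatability = ms_err
    var_operator = max(0.0, (ms_op - ms_po) / (p * r))
    var_part_op = max(0.0, (ms_po - ms_err) / r)
    var_grr = var_repeatability + var_operator + var_part_op
    var_part = max(0.0, (ms_part - ms_po) / (o * r))

    return 100.0 * var_grr / (var_grr + var_part)

# Example with simulated measurements: 10 parts, 3 operators, 2 trials each
rng = np.random.default_rng(0)
true_parts = rng.normal(0, 2.0, size=(10, 1, 1))  # part-to-part variation
noise = rng.normal(0, 0.5, size=(10, 3, 2))       # measurement noise
print(f"Gauge R&R contribution: {gauge_rr(true_parts + noise):.1f}% of total variance")
```

A low gauge R&R contribution says the measurement system can be trusted; a high one says the gauge itself is part of the disease, which is what the retested refrigerators suggested at final test.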
Fast diagnoses are different than good ones. In the long run, good ones are actually faster than fast ones, because they solve the right problem.
Editor’s note: Mike Shipulski is a leading authority on lean manufacturing, product development, and design for manufacturing and assembly. His column will appear every other month, alternating with Austin Weber’s “On Campus.” E-mail comments to Mike at mike@shipulski.com or follow his blog at www.shipulski.com.