In a recent article I touched on the role “tribal knowledge” plays in the decision-making process in many organizations (Do You Think It Or Do You Know It).

Early in my career I had the opportunity to head up a product development group for a major chemical company. We were charged with commercializing what at the time was a radically new coating system into several industrial markets, including the automotive industry. As a result, my team and I spent a lot of time in Detroit, much of it inside the walls of manufacturing facilities, running plant trials and overseeing implementations.

One of these facilities, owned and operated by one of the “Big Three” domestic auto manufacturers, produced polyurethane fascias, or bumpers, for most of the company’s US assembly plants. The fascias were produced using a technique called reaction injection molding, or RIM, primed with a plastic-friendly primer, finished with a base/clear topcoat, and shipped to the destination assembly plants for incorporation onto new vehicles rolling off the assembly line. The company operated on a near-just-in-time basis, so if the fascias didn’t ship, the assembly lines were forced to shut down.

The managers and line workers in the facility, by and large, were auto industry veterans. Relying on decades of hands-on experience, they intuitively knew how to do their jobs. They did what they did the way they had done it for years, and took great pride in the fact that their “tried and true” experience-based approach worked. Every day. Sure, there were days when something didn’t work as planned, but they were able to deal with it. In some ways it was akin to riding a bicycle … they knew the job so well they could almost do it on autopilot. In the learning model made famous by Thomas Gordon (see Is Running Your Company Like Riding A Bicycle) they operated in an “unconscious competence” mode, sometimes dropping down a notch into “conscious competence” when a problem occurred, but most of the time doing what they did by rote.

Then one day, suddenly and without warning, the fascias coming off the topcoat line exhibited tiny fisheyes, imperfections that, unfortunately, did not show up until the last step in the production process. The fascia manufacturing process shut down. And it wouldn’t be long before the effect would ripple through the supply chain and some assembly lines would be forced to shut down as well. The company had redundant fascia manufacturing capability, but that backup capacity simply could not meet the demands of all of the assembly plants.

What was causing the fisheyes? Were there imperfections in the polyurethane substrate? Were there perhaps impurities in the primer or topcoat? Was a contaminant being introduced in one of the curing ovens? Was there some other root cause?

The tribal knowledge that enabled the managers and line workers to do their jobs so well, day in and day out, was woefully inadequate to solve the problem this time. Something had changed in an instant, and the experience that came from years of “we’ve always done it this way” didn’t help.

We were called in, along with experts from other companies, to solve the problem. Time was of the essence. Every day the problem went unresolved would cost the company millions of dollars. So we rolled up our sleeves and got to work. We reviewed existing data, collected new data, contextualized the data, looked for patterns, and performed root cause analysis, ruling out the “obvious suspects”. We endeavored to understand the fundamental principles underlying each step across the value chain. And in a surprisingly short time we found the answer. A microscopic pinhole had formed in one of the hydraulic lines in the automated paint system, spraying out an oil mist so fine that it was undetectable to the naked eye. The line was replaced, the fisheyes went away, and in due time quality fascias were being shipped again.

So what are the lessons learned from this experience? Unbeknownst to me at the time, we were, in short order, putting into practice what I now understand as the DIKW hierarchy.

Data – Information – Knowledge – Wisdom (or Understanding). While DIKW’s origins are unclear, an early proponent of the model (at least the DIK part) was Nicholas Henry, Professor Emeritus and Former President of Georgia Southern University.[1] While the model has been refined, reinterpreted and applied to various disciplines over the years, a succinct interpretation is as follows:

  • Data consists of facts, signals, stimuli … things you can observe or measure … contextually and informationally neutral.
  • Information is data in context … connecting observations, identifying relationships … contextualizing to derive purpose, utility and meaning.
  • Knowledge is the synthesis of multiple sources of information … identifying patterns … deriving insights.
  • Wisdom is developing an understanding of the principles underlying data, information and knowledge … the “know-why” (why something is) and the “know-what” (knowing what to do in a given situation, whether anticipated or not, because of gained insights) … it is contextually transferable in that it affords the ability to apply knowledge to new situations.

Being true to DIKW at its best requires a relentless commitment to identifying biases at each step of the way (e.g., selection bias in collecting data, confirmation bias in interpreting information) and working to eliminate them. The process is also cyclical and continuous.
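To make the four layers concrete, here is a minimal sketch in Python that walks a version of the fisheye story up the hierarchy. The hourly counts and the logged event are hypothetical, invented purely for illustration; they are not actual plant data.

    # Data: raw, context-free observations (fisheyes counted per fascia, by hour).
    defect_counts = [0, 0, 14, 17, 16]
    line_events = ["", "", "hydraulic pressure fluctuation logged", "", ""]

    # Information: data in context -- each count paired with its hour and any
    # recorded line event.
    information = [
        {"hour": hour, "fisheyes": count, "event": event}
        for hour, (count, event) in enumerate(zip(defect_counts, line_events))
    ]

    # Knowledge: synthesizing sources to surface a pattern -- when did the
    # defects begin, and what coincided with the onset?
    onset = next(record for record in information if record["fisheyes"] > 0)
    print(f"Defects began at hour {onset['hour']}; coinciding event: {onset['event']!r}")

    # Wisdom: the transferable "know-why" -- airborne oil contamination causes
    # fisheyes in coatings, so any oil source near a paint line is a candidate
    # root cause, in this plant or any other.

The point of the sketch is the progression, not the code: each layer adds context, synthesis, or principle to the one below it.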

So what does all of this have to do with tribal knowledge? Tribal knowledge is experiential, often anecdotal, usually unwritten and derived from many years of hands-on practice. It is often passed down from “generation to generation” in an organization.

On the one hand, tribal knowledge can be a valuable asset to an organization. It provides a “shorthand” mechanism for getting things done effectively, leveraging years of real-world, hands-on experience, the things you “can’t read in a book”. It can help foster emotional engagement in the work at hand. And it provides a powerful, peer-to-peer knowledge transfer vehicle.

However, tribal knowledge alone, when not combined with a rigorous DIKW discipline, can be dangerous and in some cases fatal to organizations, especially in the ever-changing business environment in which we live. It is inherently bias-prone: it relies on highly situational, limited experiential data, is subject to confirmation bias, and struggles to convert knowledge into true wisdom and understanding. In effect the DIKW cycle is broken, and what should be cyclical and continuous becomes static and unchanging, manifested in a “we’ve always done it that way” mindset. It creates insularity, so organizations don’t look beyond themselves for insights and learning. More often than not, the focus is on the situational consequences of actions without a fundamental understanding of root causes, the “know-why” part of the equation. If you only know what an action produces in the situations you have already seen, you lack the understanding to respond effectively when circumstances change. The result is often bad decisions or an inability to deal with unforeseen problems.

Which takes me back to the auto plant example. Not only did tribal knowledge hinder the ability to solve an unforeseen, previously unexperienced problem on the production floor, it also fostered a mindset higher up in the management ranks, one that US automotive manufacturers spent years working to overcome. I distinctly remember a “we’ve always done it that way” attitude in the face of the impending threat from Japanese auto manufacturers. This attitude, based largely on tribal knowledge and untested assumptions, resulted in a disinclination to acknowledge the threat and to change in response to it. After working primarily with two of the “Big Three” US companies, I visited the Honda assembly plant in Marysville, Ohio for the first time and knew at that point that a major transformation was in the works, one that would necessitate dispensing with tribal knowledge and untested assumptions if the industry were to survive.

Mike Cobb

[1] Henry, Nicholas L. (May–June 1974). “Knowledge Management: A New Concern for Public Administration”. Public Administration Review. 34 (3): 189.