Deming’s Journey to Profound Knowledge: How Deming Helped Win a War, Altered the Face of Industry, and Holds the Key to Our Future

Preface
I pulled on a thread and found a fascinating tapestry.
My professional career started in 1980, just as New York was coming out of one of the worst financial times since the Great Depression. The joke was you couldn’t get a job with IBM, J.P. Morgan, or Grumman without inheriting it. So, at just nineteen years old, I headed to Texas to get in on the oil boom. I had only a duffle bag and my incredibly efficient and reliable 1975 Toyota Corolla. During my first week in Texas, I found a job with Exxon Corporation as a computer programmer in exploration and research. The eighties were a fascinating time to work at Exxon, which had a rich culture of leadership and best practices. Although I couldn’t have known it at the time, Exxon’s leadership was my first introduction to Dr. Deming’s principles. Working with some of the world’s top geophysicists, I was indoctrinated in the principles of systems thinking and the scientific method. These principles would shape not only the successes of my life but those of some of the greatest organizations in the world.

A decade later, I went to work at GE. As I earned my Six Sigma Green Belt, I had no idea that what I was doing came directly from Dr. Deming’s teachings. GE had its own analytical statistics department. It seemed like my entire job revolved around control charts, a Deming hallmark. The core lessons I learned around cooperation, experimentation, and systems thinking—all rooted in Deming’s teachings—deeply resonated with me as I continued my career path.

While I had unknowingly learned much of his teachings, my knowledge of Dr. W. Edwards Deming didn’t begin until the 2000s. I had started working with best-selling author and award-winning CTO Gene Kim in 2009 on The DevOps Handbook, along with coauthors Jez Humble and Patrick Debois. Before joining the project, Gene had asked me to read The Goal by supply chain management guru Eliyahu Goldratt. After absorbing it, I quickly read his other books: The Theory of Constraints, Critical Chain, It’s Not Luck, and Necessary but Not Sufficient. Let’s just say that after reading his books I was all in on Goldratt.
At a DevOps Days conference in 2011, my friend and mentor Ben Rockwood, a pioneer in internet engineering, was running an open discussion on Goldratt. During the discussion, Ben intimated that Goldratt was heavily influenced by someone called William Edwards Deming. I didn’t know who the guy was, and I wasn’t looking forward to learning about someone who might shake my faith in Goldratt. But true to his nature, Ben challenged me to at least read Deming’s 14 Points for Management. When I did, I was floored. I realized that almost everything Deming was saying was the foundation for the three major software movements I’d experienced in my life: Lean software development, Agile development, and DevOps.
What amazed me even more was the fact that Deming had written his 14 Points in the 1980s, years before these software movements occurred. Over the next few years, I came to be heavily influenced by the “Prophet of Quality,” as he’s often known. The more I learned about him, the more I wanted to know. It seemed like every little thread I pulled revealed more and more of just how fascinatingly complex the man’s life and thinking were. During the course of coauthoring Beyond The Phoenix Project with Gene Kim in 2017, I stepped up my research on Deming. I wanted to truly understand how he’d come to the epiphanies that seemed to predict organizational success or failure in nearly any organization or system. What events littered along his life’s path helped him discover the universal System of Profound Knowledge? I felt that to understand Deming’s philosophy, it was critical to understand the roots and catalysts of his ideas. I’ve spent over a decade learning about Deming’s life and teachings, and I’ve become something of an expert in the process. To this day, I still find myself peeling back the layers of Deming’s onion as I learn more about those who influenced him, such as the scientists and philosophers C. I. Lewis, Percy Williams Bridgman, and Bertrand Russell.
Unfortunately, of the more than two dozen books about Deming I have read, none chronicle how specific events and inspirations in his life directly connect with the four elements of Profound Knowledge. They were either biographies or explanations of how to apply his principles. None told the journey of how his ideas were developed. I decided that was the book I needed to write: a book connecting the unique moments in Deming’s life that culminated in his grand unifying theory of management, the System of Profound Knowledge, which remains a predictor of success or failure in every organization today.
During the COVID-19 pandemic, I finally found the opportunity to sit down and write. Before the pandemic, I typically traveled about two hundred thousand miles a year. But with lockdown, I suddenly had an extra fifty hours a month of prime productivity time. They say if you really want to know a subject, write a book about it. That’s certainly been true for me. I only thought I knew about Deming before. But pulling on the multiple threads of his life has given me profound respect for his thinking, accomplishments, and influence. He is like a cross between Albert Einstein and Forrest Gump: seemingly always in the right place at the right time but brilliant enough to take what he sees and experiences and use it to change the world around him. What’s more, the stories about the lives of those surrounding him were wonderfully entertaining and insightful. I wanted to write a book that captured the full picture of his life and his influence, a systems-thinking portrait instead of a book hyperfocused on a singular piece of the whole. After all, systems thinking is one of the four elements of Profound Knowledge (as you’ll learn about later).

One of my favorite authors is Michael Lewis. When reading Moneyball, for example, you think you’re reading a book about baseball statistics, but by the time you finish, you find that you’ve read a biography of Billy Beane. Similarly, while this book may look like a biography of Deming, it’s the story behind the story of his masterwork, which he shared with the world when he was ninety-three years old. Imagine publishing your magnum opus at that age, just before your death. That gives you a clue as to the kind of man you’re dealing with.
In Bill Bryson’s A Walk in the Woods, you read not only the chronicles of his hike through the Appalachian Trail but also stories of history, scandals, federal agencies, and the tire warehouse that’s been burning for decades. Similarly, this book tells the untold stories of those in Deming’s life, from a survivor of Japanese oppression who was the catalyst for Deming’s coming to Japan to Doris Quinn heading quality education at MD Anderson Cancer Center and helping Deming with his theory of psychology. These untold stories provide additional insight into Deming’s discovery of Profound Knowledge. While Deming’s influence is far and wide, it is most directly visible in four major nationwide efforts: the Aberdeen Proving Grounds (trying to out-manufacture the Axis powers during World War II), the Japanese Economic Miracle (their economic recovery after World War II), the American quality revolution of the 1980s, and, most recently, in the race to develop and distribute vaccines for COVID-19. As we look to what’s next, you will find we need Deming’s System of Profound Knowledge to face one of the biggest threats to the world today: cyberterrorism. The last four chapters of this book deal with understanding the severity of the cyber crisis and how Deming can save us yet again. I’ve enjoyed the journey of bringing this book to you, and I hope you enjoy this labor of love.
Introduction: What Ed Said
The black-and-white clip playing across the tiny screen shows the aftermath of a bitter conflict. A literal war zone. A city of millions reduced to rubble and ashes. Half its population lost. The caption reads: “TOKYO, 1945.”
In the summer of 1980, when that clip aired on TV, Kevin Cahill was a twenty-year-old boy living with his grandparents in Washington, DC. He’d come back to DC to work before he began his sophomore year at UCLA. Kevin had been perplexed by a phone call from his mother weeks earlier.
She could barely contain her excitement as she proudly told him that his grandfather was to appear in a prime-time NBC News special. Kevin’s grandfather had always shunned the spotlight, so she extracted a promise from Kevin that he would make his grandfather watch it. But why? Why would millions of people be interested in my quiet, gentle, hard-working grandfather? he wondered.
When he asked his grandfather directly, all Kevin got were polite deflections and a quick change of subject. The man was generally quiet and reserved but not downright secretive.
The special episode’s name didn’t help explain his grandfather’s involvement: “If Japan Can . . . Why Can’t We?” On the other hand, anyone who heard the episode’s name knew exactly what it was about: the Japanese takeover of American industry.
Whereas the rest of the industrialized nations of the world lay in ruins after World War II, the US was left virtually untouched. As the only game in town, US industry reigned supreme. Factories couldn’t churn out cars, radios, and other manufactured goods fast enough. Quality wasn’t a concern. The only real challenge was keeping up with global demand. America entered what is commonly referred to as the “Golden Age of Capitalism.” From 1948 to about 1970, the nation ruled supreme. Its economy, manufacturing sectors, military, and ability to shape history and world politics were second to none. It was a heady time to call yourself an American.
The seventies knocked the US off its pedestal. The USSR dominated the 1972 Munich Olympics, topping the US in the medal count. Despite Nixon proclaiming Vietnam a success, the prolonged war and withdrawal demoralized the military and America itself. A few months later, the 1973 oil crisis, engineered by a handful of developing countries, brought the most powerful nation on earth to a halt. And finally, the Iranian hostage crisis embarrassed the US on a global stage. Nationalistic feelings of pride, superiority, and modern manifest destiny had given way to uncertainty, anger, and fear. And Japan . . .
Outside observers called it the Japanese economic miracle, and for good reason. Upon its surrender to the Allied Forces in 1945, Japan was a ghost of its former self, its people on the brink of starvation. A significant portion of its industrial capacity had been wiped out. Not only had the entire country been bombed to nearly nothing, but it didn’t have the means to rebuild. What meager production it could muster was of such low quality that “Made in Japan” became a joke the world over.
After the war, the US stayed in Japan to oversee the dismantling of the Japanese military. From 1945 to 1952 the US’s mission in Japan was simply to help it survive. It wasn’t until US policy shifted in 1947 (known as the Reverse Course) that Japan began to rebuild itself. The US brought in several experts to advise the new Japanese government and what little remained of its industry. By 1968, just twenty-three years after the country had been decimated, no one laughed when the island nation surpassed West Germany to become the largest economy in the world after the US, a position it would hold for over forty years.
By the seventies, the phrase “Made in Japan” conjured images of advanced technologies, the best electronics, the most reliable appliances, and the highest-quality cars. The oil crisis spurred many Americans to buy foreign cars over domestic. With better gas mileage, greater dependability, and superior engineering and craftsmanship, a Toyota topped a Ford, GM, or Chrysler in every way. The land of Henry Ford and John D. Rockefeller was no longer the manufacturing capital of the world.
Americans could tolerate losing the battle for electronics and just about everything else, but Americans have a special relationship with cars. You can almost hear the average Joe muttering, “Well, at least we still have our cars.”
Losing dominance in the manufacturing of cars, it seems, was the final straw (or the wake-up call, depending on how you looked at it). That’s when the everyday, red-blooded American realized the country was in trouble.
How had this happened? How had “Made in Japan” gone from joke to juggernaut? How had the once vanquished country come to usurp its conqueror?
It was astonishing. It was impossible. It was nothing short of a miracle. From business leaders to politicians to factory floor hands, everyone shared the same bewilderment. They began to ask the question: If Japan can do it, why can’t the US? And for Kevin, the question was, What did my grandfather have to do with any of this?
On the day “If Japan Can . . . Why Can’t We?” was to air, Kevin dutifully went to the cramped basement of his grandparents’ little brownstone. There sat his grandfather, almost eighty years old, at his desk working with the vigor and determination of someone a fraction of his age. Being the voracious reader and lifelong learner he was, it’s possible Kevin’s grandfather had that day’s edition of The Washington Post with the op-ed that read:

Have you looked at the economic news lately and wondered who really did win World War II? Somebody at NBC News evidently did, and came up with “If Japan Can . . . Why Can’t We?”—an “NBC White Paper” on Japan’s burgeoning productivity and our lagging one—to be aired tonight at 9:30 on Channel 4. It is a thoughtful, often depressing and sometimes fascinating examination of what makes and maintains a work ethic, and why we may end up freezing to death in the dark but the Japanese won’t.

Kevin and his grandfather climbed the narrow, rickety staircase to join his grandmother and great-aunt around the tiny TV. The program began with the aforementioned black-and-white clip of the ruins of Tokyo in 1945, followed by another black-and-white clip from the formal surrender of Japan. Next, the screen showed an industrial smelter pouring liquid metal with the caption “TOKYO, 1980.” Then the images on the screen flipped in rapid succession, showing scenes of busy factories and electronics labs, automated robots and cars rolling off assembly lines—the very images that might spring to mind whenever anyone mentioned Japan in the 1980s. Then came the overlay of the episode’s title: “IF JAPAN CAN . . . WHY CAN’T WE?”
Suddenly, Kevin’s grandfather, Dr. William Edwards Deming, appeared on the screen. In his quiet, measured tone, he asked, “What can we do to work smarter—not harder?”
While the other people watching nearly burst with pride, Kevin’s grandfather seemed embarrassed. He made as if to go back to his basement office to keep working, but his family cajoled him into watching the rest. Nearly halfway through the program, there had been no further mention of Deming. It was “an uncomfortable thirty minutes,” as Kevin would later note. Then came a Japanese manager giving a speech . . .
Productivity gains were taught to us by Americans. We are very fortunate to have America as a good teacher and we always try to be a very good student, and that’s what made it possible for us to be somewhat competitive in an international market with US industries.

At the words “somewhat competitive,” the audience began to laugh. The speaker—Joji Arai, manager of the Japan Productivity Center—was being modest and humble. At this time, Japanese manufacturers outclassed their US counterparts to the point it was laughable. Literally.
The next cutaway changed everything.
One second, laughter at Mr. Arai’s understatement of the century. The next, a slow, solitary voice that everyone around the TV knew well: “The first time that I went there to teach industry, I taught four hundred and fifty engineers in several cities. Tokyo, Nagoya, and Fukuoka . . .”
As the screen showed clips of the family’s beloved, gentle giant smiling and shaking hands with Japanese executives, the voiceover of narrator-reporter Lloyd Dobyns explained:
W. Edwards Deming first went to Japan in 1950 to teach industrial productivity through statistical analysis. He was so successful that Japan’s annual award for productivity is called “the Deming Prize.” It is one of the most coveted awards in Japan and the medal that goes with the award is a profile of Dr. Deming—an American.
. . . We have said several times that much of what the Japanese are doing is what we taught them to do. And the man who did most of the teaching is W. Edwards Deming.

Some say this NBC special was the beginning of the quality revolution in America. At the very least, it brought the topic from the fringes to the mainstream. In just seventy-five minutes, it upended how the US and the world saw business and industry, sparking a wholesale adoption of Japanese methods and management. It dispelled many of the myths and misunderstandings surrounding the Japanese economic miracle and revealed one of its miracle makers.
No sooner had the documentary concluded than the telephone began to ring. For Kevin’s grandfather, life was never quite the same after that.
For the next thirteen years, Ed (as he was called by those close to him) traveled from coast to coast, delivering lectures on productivity and management. Ford, GM, Xerox, Procter & Gamble, AT&T, The New York Times—it seemed everyone wanted to sit at the feet of “the master.” But history has a funny way of repeating itself. The lectures he gave were almost the same ones he’d delivered thirty years prior in Japan in the 1950s . . . and in the US back in the 1940s.
Yes, Ed had been down this road forty years ago, right after the US joined the fight against the Axis Powers in World War II. At that time, the country had to ramp up its industrial production of everything. From battleships and bombers to boots and bandages, the Allied Powers needed as much as possible as fast as possible, and there could be no compromise on quality. A defective washing machine meant drying the clothes on the line; a jammed gun might mean death.
The Allies didn’t win because of D-day or the atomic bomb. The Axis powers didn’t lose because of a misstep or overreaching. Victory came because the US outproduced the rest of the world. They achieved this despite the absence of millions of skilled American workers and experienced managers, who were on the front lines. It’s no stretch to say that the Allies won because of the quality produced by Rosie the Riveter. Rosie out-manufactured her male predecessors. And she did this using something called statistical process control (SPC).
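What does SPC actually involve? At its heart sits the control chart. The sketch below, written in Python, is my own minimal illustration with invented rivet-diameter measurements, not an example taken from Deming’s wartime courses: you establish a center line and three-sigma control limits from output measured while the process is believed stable, then flag any later measurement falling outside those limits as a possible “special cause” worth investigating.

import statistics

# A minimal SPC sketch with invented measurements (hypothetical rivet
# diameters in centimeters). The baseline comes from a period when the
# process is believed to be stable.
baseline = [5.02, 4.98, 5.01, 4.97, 5.03, 5.00, 4.99, 5.02, 4.96, 5.01]
center = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # classic 3-sigma limits

# Monitor new production against the limits.
for i, x in enumerate([5.01, 4.99, 5.12, 5.00], start=1):
    verdict = "in control" if lcl <= x <= ucl else "OUT OF CONTROL: find the special cause"
    print(f"rivet {i}: {x:.2f} cm -> {verdict}")

The chart’s point is Deming’s point: variation inside the limits is the system’s own noise, and tampering with it only makes things worse; only the out-of-limits signals call for hunting down a cause.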
Starting at Stanford University during the war, Deming trained over two thousand people in statistical process control methods. They, in turn, taught thirty thousand additional trainers. These thousands upon thousands of statistical process control evangelists went forth and spread the gospel, as it were, to Rosie’s supervisors and Rosie herself. The Allies won because of Rosie, and Rosie’s stunning success had Deming’s fingerprints all over it. Then GI Joe came home and once again donned his business suit or factory coveralls. He took one look at how the war was won and said, “Forget all that—we’re going back to the way we’ve always done it.” It took over thirty years to realize the mistake in throwing out Deming’s teachings. In the NBC special, Lloyd Dobyns said of Deming, “In his own country he is not widely recognized.”
After the war, Deming had been shunned by his own countrymen. He looked elsewhere for eager students . . . and there was no one more eager than the Japanese.
So what exactly had Deming done in Japan? How did he bring about the economic miracle that had the US on its knees? He shared a collection of fundamental truths that show how any system or process can be transformed into something greater, what he would later call the System of Profound Knowledge.
Ed began his journey as a mathematical physicist right at the time Einstein’s and others’ theories about the nature of the universe were coming into vogue. This gave Ed an appreciation for the complexity of reality and was a clue that eventually put him onto one piece of Profound Knowledge, an appreciation for systems.
In his thirties, Ed found a mentor, Dr. Walter Shewhart, who introduced him to pragmatism and a theory of knowledge. Essentially, this school of thought approached the world via the scientific method, constantly testing ideas and reevaluating hypotheses. Shewhart also grounded Deming in a theory of variation. In his work with physics, Deming already knew that the very nature of reality is random. From Shewhart, Ed solidified his thinking around variation, seeing randomness as inherent to any system or process, from stuffing envelopes to predicting radioactive isotope decay. Variability is a fact of life.
After World War II ended, Ed traveled to Japan to help with nationwide rebuilding efforts. By this time, he was well grounded in three pieces of Profound Knowledge: knowledge, variation, and systems thinking. But it was in Japan that he gained an appreciation for the final cornerstone of Profound Knowledge: a theory of psychology. In the Japanese, Ed found a culture of inherent respect between manager and employee. In truth, Japan influenced Ed as much as Ed influenced Japan.
For instance, Toyota’s world-class approach to business—called the Toyota Way—is a beautiful fusion of Eastern and Western ideas, bringing together and bringing out the best in both. By this time, Japan was an economic juggernaut, and American businesses were eager to learn from their Eastern counterparts. In his eighties, Ed was finally getting his due. Just after he passed away in 1993, Deming’s book The New Economics was published. In it, he presented his masterwork, the culmination of his life’s experiences. He brought together all four pieces of Profound Knowledge and named it the System of Profound Knowledge (SoPK).
Deming’s System of Profound Knowledge encompasses four elements and includes fourteen points of management and seven deadly diseases of management.
These four elements of Profound Knowledge are:
- A Theory of Knowledge: How do we know what we believe we know?
- A Theory of Variation: How do we analyze and understand what we know?
- A Theory of Psychology: How do we account for human behavior?
- An Appreciation of Systems/Systems Thinking: Are we seeing the bigger picture?
Armed with this lens—these four ways of seeing the world—any person or entity can achieve transformational change in any system or process. In other words, this lens is a proven way to make the world a better place. And as Deming said, these four elements are not something he made up. Rather, they are fundamental truths that he discovered along his life’s path, just like the Theory of Gravity or the Theory of Relativity.
From the balance of power in the US Capitol to NASCAR racing and globalization, the ripples of his work seem almost endless. While his story is fascinating by itself, this book isn’t strictly about his life. Rather, it’s the story of the gift he gave the world: a way of thinking that can be applied to any facet of life or work. When Ed worked with Ford Motor Company, he didn’t set out to fix specific problems, though he often did so in the course of pursuing his true aim: to embed the System of Profound Knowledge in the minds of everyone who worked there. Ed’s mission was to work himself out of a job. He wanted to equip the people inside the company with the tools they needed to profoundly change the way Ford worked.
When he stood before the collective remaining industrial base of Japan in 1950, he didn’t try to fix individual companies’ problems. He taught them principles and gave them a different way of thinking about the work they did each day. He didn’t want them to change their practices so much as he wanted to change their mindsets. It was the same with American manufacturers during World War II. It wasn’t enough that everyone pulled together to create as many war supplies as possible. The workforce and entire business had significantly changed, requiring a massive shift in how they operated day to day. The same thing happened forty years later: the American economy had profoundly changed, requiring a change in how organizations operated.
Deming’s System of Profound Knowledge is about learning how to bring about profound change on your own. That’s why, even three decades after his death, we’re still using his teachings as we head into the unknowns of the future.
I’m a software developer: I can tell you horror stories about cyberterrorists in the Digital Wild West. We’ve never before faced what we are facing today, and we need help figuring out how to deal with it. I and millions of others use Ed’s methods to arrive at profound insights we otherwise would have never found on our own.
This book chronicles not only the arc of Ed’s life but that of his thinking as well. The roots of the System of Profound Knowledge began even before he was born and reached a beautiful culmination right at the time he went to college. Had he been born a few years earlier, I’m not sure he would have been as exposed to a new kind of thinking (quantum physics) as he was.
Had he not been raised in a hardscrabble life and interned at the cutting-edge social experiment that was Hawthorne Works, I don’t know that his system would have been as humane and human centered as it came to be.
Had he not taken a job as a mathematical physicist, he might not have had the opportunity to learn from the world’s foremost expert on variation and how it shows up in absolutely every facet of existence.
Had he not been an expert in statistical surveys, he wouldn’t have had the opportunity to travel to Japan, and especially not at the crucial moment of a devastated and demoralized country trying to rebuild its economy, looking for hope and inspiration.
This book is truly about how the lens of Profound Knowledge was found.
It just so happens that its discoverer was a man called Ed.
Part 1 - Chapter 1: Humble Origins and Non-Determinism
Deming’s one childhood claim to fame was when Buffalo Bill recognized him in the crowd during a performance of “Buffalo Bill’s Wild West Show” outside Los Angeles where “Edwards,” as his family called him, was visiting his cousins.
The notoriously flamboyant showman knew the boy from Cody, Wyoming, by sight if not by name. Buffalo Bill was arguably the most world-famous living American at the time, having extensively toured the US and then Europe, performing before Queen Victoria herself twice. His act made him not only famous but rich. But Buffalo Bill wanted to be taken seriously as a legitimate businessman instead of a circus act. He used his money and influence on the largest undertaking of its kind ever attempted in the West: using irrigation to build an agricultural empire stretching along the Shoshone River. He began by incorporating a small town in 1896 about fifty miles east of Yellowstone National Park, which Colonel William Frederick “Buffalo Bill” Cody humbly named after himself. Life in and around Cody, Wyoming, revolved around Bill, and Bill was usually in and around “the sweetest hotel that ever was.”
Named after his daughter, the Irma Hotel not only housed travelers but served as Bill’s headquarters, comprising two personal suites and his professional offices. His hotel was, in effect, the heart of his endeavor. When he built it, he envisaged something akin to an African city serving as a staging point for safari expeditions, perhaps like Zanzibar or Mombasa. He foresaw hosting European nobility hunting big game, East Coast financiers looking for potential investments, and opportunists from everywhere investigating the mining and ranching possibilities. The bread and butter of the Irma, however, would be the promised flood of newcomers flocking to settle the soon-to-be-verdant plains around Cody.
Shortly after the Irma opened in 1902, the Demings arrived from Sioux City, Iowa. The thirty-three-year-old father, William, had been trained as a law clerk but now sought to make his fortune on the frontier. He arranged for temporary employment with an attorney in Cody, then moved his wife and two toddling boys from the breadbasket of America to the barren badlands of Wyoming. The town was still in its infancy. There wasn’t enough legal work to keep the young father employed full time, so he found a job at the Irma as a sort of jack-of-all-trades. In addition to his wages, the hotel provided him and his family with a small house on the grounds. Edwards and his brother became regular fixtures at the Irma. Thus, Buffalo Bill recognized Edwards and his little brother at the LA performance.
Unfortunately for Buffalo Bill, the Cody-based irrigation empire failed. But in 1905, the federal government began a massive public works project via the US Reclamation Service aimed at irrigating ninety thousand acres of the semi-arid Bighorn Basin around the settlement of Powell, twenty-five miles northeast of Cody, and turning it into fertile farms. This necessitated the construction of the Shoshone River Dam. Once completed, the concrete-arch gravity dam—itself a predecessor to the Hoover Dam—was the tallest dam in the world.
The area around Powell was opened to homesteaders, and, in 1906, Mr. William Deming applied for and received forty acres of farmland on the edge of town. Or at least what everyone hoped would be farmland one day. In the meantime, the Demings eked out what living they could out where the Great Plains meet the Rocky Mountains.
Decades later, Ed recalled his family’s hardscrabble life. “Our house in Powell, roughly 1908 to 1912, was a tarpaper shack about the size of a freight car. . . . Electricity and indoor plumbing were out of the question. Snow blew in through the cracks in the door and in the windows.” He recollected owning a cat at the time that slept with him and his brother, keeping them warm at night.
The Shoshone River Dam (also known as Buffalo Bill Dam) was completed in 1910, but with or without irrigation, Deming’s father was never a successful farmer. He once remarked, “A farmer makes his money on the farm and spends it in town. An agriculturalist makes his money in town and spends it on the farm . . . . I’m an agriculturalist.” The eggs, milk, and vegetables that kept the family alive came from their chickens, cow, and garden. Despite their efforts, it wasn’t always enough. Late in his life, Dr. Deming would recall his childhood. “I remember my mother, taking my brother and me by the hand, prayed for food.”
Elizabeth, his younger sister and the first baby born in Powell, later noted, “We didn’t have much, but nobody had anything.” To make ends meet, William continued to do some legal work in the area while Mrs. Pluma Deming, née Edwards, taught piano and voice lessons on her Steinway parlor grand piano. William later began traveling, selling real estate and insurance. Over time, his business grew enough that the Demings were able to move out of the little tar shack on the prairie and into a slightly better home.
Though poor, his parents were well educated and poured their knowledge (and, perhaps, thirst for more) into their children.

Edwards was raised in an atmosphere that included both left-brain and right-brain learning—his mother provided the right-brain perspective—the synthetic and creative aspects of learning . . . through music, while his father tended more toward the left-brain (cognitive) perspective. The atmosphere created by Deming’s parents served as the basis for his intellectual achievements and quite likely spurred the qualities which contributed to his success—an intense work ethic, devotion to spouse and family, a love of music.

This environment set the course for Deming’s life. His hard-knocks upbringing gave him a unique perspective on the on-the-ground reality of the working class that managers—especially those from more privileged backgrounds—couldn’t appreciate. And the atmosphere his parents fostered helped him become a philomath: a lifelong learner and boundary spanner, a master of engineering and statistics as well as a musician, composer, and linguist. This all resulted in a young man who was serious, studious, and diligent. The family even came to (prophetically) nickname him “the Professor.”
Newton’s Apple and Schrödinger’s Cat Set the Stage

The year before the Demings moved from the Irma to the tar shack in Powell, a theoretical physicist published four groundbreaking papers that challenged the laws of physics (and would be instrumental in framing much of the thinking in Deming’s Profound Knowledge). One of the papers held the seeds of what would become the world’s most famous formula, E=mc².
Before Einstein, everyone relied on Sir Isaac Newton’s explanations of how the physical world works (e.g., for every action there’s an equal and opposite reaction, objects in motion tend to stay in motion unless acted upon by an outside force, etc.). Einstein showed the world that physics is more like the Korean DMZ: Newton’s laws applied only in certain jurisdictions. Past that border was a whole other world. That border was the atom, the basic building block of all matter. Newtonian physics governed the apple: If you drop it, it will fall to the ground. However, once you get to the quantum level, everything goes squirrely. You can’t be certain what subatomic particles will do. For example, from the subatomic perspective, if you drop an apple, it may or may not hit the ground. It’s enough to make your head hurt.
Others quickly built on Einstein’s work throughout the 1920s, including Niels Bohr (who later worked on the Manhattan Project) and Erwin Schrödinger. But these physicists’ discoveries were just pieces of a much bigger shift: the rise of non-determinism.
Before Einstein published his famous E=mc² and the work on the photoelectric effect that would later earn him the Nobel Prize, the world was seen through the lens of determinism (Newtonian physics): “If I drop this apple, it will fall.” In simpler terms, the world operates solely on cause and effect. Take the weather, for instance. Decades ago, meteorologists believed that if they knew all the variables, such as humidity, wind direction, barometric pressure, etc., they could predict the weather with 100% accuracy. In fact, some of the earliest computers were created expressly for calculating all these variables. But even with the advances in technology we have today, meteorologists still aren’t spot-on all the time. Even if we could calculate every variable, there’s still randomness: non-determinism.
Non-determinism also has roots in Charles Darwin’s Theory of Evolution: If you cross a black cow with a black cow, the offspring will probably be another black cow . . . but a gene might mutate and result in a two-headed white calf. You just can’t ever be certain. This is what physicist Max Planck (the father of quantum mechanics), Einstein, and others observed: No matter how much you know, there is an infinite amount of chance and randomness in the universe. Therefore, there can be no such thing as absolute certainty; the world is constantly in flux. This academic environment prepared Deming to pursue probability and statistics, a cornerstone of his Theory of Variation.
Niels Bohr and Werner Heisenberg took this idea of infinite variability to its extreme with the Copenhagen interpretation, which states that a quantum particle does not exist in one state or another, but in all of its possible states at the same time.
Erwin Schrödinger gave us an easy way to understand how those two physicists saw the way the world works (the difference between deterministic thinking and non-deterministic thinking). Say you put a cat in a sealed box. Inside, there are two items. One is a can of poisonous gas. The other is a radioactive isotope giving off gamma rays. When the isotope decays and releases gamma rays, it triggers the poison gas. The cat dies. The catch is you can’t predict when the isotope will decay.
Radioactive decay is random: although each isotope has a characteristic half-life, there is no way to predict when any individual atom will decay. It could happen in a minute or in a thousand years. Therefore, you’ll never know when the isotope in the box will decay, triggering the poisonous gas. According to Bohr and Heisenberg, since you can’t predict when the element will decay, you can never be sure at any given moment whether the cat is dead or alive. Until you open the box to see for yourself, you have to simultaneously assume that the cat is alive and that it’s dead. Schrödinger’s thought experiment was meant to show the absurdity of those physicists’ extreme and extremely theoretical view.
While this cat-in-the-box concept is funny, it illustrates how these two schools of thought differed. Determinism saw the world in black and white, cause and effect. With enough information, you could control any situation.
Non-determinism sees the world in shades of gray. Everything has an element of randomness. Much of how the world works is unknowable. Mathematical formulas don’t always hold true; we can’t accurately predict the future. We can only speak in probabilities: “The apple will more than likely hit the ground, but we can’t say that with 100% certainty.”
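To put that in concrete, modern terms, here is a minimal Python sketch of Schrödinger’s setup, assuming a hypothetical isotope with a one-hour half-life (a number invented purely for illustration). No run of it can say when the atom decays; it can only estimate probabilities.

import math
import random

HALF_LIFE_HOURS = 1.0  # invented for illustration
DECAY_RATE = math.log(2) / HALF_LIFE_HOURS  # lambda in the exponential decay model

def sample_decay_time():
    # Draw one random decay time (in hours) from the exponential distribution.
    return random.expovariate(DECAY_RATE)

def probability_cat_alive(t_hours, trials=100_000):
    # Monte Carlo estimate of P(no decay yet at time t).
    survived = sum(1 for _ in range(trials) if sample_decay_time() > t_hours)
    return survived / trials

for t in (0.5, 1.0, 2.0):
    print(f"After {t} hours, P(cat alive) is roughly {probability_cat_alive(t):.3f}")

The exact survival probability here is 2 to the power of -t, yet nothing in the simulation (or in nature) can tell us what this particular atom will do.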
This idea of non-determinism—that reality is inherently random—would form the basis of Deming’s worldview as he began his academic career. It taught him to see the world as a series of interconnected systems and sparked his questioning of knowledge itself, leading to the first element of the System of Profound Knowledge: How do we know what we know?
Let’s look at a real-life example of non-determinism.
Post–World War II, the island of Borneo in Southeast Asia had a serious malaria problem. In 1952, the World Health Organization (WHO) of the newly formed United Nations sent antimalarial experts to address the situation. The primary carrier of malaria is the mosquito. Over the next three years, the WHO sprayed the chemical pesticide DDT on interior surfaces in the village longhouses, each of which housed about a hundred families. After malaria cases sharply declined, the WHO declared the mission accomplished and proceeded to host a world assembly in Mexico City to extol the virtues of DDT.
Five years after the conference, Borneo started raining cats. Literally.
And not just any cats. These were special cats: twenty-three rat catchers that floated down in their very own little cat parachutes from a British Royal Air Force transport plane.
The cats’ mission: to replenish the island’s feline population. What happened to the native cats? As it turns out, DDT had killed more than mosquitoes. Later autopsies revealed that the WHO’s practices had resulted in lethal amounts of DDT accumulating in cats. Without their natural predator, the rat population exploded. Rats don’t just eat crops; they carry diseases. In Borneo’s case, typhus and sylvatic plague (caused by the same bacterium as the bubonic plague of Black Death fame). Nature could reset the ratio of cats to rats elsewhere, perhaps, but Borneo is an island. If all the cats die, there are no more cats. To remedy the WHO’s mistake, the RAF flew in twenty-three cats (plus three tons of food and supplies), blessed them to “go forth and multiply,” and let ’er rip.
I imagine it was a carnival of carnivorous and carnal delight. The good folks at the WHO made an honest mistake. After all, the Western education system stresses analytical, deterministic thinking. In this case, it led to this line of reasoning:
- Malaria is bad in Borneo.
- Malaria is carried by mosquitoes.
- DDT kills mosquitoes.
- Therefore, we should use DDT to kill the mosquitoes in Borneo.
Cats killing rats and therefore keeping typhus at manageable levels is what Donella Meadows, in Thinking in Systems, calls a balancing feedback loop. However, when the cat population was out of balance, the natural order of things oscillated, creating what she describes as an overshoot of a reinforcing feedback loop.
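Meadows’s two loop types are easy to see in a toy model. The Python sketch below is my own illustration with invented growth and predation rates, not data from Borneo: with cats present, predation balances rat growth; remove the cats, and the reinforcing growth loop runs away.

RAT_GROWTH = 0.8         # rats increase 80% per season (invented rate)
PREDATION_PER_CAT = 4.0  # rats removed per cat per season (invented rate)

def simulate(cats, rats=100.0, seasons=8):
    history = []
    for _ in range(seasons):
        rats += rats * RAT_GROWTH         # reinforcing loop: growth compounds
        rats -= cats * PREDATION_PER_CAT  # balancing loop: predation
        rats = max(rats, 0.0)
        history.append(round(rats))
    return history

print("With 20 cats:", simulate(cats=20))  # holds steady around 100
print("With 0 cats: ", simulate(cats=0))   # grows past 11,000 in eight seasons

With these made-up numbers, twenty cats hold the rat population in equilibrium; with zero cats, the same growth rule overshoots without anything to check it.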
If the WHO had embraced non-deterministic thinking, they would have taken a much wider view of the problem. The opposite of analytic thinking is systems thinking (a.k.a. appreciation of a system): the ability to see how one thing is part of a larger, connected system. Someone who approached the ecosystem as a system might have thought along these lines:
- Malaria is bad in Borneo.
- Malaria is carried by mosquitoes.
- DDT kills mosquitoes . . . but what else could it kill?
- What else would spraying DDT on the inside of longhouses affect?
- Do we have enough information to make an overall decision?
- We should hold back until we can be reasonably sure we’re going to make things better and not worse for the people of Borneo.
The WHO focused only on the immediate problem and failed to consider how one “solution” might trigger a chain reaction. They failed to see the whole system. This is exactly what I meant earlier about Profound Knowledge: profound change requires Profound Knowledge, and one of the tenets of Profound Knowledge is systems thinking, an ability to see the situation in its greater context.
Determinism and analytical thinking break down a problem into tiny pieces, whereas non-determinism and systems thinking look at a problem’s bigger picture.
Analytical thinkers say, “Mission accomplished. Now, let’s go home.”
Systems thinkers say, “What were the results? Now, let’s make it even better.”
This was bleeding-edge thinking when a sixteen-year-old’s train rolled into Laramie, Wyoming. “The Professor” was going to college.
Ed, as he would come to introduce himself, was used to shouldering a heavy load. He expected things to be no different at the University of Wyoming. In fact, he decided to major in electrical engineering. Electricity at the time was still at the forefront of technological progress, so this was like majoring in artificial intelligence or quantum computing today.
As he studied electrical engineering over the next four years, he supported himself by working as a janitor, shoveling snow, and cutting ice. He also cut railroad ties and worked at a dry cleaner. At some point, he was a soda jerk serving up malted milkshakes. On top of working and studying, he also sang in a church choir and played the piccolo in the university’s marching band. This blue-collar work ethic as well as his continued pursuit of service and the arts were a recurring pattern throughout his life. Ed was self-sufficient yet always found time to help those in need.
Ed graduated in four years but stayed for a fifth to study mathematics before enrolling at the University of Colorado for a master’s in physics and mathematics. After he graduated with his master’s in 1924, one of his instructors encouraged him to continue with his studies, perhaps at Yale. He moved to New Haven, where, three years later, he would earn his PhD in mathematical physics, the basis of probabilities and statistics and the backbone of non-deterministic thinking.
The 1920s were an exciting time for scientific discovery. The year Ed received his PhD, the fifth Solvay Conference on Physics was held. The subject was electrons and photons. In attendance were some of the most famous names in science to this day, including Marie Curie, Erwin Schrödinger, Max Planck, Albert Einstein, Niels Bohr, and Werner Heisenberg. The conference spawned an explosion in scientific thought and discovery based on non-determinism.
Non-determinism played a crucial role in shaping Deming’s worldview and began to lay the foundations for his System of Profound Knowledge. For one, it taught him that long-established and long-held beliefs weren’t necessarily true; the entire structure of the physical world was being rethought and reexamined.
Second, it showed him that the underpinnings of our very existence are random. That idea of randomness would be borne out through his fascination with statistics, which in turn would inform his understanding of variation (the second element in the System of Profound Knowledge).
Third, it taught him to look beyond black-and-white cause and effect. It forced him to look at problems as multifaceted, complex systems, where changing one factor might have far-reaching, and unintended, consequences. This was the beginning of his understanding of the fourth element of Profound Knowledge: Systems Thinking.
But before we go forward, we need to rewind briefly. During the two summers bracketing Yale’s academic calendar, Ed Deming—a university faculty member with a bachelor’s in engineering and a master’s in physics, now working on his PhD—supported himself and his wife (he’d married a schoolteacher, Agnes, in 1923), as ever, by working.
Then, Ed took an internship in a Chicago sweatshop: Hawthorne Works.
Part 1 - Chapter 2: The Jungle in Paradise
All that remains of one of the greatest industrial sites in US history is a stone tower at the corner of Cicero Avenue and Cermak Road just outside Chicago. Few realize its “story is nothing less than the story of the rise and fall of urban industrial America in the twentieth century.”
In the early 1900s, Hawthorne Works was a large factory complex of the Western Electric Company, producing large quantities of telephone equipment. But it was also the Silicon Valley of its time, a hub of innovation, the home of cutting-edge technology, and the object of national fascination. Hawthorne Works played a crucial role in the history of manufacturing as well as in Deming’s own development, shaping the foundation of his ideas that would, decades later, change the world.
Hard work was a fact of life in Rose Cihlar’s immigrant family. Although she was born in Chicago in 1903, her parents were born in Bohemia (before the region became part of the Czech Republic) and immigrated to the US shortly before the turn of the century. They settled down in the Czech-Slovak immigrant community outside Chicago in Hawthorne (now swallowed up by the township of Cicero).
We don’t know when Rose began working, but we do know that in 1919 (making her sixteen years old at the time) Rose Cihlar worked as an assembly line inspector at a nearby factory. Just down the road sat Chicago’s famous meatpacking district, the subject of Upton Sinclair’s The Jungle, published thirteen years prior, which exposed the harsh working conditions and unsanitary environment of such sweatshops.
Hawthorne Works encompassed one hundred buildings and stretched over two hundred acres. It contained over five million square feet of workspace and was known as the Electrical Capital of America. By the time Rose worked there, Hawthorne had become the center of the next great technological advancement: the telephone. While the discovery of electricity a hundred years earlier was seen as magical, the invention of telephony was seen as close to miraculous. Sure, lightbulbs were an upgrade from candles, but a telephone . . . well, there had never been anything like it. Before the telephone, if a beloved aunt went to live with family out West, you may not ever see or hear from her again; it wasn’t called the Wild West for nothing. You could write letters, but that was it. And it could take months for the letters to travel back and forth, or they might just get lost along the way.
The telephone, on the other hand, made it possible to pick up a device and hear your aunt’s voice instantly. You could have a conversation as if she were sitting across the table from you sharing a pot of coffee. It may seem trivial to us today, but it was nearly unimaginable for the average person at the time.

Today, the city of Seattle stands for tech and coffee. Wall Street stands for finance. LA means movies. In the early 1900s, Pittsburgh as well as Gary, Indiana, stood for steel. Detroit was “Motor City.” And Hawthorne, Illinois, meant telephones.

In many ways, Hawthorne looked like a company town, like those of Henry Ford’s car factories or Milton Hershey’s chocolate factory. In the modern era, Phillips Petroleum had Bartlesville, Oklahoma. When I worked as a programmer at Exxon in Houston in the eighties, the company seriously considered turning Conroe, Texas, into a company town. All workers who weren’t physically needed at the pipelines and plants would be relocated to Conroe, complete with housing developments and planned communities. Hawthorne had its own power plant, hospital, fire department, trolley line, etc.
Unlike the typical factory town, where every aspect of the workers’ lives was controlled by the company, at Hawthorne Works everything inside the factory belonged to Western Electric, the manufacturing division of American Telephone & Telegraph (AT&T), but everything outside the factory was privately owned, privately financed, and privately organized. The employees even fielded their own sports teams, which Hawthorne Works sponsored. More importantly for Rose Cihlar, employees created their own savings and loan clubs to lend money to their peers to build their own homes and buy their own cars, allowing them to build personal wealth. No doubt this played an important role in Rose’s ability to later send her son, Gene Cernan, to Purdue. Gene would go on to become commander of the final Apollo mission to the moon.
In the typical company town, employees didn’t have power over their own lives and future. Workers were more or less dependent family members of a massive family business. A few patriarchs at the top dictated the lives of everyone else.
The difference at Hawthorne arose from Western Electric’s approach to its workers, an approach that was considered revolutionary at the time. Workers got not only vacations but paid vacations, not to mention retirement planning and company pensions. In many ways, the company treated its workers more like partners than peasants. It was a beautiful social experiment. And it worked.
According to one source:

Year after year, Hawthorne’s workers turned out an endless stream of complex communications apparatus, engineered by the sharpest minds in the field and assembled by skilled craftsmen. . . . In its time, Hawthorne Works exemplified the “virtuous circle”: a win-win proposition whereby corporate success forged a bond of loyalty with its employees.
There was a sense of community and identity. Employees didn’t merely work on assembly lines. They built the telephones that connected the nation. The factory existed for a decade before the first successful transcontinental phone call was made between San Francisco and New York in 1915. The workers of Hawthorne understood the significance of the work that came out of their factory—and they were a part of it. And so too was W. Edwards Deming.
Deming experienced this esprit de corps firsthand while he interned at Hawthorne Works during the summers of 1925 and 1926. Though he wouldn’t fully appreciate it until after he left, the factory was a testing ground for his Theory of Knowledge (one of the four elements of his System of Profound Knowledge) and home to the creator of the Theory of Variation. Crucially, his time at Hawthorne gave Ed an appreciation for how human psychology affected a system.
The entire operation at Hawthorne was a masterwork of systems thinking. Without his internship at Hawthorne, who knows how different history would have been? Decades later, he would find the same kind of relationship between Japanese companies and their workers. There was a profound sense of pride. The workers at Toyota weren’t making just rivets and welds but the cars their neighbors drove. Products went out into the world representing Japan and helping to rebuild the nation. Japanese workers believed they were doing something that mattered. Hawthorne was the seedbed for Deming’s understanding of Profound Knowledge.
Before Hawthorne, management styles were largely predicated on Taylorism and Fordism. By the time Deming came to Hawthorne, both Frederick Winslow Taylor and Henry Ford had left an indelible imprint on how to manage workers. Ford’s genius wasn’t the automobile (that had already been invented) but rather the efficient assembly line. He spent countless hours creating production systems and then endlessly improving them. When he first began to make cars, building one took a bevy of specialized craftsmen half a day. When he opened his new factory in 1913, it took only ninety minutes to create a Model T. The drawback for workers was that Ford saw them as inconvenient cogs in the machine. He sought to standardize operations to the point that a worker could be as interchangeable as any other piece of the system. Where Ford’s approach was driven by practical matters, Taylor’s approach was more scientific in nature, giving rise to the term “scientific management.”
In layperson’s terms, where Ford treated people like cogs in a machine, Taylor approached workers as if they were machines themselves—machines that could be optimized for maximum efficiency, given the right physical and psychological conditions.
Fordism and Taylorism were the mainstays of American management throughout the twentieth century. But at Hawthorne, Fordism and Taylorism found their first challengers. Beyond being an impressive industrial site, Hawthorne Works became a lab of sorts. “The Works’ bustling shops provided the perfect setting for testing new manufacturing methods, and company officials gladly served up employees as subjects for groundbreaking studies.”
That is, the workers became lab rats. Psychologist Elton Mayo conducted a social experiment at Hawthorne Works from 1924 to 1927 to prove the importance of people—not machines—to productivity.
His social experiment measured the change in workers’ output at different levels of lighting. He found that any change in lighting increased employee productivity. However, he later discovered that the rise in output came from workers knowing they were being closely watched, not from how much light they had available. This discovery was dubbed the Hawthorne Effect, the act of subjects changing their behavior in response to being observed.
Two additional studies, the relay-assembly tests and the bank-wiring tests, followed Mayo’s illumination tests. Altogether, the studies assumed the label “Hawthorne experiments” and became the basis for the school of human relations.

Deming referred to Ford and Taylor’s influence as “living in prison under the tyranny of the prevailing style of management.”
Today, we have shifted into the Knowledge Economy, where the most prized skills are innovation and creativity—the antithesis of Ford’s approach to management and a fundamentally different perspective than Taylor’s. And yet, the effects of Fordism and Taylorism can still be seen everywhere. To be blunt, this perspective is based on the idea that workers don’t want to work. That given the opportunity, they will shirk as much as possible and be as lazy as they can. There’s an assumption of underlying antagonism between “them” (the workers) and “us” (the managers).
In the middle of Mayo’s Hawthorne Effect studies, Ed rolled into town. He spent months researching telephone transmitters. And that time left a lasting mark on Ed, ultimately leading to one of his four elements of Profound Knowledge, that of human psychology and motivation. His future views on management would stand in direct opposition to the methods of Ford and Taylor, providing an alternative to the standard way business “had always been done” in that time.
I imagine Deming chose his summer job for the same reason a college kid might intern for free in Silicon Valley: to have a front-row seat at the cutting edge of innovation, landing a role in research and development. While his Ivy League education may have prepared him to appreciate and absorb the management concepts floating around Hawthorne, it was his rural raising that prepared him for the on-the-ground reality and allowed him to empathize with the harsh conditions of the working-class people all around him. The fact that Hawthorne was the foremost industrial site in the nation didn’t change the horrendous working conditions endured by the thousands of people employed in the factories. It was still a hardscrabble life, just on an industrial scale.
By the time Deming stepped off the trolley line in 1925, Hawthorne Works employed around forty thousand people, mostly women. A friend of his had forewarned him to “stay well away from the stairway when the whistle blew at the end of the day. ‘Those women will trample you to death. There won’t even be an oil slick.’”
Later, Deming would reflect, “It was hot. It was dirty. No wonder they wanted to get out.”
Ed held a particularly low view of a mainstay of American factories at the time: piecework, where a worker got paid according to the number of units produced or tasks performed. Knowing Hawthorne operated this way, it’s likely Rose was paid according to how many telephone assemblages she inspected.
This type of pay scheme incentivized workers to focus on quantity, not quality. It’s harder to take pride in workmanship if you know that everyone who worked on the unit before you did a rush job. No wonder managers working under Taylorism were so suspicious and antagonistic toward the line workers. Deming would later observe, “Piecework is man’s lowest degradation.”
Although Hawthorne workers took pride in the bigger picture, they still operated under a system where shoddy workmanship was incentivized from the beginning. Ever the philomath, Ed was curious about everything around him. It was his good fortune to wind up at Hawthorne Works, where he could be exposed to the latest in production processes, the social experiment that was the town of Hawthorne, and the production and management experiments being conducted by Mayo and others. And I cannot imagine that he worked there for months without indirectly coming across the work of Dr. Walter Shewhart. Unbeknownst to Ed, his relationship with the AT&T research physicist would become a defining factor in his life, as we’ll soon see.
Ed’s time at Hawthorne Works exposed him to new ideas about manufacturing and labor. Workers’ autonomy outside of the factory led to a community not dictated by company patronage but one led by the community itself. The result was independent-minded individuals like Rose Cihlar. Without her experience at Hawthorne Works, I don’t know that she’d have gone on to work for an electrical manufacturer as a married woman with children in those early decades when women were still expected to adhere to their traditional domestic roles. Without her own income, her son wouldn’t have gone to Purdue nor started down the path to becoming the commander of the Apollo 17 mission.
Hawthorne Works’ progressive arrangement prepared Ed to appreciate how companies and their employees could work together for the common good. When he landed in Japan decades later, he, more than any of the other Americans there, immediately grasped how crucial the special arrangement between Japanese companies and their employees was and why it led to superior quality. This line of thinking would become what I believe is the striking difference between the System of Profound Knowledge and Western management: Ed discovered a human-centered approach to systems, in general, and business, in particular.
But before he could begin to fully articulate his System of Profound Knowledge, he had to learn from the master of variation . . . and to understand variation, we have to first appreciate the history of quality control.
Deming’s Journey to Profound Knowledge - How Deming Helped Win a War, Altered the Face of Industry, and Holds the Key to Our Future - Part 1 - Chapter 3: The Birth of Quality Control and Standardization
Violins and violas, in varying stages of progress, hang from the rafters, drying. On the workshop floor, apprentices toil away at their workbenches. An older man unscrews the wooden clamps from the cello he’s crafting. Behind him, a younger man intently files the scroll piece of a viola.
Master Stradivari comes into the workshop and begins to inspect their work. He strolls over to a young man carefully sanding the upper bout of a nearly finished violin. The young man lays down his sandpaper and steps back with his head down and his hands clasped. Stradivari carefully picks up the instrument to look it over.
“Very good,” he says. “Exquisitely worked. You’ve crafted a jewel, my boy.” He takes a step as he continues. “Perfect for a courtesan or a priest to pluck after supper or polish Sundays after mass. In other words, . . .”—his tenor and countenance change as his gaze moves from the instrument to the young man’s face, holding out the violin as if he were presenting it to the apprentice—“. . . this violin will never bear my name.”
He spins around and—WHAM!—slams the violin against the bench, breaking it into splinters.
Stradivari shouts in the young man’s face, “Put your anger into your work, boy!” Then he angrily strides out of the workshop. As he does so, he shouts, “Stay with me and learn!”
This scene from the movie The Red Violin is, more or less, the story of tools, quality, and humankind for the last three million years. One person learned how to make a certain type of tool from “a master.” Then they made said tools one at a time by hand. A potter made one pot at a time. One blacksmith made one plow. One cooper, one barrel. One luthier, one violin.
The quality of anything humans made, by and large, depended on the skill of the craftsman who made it. Everything built, crafted, or made was unique. Craftsmen might get decently good at consistently churning out high-caliber products, but each one was still one of a kind.
The eternal question of quality has always been this: “How good is good enough?” Carving a walking stick to hike the Appalachian Trail? Quality isn’t much of an issue. Carving the wood for a Stradivarius? Nothing less than absolute perfection will do.
If you were a soldier on the battlefield, you prayed that your new sword wasn’t made by the village blacksmith when he was blackout drunk. If there were barbarians at the gate, a feudal lord might have a moment of panic remembering that he’d gone with the lowest bidder to build said gates.
History records a few times when we progressed beyond the craftsman model of production. About six hundred years before and a hundred and forty miles east of Stradivari’s workshop, the city-state of Venice created a massive assembly line called the Venetian Arsenal.
The shipyard there could assemble an entire seaworthy vessel from prefabricated pieces in as little as a day. (I doubt the shipwrights in Venice knew it, but they had rediscovered an idea from over a thousand years earlier and a thousand miles south, when Carthage used the ship assembly-line method during the First Punic War with the Romans.)
At roughly the same time, the Chinese state of Qin mass-produced crossbow parts, which played a role in conquering its neighbors and establishing the first Chinese imperial dynasty.
But outside of a handful of examples like these, basically everything we made for thousands of years was one at a time. Stradivari’s workshop was the story of humanity’s progress and civilizations’ development. But the history of production and quality rounded a corner thanks to Thomas Jefferson passing around a pamphlet—the result of which would come to necessitate Walter Shewhart’s statistical process control and the theory of variation.
In 1785, the United States had been a country for only nine years. Jefferson wouldn’t become president for another sixteen years. In the meantime, he was ambassador to France, where he met a gunsmith named Honoré Blanc. Blanc had unknowingly copied the Qin of China: weapons that used interchangeable parts.
With Blanc’s invention, if the flintlock of your musket broke, you didn’t need to return it to a gunsmith to handcraft another one. Instead, you could pick up a new flintlock from a pile of parts and be back in the fray before you could say, “Wait, tell me again why I’m dying for some schmuck I don’t know?”
Jefferson saw the importance of the invention but couldn’t convince Blanc to move from France to America, which was still being carved out of the wilds. Instead, Jefferson wrote to the first Secretary of War, General Henry Knox, explaining Blanc’s ingenious system and urging its adoption.
In 1798, thirteen years later, the US government granted a contract to Eli Whitney to manufacture ten to fifteen thousand muskets. (This was somewhat ironic, seeing as how he’d never created a musket in his life.) About ten months into Whitney’s government contract, the secretary of the treasury sent him a “foreign pamphlet on arm manufacturing techniques”—almost certainly French and, therefore, almost certainly Honoré Blanc’s. By 1801, Whitney had not only missed the contractual deadline but the quantity as well. By a factor of a thousand. However, with the ten guns he had, he demonstrated to Congress that the parts from any one musket could be switched out with another. If a gun broke, the Army wouldn’t have to buy a whole new gun—just a replacement part. The legislators quickly mandated that all such equipment be standardized.
I imagine he failed to mention that it took more money to manufacture ten muskets with interchangeable parts than it did for a gunsmith to craft ten muskets. It did, however, buy him more time and earn him political support.
While Whitney didn’t invent interchangeable parts, he did a successful job of evangelizing the idea. More and more companies, and especially armories, began implementing the idea during the 1800s. On top of this, US manufacturers began shifting from hand labor to relying more heavily on mechanization.
They went from skilled craftsmen using hand tools to semiskilled laborers operating machines. These two developments led to what came to be known as the American System. While the Industrial Revolution had already begun in Britain, the American System was a profound evolution in industrial development. By 1880, the US, Europe, and elsewhere had entered what historians term the Machine Age. While these same historians might point to interchangeable parts as a key development, I think they’re missing the point. Interchangeable parts were the result, but standardization was the catalyst.
To illustrate my point, consider the ubiquitous cargo shipping container. Whether you’re in New Orleans, New Plymouth, or Newfoundland, containers look exactly the same. It wasn’t always that way, however. As The Box: How the Shipping Container Made the World Smaller and the World Economy Bigger so insightfully reveals, before the size of containers was standardized, maritime trade vessels had to load cargo by hand.
Crates of fruit, trunks of clothes, sacks of potatoes, individual automobiles, raw lumber—you shipped it however you wanted to. It was up to the longshoremen on the piers to figure out how to most efficiently load it onto ships. And each ship was different. Your goods might arrive in the hold of a cruise liner or belowdecks of a barely seaworthy trawler.
Once the world settled on a standard cargo container size, everything could be planned for. Ships were built to hold the exact dimensions. Cranes could be computerized because they needed to work with only one type of container. Rates could be standardized because transporters knew the dimensions of their loads; cost became a simple matter of weight, distance, and priority.
Trucks and trains could be configured to all carry the same size box, meaning you could load a container of your products onto a railcar, see it lifted onto the bed of a tractor-trailer, set on a ship, unloaded, and transported to your customer . . . without ever opening its doors. Maritime trade went from relying on specialized skill sets to standardized processes—making everything far, far cheaper to transport and trade.
When I worked at the IT company Docker, The Box and its underlying principles were the founder’s bible. By standardizing the way we created data servers, we could format thousands of servers in the same amount of time it would take to conventionally format one.
Standardization: That’s what changed the world. That’s what spurred the Machine Age and everything that came after. Factories standardized their products and processes. Manufacturing quality had evolved from one craftsman’s skill to an era of standardization. Standardization improved production, but despite all the technological innovations and progress, production still hinged on the same problem Stradivari had: sometimes the product didn’t come out right. You’d have to scrap the whole thing and start over. Interchangeable parts were, in fact, the turning point in the history of quality control and led to the theory of variation.
But even with interchangeable parts, producers soon discovered that exact specifications were unrealistic. No matter how precise the machines and the processes, the outputs all slightly varied from each other. This spawned a need to allow for variance in product specifications. Think of the notion of an exact fit as a deterministic approach (as we learned about in Chapter 1). Specifications that allow for a certain variance—or tolerance—are more in line with a non-deterministic approach.
The people in charge of a manufacturing process had to decide the limits of what was acceptable. How much variation would they allow or tolerate in the finished products? They called these “tolerance limits.” Industrial producers switched from trying to achieve an exact fit to allowing products to be manufactured within certain tolerance limits. In the beginning, these limits were simply named “go/no-go.”
Back at Hawthorne Works, this was essentially Rose Cihlar’s job. She was a quality inspector. It was her job to act like Stradivari. As telephone systems rolled off the assembly line, Rose carefully inspected each one. Unlike Stradivari, she had some tools to test the specification tolerance. If the telephone fell within those tolerance limits, it was a “go.” Otherwise, she marked it as a defect—a “no-go”—and tossed it in the reject bin. Out of the forty thousand or so workers at Hawthorne, five thousand were inspectors like Rose, inspecting and rejecting all day long. The hundred thousand or so individual parts and pieces that went into a telephone could be scrapped just like that. While the factory manufactured telephones, its second-biggest output was scrap.
With Stradivarius violins, each one was the work of a master craftsman. Master Stradivari allowed no defects in instruments coming out of his workshop. Realistically, though, zero defects is impossible. No two violins are exactly the same. Each one has some flaw, no matter how tiny and insignificant.
Now, imagine a huge factory making one hundred thousand different components to be assembled into one telephone. Imagine mass manufacturing thousands of telephones. That means millions of factory systems and processes. The idea of creating every single component with absolute perfection is ludicrous. Stuff happens. Every single finished product slightly differs from all the others. As Deming noted in his final years, variation is a part of life.
Imagine the cannons on a pirate ship. When iron foundries first made cannons, they would create a vertical clay mold in the shape of the cannon’s cylinder—a long shaft. Next, they stood a long clay column in the middle of the shaft to form the bore. (Otherwise, the result would be a solid iron rod.) They’d pour the iron into the mold and let it cool. Then they’d break all the clay out of the middle. The result was a hollow cylinder.
As you might imagine, this was not a precise process. Sometimes workers would make the clay column a little too thick, resulting in thinner cannon walls. Too thin, and the cannon could explode like a bomb, killing the pirates and maybe sinking the ship. Sometimes workers would make the clay column too thin, resulting in thicker cannon walls and a narrower bore. Too narrow, and it wouldn’t take a cannonball at all; it was just an expensive piece of useless metal. The foundry manager had to decide how thick was too thick and how thin was too thin. These tolerance limits dictated if the cannon was a go or a no-go. Despite these limits, there was still considerable variation between cannons.
For the sake of explanation, let’s say the “perfect” size for the mouth of the cannon was 84 mm across. But because nothing was ever perfect, the foundry owner would allow anything between 83 mm and 85 mm to pass. Anything inside that range was a go; anything outside was a no-go and sent to the scrap heap.
A Swiss engineer in the 1700s named Jean Maritz came up with a better way to manufacture cannons. He did away with the interior clay column altogether and forged what was essentially a huge, thick iron rod, then used a drill to bore out the inside. His method resulted in much more precise cannon sizes.
Continuing our example, let’s say his way resulted in less variation. The cannons’ mouths might vary between 83.5 mm and 84.5 mm. Less variation meant a “tighter” tolerance limit. The higher the cannon’s quality, the more effective its range and accuracy. The tighter a cannonball fit in a cannon’s mouth, the less air could get around the cannonball. The less air, the more explosive the force of the gunpowder on the cannonball, allowing the cannon to shoot farther.
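To make go/no-go inspection concrete, here’s a minimal Python sketch; the 83–85 mm limits come from the invented cannon example above, and the bore measurements are made up for illustration.

```python
# Hypothetical go/no-go inspection against tolerance limits, echoing the
# 84 mm cannon-bore example above. All measurements are invented.
LOWER_MM, UPPER_MM = 83.0, 85.0  # the foundry's tolerance limits

def inspect(bore_mm: float) -> str:
    """Return 'go' if the bore falls within tolerance, else 'no-go'."""
    return "go" if LOWER_MM <= bore_mm <= UPPER_MM else "no-go"

# Invented bore measurements from a day's casting run
for bore in [84.2, 83.1, 85.6, 84.0, 82.7, 84.9]:
    print(f"{bore:.1f} mm -> {inspect(bore)}")
```

Notice that this style of inspection says nothing about why the 82.7 mm casting failed; it only sorts finished goods into two bins, which is exactly the limitation Shewhart will attack shortly.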
Growing up poor as he did, I imagine frugality was almost in Ed’s DNA. The sheer amount of waste generated at Hawthorne Works must have boggled his mind. Surely, there was a way to improve this.
From his background in non-determinism, Ed understood that randomness and variation are simply facts of life . . . even in standardized manufacturing processes. He must have mulled for hours over how to solve the process deviations and defects inherent to Hawthorne’s operations.
As he came to find out, the answer lay with mathematics and statistics. Little did Deming know that he was about to get a front-row seat to the next turning point in the history of quality, in the form of Dr. Walter Shewhart.
Deming’s Journey to Profound Knowledge - How Deming Helped Win a War, Altered the Face of Industry, and Holds the Key to Our Future - Part 1 - Chapter 4: The Root of All Evil
To understand Shewhart, we must first understand the theory of variation, and to understand variation, we must not only understand the history of quality but the history of measurement and probability as well.
Take, for instance, the time during the Peloponnesian War. Sparta besieged the city-state of Plataea. Finding themselves at an impasse, the Spartans built what amounted to a containment wall around the city. They stationed a few guards, and the rest went home. At some point, desperation, starvation, or deprivation would open the city gates; the Spartans simply needed to wait them out.
If necessity is the mother of invention, desperation is its muse. A soothsayer and a general hatched a bold plan: build ladders to scale the walls under the cover of night and escape between the enemy’s encampments. After making the decision, the only real question was: how tall did the ladders need to be?
Since the Plataeans couldn’t exactly climb to the top of the walls and use a tape measure in one hand while fending off Spartans with the other, they devised another means. They counted the bricks from afar.
According to Thucydides’ History of the Peloponnesian War, the Plataeans counted the layers of bricks in a section of the Spartans’ wall “facing the town, at a place where the wall had accidentally not been plastered. A great many counted at once, and, although some might make mistakes, the calculation would be oftener right than wrong; for they repeated the process again and again. . . . In this manner they ascertained the proper length of the ladders.” It was a narrow escape, but they pulled it off.
Saved by standardization. The bricks were all made of a common—a.k.a. standard-ish—size. All they needed to do was count how many brick layers there were and multiply that by the height of an everyday brick. Humans being human, though, not everyone agreed on how many layers of bricks there were. So, they had a lot of people count. Out of ten people, let’s say that six counted forty layers and two of them counted forty-one. Guy number nine was dead sure there were only thirty-nine layers. And the tenth guy reported fifty-nine, though he sort of slurred his words as he did.
What number should the ladder makers use? Thirty-nine? Forty? Forty-one? Fifty-nine? All of them couldn’t be right, of course; three must be in error. The Plataeans could have sent out even more people to count the layers, but at some point, the Spartans would start getting suspicious. They’d have to use the numbers they had. If they built the ladders too long, it’d mean more men to carry each one, making it harder to be stealthy. If they built them too short, well, what was the point? There was no certainty. It came down to a question of confidence: how sure were the would-be escapees in each of the four numbers?
They had zero confidence in the guy who counted fifty-nine bricks high. Nobody else counted anything near his number, and it sounded like he’d found the last bottle of wine in town, anyway. Thirty-nine was in the ballpark, but only one person got that number. In the realm of “how good is good enough,” they’d much rather err on the side of caution. The two people saying it was forty-one were known to have excellent eyesight, and most of the others counted forty layers, anyway. Everyone could feel fairly confident that the wall was somewhere around forty to forty-one bricks high. Whichever number they settled on, their estimates were good enough to get them over the wall.
In the end, out of the two hundred and twenty who made the attempt, two hundred and twelve escaped. This is statistics, variance, and probability in a nutshell (obviously, it’s more complicated than that). It’s not about certainty; that’s determinism. One of the linchpins of Deming’s System of Profound Knowledge is understanding uncertainty; that is, applying statistics to variation (which in this case would be the differences between how many bricks each person counted). This allows us to quantify certainty versus uncertainty.
Put another way, statistics is about how confident you feel when dealing with uncertainty. It’s about how probable an outcome is. If you let go of an apple, there is an extremely high probability it will hit the floor. If you pick any American, there’s a 50% chance they earn below the median income. If you pick a spot on the globe at random, there’s a 71% chance you’ll hit water. It’s all about chance and probabilities.
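As a toy illustration of what the Plataeans did by feel, here’s a hedged Python sketch using the invented counts from above. Throwing out the obvious outlier and averaging the rest is one crude way to turn many imperfect counts into a single estimate with a measure of confidence; the 0.3-meter brick height is an assumption for the example.

```python
from statistics import mean, stdev

# Invented brick-layer counts from the ten Plataean observers
counts = [40, 40, 40, 40, 40, 40, 41, 41, 39, 59]

# Discard any count more than five layers from the middle of the pack;
# the wine-soaked observer's 59 gets thrown out here.
middle = sorted(counts)[len(counts) // 2]
core = [c for c in counts if abs(c - middle) <= 5]

BRICK_HEIGHT_M = 0.3  # assumed height of a standard-ish brick, in meters
estimate = mean(core) * BRICK_HEIGHT_M
spread = stdev(core) * BRICK_HEIGHT_M

print(f"Estimated wall height: {estimate:.1f} m +/- {spread:.1f} m")
```

The point isn’t the arithmetic; it’s that repeated imperfect measurements, taken together, let you say how confident you are in the answer.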
Deming gives us a great, simple example of variability. If you ask three people to count the number of people in the room, you might get three different answers. The answers depend on each counter’s definition of “the room.” Should the count include the people serving food or be limited to the guests? Should it include the open patio attached to the room?
Nobel Prize–winning physicist Percy Williams Bridgman was also concerned with variation when creating synthetic diamonds using extreme pressures. His gauges kept breaking down under those extreme pressures, so he had no idea what pressure levels he had reached. This work led him to describe a general philosophical doctrine called operationalism, based on the idea that we can know the meaning of something only if we have a way of measuring it. In 1927, Bridgman published The Logic of Modern Physics, examining how scientists define measurements. This work later inspired Shewhart’s and Deming’s ideas around what Bridgman coined an operational definition. Deming defined an operational definition in his book The New Economics as a procedure agreed upon to translate a concept into a precise measurement.
Operational definition became a key component of Deming’s theories of knowledge and variation.
While Ed was interning at Hawthorne Works, a man at the research arm of Western Electric was grappling with how to minimize manufacturing defects. Dr. Walter Shewhart believed there had to be a more economical way than simply standardizing production and using go/no-go tolerance limits. He wanted to minimize the variation between each telephone made. Being a physicist, his instinct was to solve the problem mathematically. Whereas basic statistics deals in overall averages, he used a statistic that measures how far individual results spread around the average—known as standard deviation. While standard deviation had historically been used in some areas of science, particularly those dealing with non-determinism, he was the first to apply non-deterministic methods to manufacturing. Curiously enough, standard deviation came about, in part, from counting stars.
At its core, astronomy has always been about measuring. When do certain planetary bodies appear and disappear? What’s the distance from the horizon to this constellation, and how does that change over the four seasons? How long is daylight throughout the year? For that matter, what’s a year? Thousands of observations by amateurs and professionals alike, all from different parts of the world, all writing down everything by hand—can you imagine?
Astronomy began making great strides during the Scientific Revolution. In 1543, Copernicus published his observations supporting his theory that the planets revolve around the sun (not the Earth).
In the early 1600s, Johannes Kepler wrote the laws of planetary motion. Based on his observations, Kepler noticed a pattern in planets’ sizes and movements across the sky. He spent years coming up with a mathematical formula that could explain why planets behaved the way they did. He wanted a perfect formula but would settle for whatever best fit the measurements he had. Like the early astronomers before him, he had dismissed the idea of elliptical orbits; for years, he’d imagined that planets traveled in perfect circles around the sun.
No matter how hard he squeezed, the data told him he was wrong. He tried an egg-shaped orbit, which didn’t work. Since circles and eggs weren’t the answer, he finally tried an ellipse. It fit. Not perfectly, mind you . . . but good enough.
Armed with this knowledge, he flipped things on their head. He had reverse-engineered the formula from the observations. Now that he had the formula, he used it to predict where other planets in the solar system would be—from observed measurements to a formula that best fit the data. From a best-fit formula to predicting data—statistics at its finest. Armed with this information, Kepler believed there should be a planet between Mars and Jupiter. He was wrong . . . but he was kind of right too.
One hundred and forty years after Kepler’s death, another astronomer used the additional century’s worth of data and developments to create his own predictive formula, dubbed the Titius-Bode law. It, too, predicted a planet between Mars and Jupiter, as well as predicting the distance of a planet beyond Saturn.
Lo and behold, in 1781 astronomers found a planet beyond Saturn: Uranus.
The Titius-Bode law gained credibility. Soon, astronomers were searching for the “lost” planet between Mars and Jupiter. After conversing with the discoverer of Uranus, a Hungarian astronomer assembled a crack team of twenty-four fellow astronomers. Their purpose: to coordinate their efforts to systematically search for it. He dubbed them the Celestial Police.
One of these astronomers was an Italian priest in Palermo named Giuseppe Piazzi. As fate would have it, he had already discovered the lost planet before his invitation to join the Celestial Police arrived in the mail. As irony would have it, he discovered it by accident.
That New Year’s Day in 1801, Piazzi didn’t know what he was looking at. It looked like a comet, but it acted like a planet. He made his observations for a little over a month before he became ill. By the time he’d sent notes to other astronomers who could observe the unidentified orbiting object, it had been obscured by the sun. By the end of the year, Piazzi knew it would have almost completed its orbit and should be visible again . . . but he couldn’t find it. He’d only been able to make twenty-four observations—not enough to predict where it would reappear.
A twenty-four-year-old German child prodigy decided to tackle the problem of how to use scant data to create a formula to predict where this thing should be. In three months, Carl Friedrich Gauss developed “Gauss’s method.” (With just a bit of tweaking, Gauss’s method would later become the basis for global positioning systems, or GPS.) Using Gauss’s method, the Hungarian chief of the Celestial Police spotted the dwarf planet Ceres, the largest object in our solar system’s main asteroid belt between Mars and Jupiter, just a few days shy of the one-year anniversary of Piazzi’s first observation.
Six years later, Gauss abandoned pure mathematics (he didn’t think it was worth it, apparently) and became an astronomer himself. While pursuing astronomical calculations (literally), Gauss laid the foundation for the statistics we’re talking about here.
Statistics was Gauss’s way of quantifying the randomness of his and other astronomers’ observations. With the Plataeans’ ladders, being a foot off was good enough. When dealing with pinpricks in the night sky that constantly change position, he needed to be a lot more precise. Gauss needed to remove as many measurement errors as possible. His method was essentially a way to quantify what the Plataeans did instinctively. He called these variations the mean error; the statistician Karl Pearson would come to refer to this as standard deviation—a core principle Shewhart used to create Statistical Process Control.
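In modern terms, Gauss’s mean error and Pearson’s standard deviation both quantify how far repeated measurements scatter around their average. A minimal sketch, with invented observations:

```python
from statistics import mean, stdev

# Invented repeated measurements of the same star's position (arbitrary units)
observations = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99]

estimate = mean(observations)  # the best single guess
spread = stdev(observations)   # Pearson's "standard deviation" of the scatter

print(f"estimate = {estimate:.3f}, standard deviation = {spread:.3f}")
```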
It’s eerie how much of Walter Shewhart’s life echoes Deming’s (or maybe the other way around). He was born nine years before Deming on a rural farm in western Illinois. In 1910, he enrolled at the University of Illinois at Urbana-Champaign. He earned his doctorate in physics there in 1917.
He taught physics in Wisconsin, but that changed when Congress voted to enter World War I after the sinking of the Lusitania. The US Army, Navy, and budding Air Force placed $22 million worth of orders—about $500 million today—for communications equipment with Western Electric. Spurred by a sense of patriotism, Shewhart left academia to join the company’s engineering department. He moved his family to Brooklyn and commuted to his office at Western Electric’s building at 463 West Street overlooking the Hudson. Of course, the vast majority of Western Electric’s manufacturing took place at Hawthorne Works. Sometime between the end of World War I and 1922, Shewhart became interested (some might say obsessed) with improving production quality. He was convinced that he could take the mathematical statistics concepts he’d learned as a physicist and apply them to production processes. On a Friday in May 1924, he summarized some of his ideas in a one-page memo to his boss, George Edwards.
As Isaac Asimov is credited with saying, “The most exciting phrase to hear in science, the one that heralds new discoveries, is not ‘Eureka!’ (I found it!) but ‘That’s funny . . . ’” That’s how I feel about Shewhart’s memo. It should have been a eureka moment. Instead, there was no fanfare. No ticker tape parade.
Nobody knew the history of manufacturing had just reached one of its most important turning points.
Shewhart created a statistical system to improve production quality. He took the idea of tolerance limits and flipped it on its head with statistics. Instead of looking at telephones as good enough or not—that black-and-white deterministic thinking the world was used to—Shewhart quantified the variation.
He could now tell the managers at Hawthorne what percentage of products fell statistically within their tolerance limits. Far more importantly, it gave managers a method to track variation. If you can track variation, then you can trace it to better understand why a production line creates defects and detect them much earlier in the process.
In essence, Shewhart created a way to continually improve any manufacturing line. This was the first time factory managers had been given a tool to let them manage the uncertainty in production.
His method, Statistical Process Control, let managers compare variation across workers and machines. The more managers could find and fix the causes behind factory defects, the more products they could produce that fell within their tolerance limits, thus improving their overall quality. And higher quality meant less waste; less waste allowed manufacturers to do more and more with less and less.
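At its core, a control chart is simple arithmetic: take the process average, set limits three standard deviations above and below it, and flag anything that falls outside. Here’s a minimal Python sketch of one common variant, the individuals chart, with invented measurements; real SPC distinguishes many chart types and rules that this toy ignores.

```python
from statistics import mean

# Invented diameter measurements (mm) from a production line, in time order
samples = [84.1, 83.9, 84.0, 84.2, 83.8, 84.1, 85.9, 84.0, 83.9, 84.1]

center = mean(samples)

# Estimate sigma from the average moving range; the classic individuals
# chart does this so that a lone outlier can't inflate its own limits.
moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
sigma = mean(moving_ranges) / 1.128  # d2 constant for subgroups of size two

ucl = center + 3 * sigma  # upper control limit
lcl = center - 3 * sigma  # lower control limit

for i, x in enumerate(samples, 1):
    if not lcl <= x <= ucl:
        # Out-of-limits point: likely special-cause variation, so investigate
        # the machine, material, or operator at that moment in time.
        print(f"sample {i}: {x} mm falls outside [{lcl:.2f}, {ucl:.2f}]")
```

Run against these invented numbers, the chart flags the seventh sample, which is the chart doing its job: separating the ordinary wobble of the process from a point that deserves investigation.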
Think of an automaker doing a recall because of a braking issue. Tens of thousands of cars all have to be brought into their local dealerships to get their brakes fixed, costing the car company a fortune. In general, it’s cheaper to produce higher-quality brakes in the first place than to fix a mistake after the fact.
Shewhart’s genius was taking statistics from academia—physics, astronomy, biology, etc.—and applying it to Hawthorne’s assembly lines. He wanted to cut down on how much Western Electric wasted. For the first time in history, we had finally stepped beyond a simple go/no-go approach to making things.
Tracking cold, hard numbers allowed managers to track patterns in the data. Shewhart’s method was a paradigm shift. It was the first example of anti-Taylorism, where using math and statistics enabled management to see defects as results of the process instead of the workers. Before this, most managers viewed their employees like Stradivari did his apprentices. If the product was bad, it must be the workers’ fault. Stradivari never stopped to consider whether the woodcutter had sold him wood from a diseased tree. Never stopped to see if the apprentices’ tools were sufficient. Never once considered that he himself might be a poor teacher. His reaction was to blame the worker.
By 1929, Shewhart had formalized this new method of tracking and tracing variation. Basically, he applied the scientific method to manufacturing. Before this, manufacturing was a linear process. You figured out what you wanted and how many. You made them. Then you inspected and threw away the defects. Shewhart turned this into a cycle, what Ed would later call the Shewhart Cycle: Figure out what you want, make it, inspect it, figure out what caused the defects, go fix it, and then go through the whole cycle again, using feedback from your mistakes to continuously improve production quality.
Even after Deming tweaked the Shewhart cycle, he still referred to it throughout his life as the Shewhart wheel. Despite this, his students in Japan called it the Deming cycle. Today, you might recognize it as the “plan, do, check, act” method, or simply the PDCA cycle.
In his later years, however, Deming came to rename check as study. To his way of thinking, check was too much like the go/no-go inspection process of checking manufactured products. He believed the better term was study, which implied approaching the results with a scientific curiosity to investigate and understand why things turned out the way they did.
Shewhart classified defects as being caused by one of two things. The first was chance, what Deming would later call common cause. These were variations that could be predicted and should be planned for. The second was assignable, or what Deming would call special cause, causes that couldn’t be predicted and shouldn’t be planned for.
Variation occurs in all processes. For Shewhart, as long as the variance fell within standard-deviation limits, the variance was inherent to the manufacturing process (i.e., chance or common-cause variation). Sometimes the variation’s cause was an outlier, like an employee not being trained well or a machine that broke. That’s not something Hawthorne’s managers could have predicted. These special cases or anomalies are not part of normal or standard operations; hence, assignable or special-cause variation.
The real value of Statistical Process Control is that it allows you to observe variation and look at random versus non-random patterns. A random pattern represents a stable process, a.k.a. a process “under control.” A non-random pattern is a useful predictor of potential defects, signaling an amount of uncertainty in the process. And here is the root of all evil: misidentifying variation.
Let me illustrate. Say I have an iPhone app for my thermostat. I like to keep my home at 70°F for the dogs when I’m away for the day. I expect the temperature to vary from 68°F to 75°F throughout the day. This is normal or common-cause variation; it’s to be expected. I don’t need to worry or intervene.
However, 80°F would be problematic for my pets. If I noticed the temperature on my iPhone app trending upward from 68°F to 72°F to 74°F—that is, a non-random pattern—I might suspect that something was off. Maybe my A/C is on the fritz. This might signal a problem leading to a special cause, meaning I might need to intervene before the temperature becomes harmful to my pets.
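Here’s a toy version of that trend check, with invented thermostat readings. Real control charts use formal run rules (for instance, flagging several consecutive rises), but the idea is the same:

```python
# Invented hourly thermostat readings (degrees F) while away from home
readings = [70, 69, 71, 68, 70, 72, 74, 75]

RUN_LENGTH = 3  # treat this many consecutive rises as a non-random pattern

run = 0
for prev, cur in zip(readings, readings[1:]):
    run = run + 1 if cur > prev else 0  # count consecutive increases
    if run >= RUN_LENGTH:
        print(f"upward trend ending at {cur}F: check on the A/C")
        break
```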
Shewhart understood that all processes have variation, but patterns of variation in a process can reveal insights into future defects. That’s why Statistical Process Control is so phenomenal: It allows you to statistically predict defects before they occur.
In The New Economics, Deming explained variation with the example of an insurance actuary who was constantly late. After reading his true-to-life example, I realized I’d had the exact same experience. At one of my startups, I had a software developer—let’s call him Bob—who was one of the best I’d ever worked with. Gifted, he took to code like a fish to water. He did have one annoying habit: he was chronically ten to fifteen minutes late every morning, getting to our thirty-minute morning team meeting halfway through. Then he’d always say, “You won’t believe what happened to me this morning!” and spend another five minutes regaling us with a story. One time he was late because a goth guy got his foot stuck in the subway door. Another time, there was a protest outside of a furrier. A neighborhood parade. A fight outside a bagel shop.
You see, Bob didn’t grow up in New York City. He didn’t know that these kinds of weird things happen all the time. While each occurrence was unique, overall, these were common occurrences. They always made him ten to fifteen minutes late. Even if he couldn’t predict what would happen, he could always count on something happening. As such, he should have planned the process of his morning commute to take fifteen minutes longer than expected. Every. Single. Day. And every single day, the team would lose fifteen to twenty minutes of our thirty-minute meeting between waiting on Bob to show up and hearing the latest edition of “You won’t believe this . . . .” Firing Bob was out of the question; he was simply too valuable. And he was a hard worker; he spent hours of his own time at home working on problems and coding issues. There was no way I was going to chew him out for being a few minutes late every morning.
If I’d been smarter, I would have realized I was dealing with a common-cause problem: I could always count on Bob being ten to fifteen minutes late . . . so, why didn’t I just start our morning meeting at 8:15? Neither Bob nor I understood that his being late was common-cause variation—something normal in day-to-day commuting in the Big Apple, regardless of Bob’s exotic story du jour.
Here’s another example of misidentifying variation. The manager of a datacenter misidentified a special case as something common to the system. The datacenter was built out in the backwoods of an area that rarely experienced snow. Well, one day, there was a freak snowstorm. It was so bad that no cars could get on the road. None of the center’s call and operations staff could get to work that day. The datacenter’s functions were limited, and the company lost a lot of money.
The odds of this happening again were quite slim. However, the datacenter manager decided that he would never let this happen again and mandated that all new hires who were part of the call and operations staff had to live within a mile of the datacenter. He misidentified special-cause variation as something that should be predictable (i.e., common-cause variation).
Firing Bob would have been applying special-cause logic to a common-cause situation. However, the datacenter manager requiring all new hires to live within a mile of the place was applying common-cause logic to a special-cause situation.
According to Shewhart’s Statistical Process Control, managers shouldn’t waste their time trying to fix every single problem. Instead, they should identify which ones can be predicted and fix them. Identify the ones that will likely never happen again and don’t make knee-jerk decisions. As a result, managers can spend their time on things they can control and waste very little of their time on things they can’t.
When Ed was introduced to Shewhart years later, after he had left Hawthorne, the lightbulb went off. Ed already knew variation was a fact of life; as a mathematical physicist, he was intimately familiar with thinking in statistics and probabilities. Shewhart’s insights quantified this variability in manufacturing.
Since the randomness could be predicted, Ed understood that defects weren’t due to the workers but to how the manufacturing process was designed and operated. As we’ll see, this understanding—that workers weren’t the problem in any given system—would go on to become one of Ed’s most important ideas (the Theory of Variation) in Profound Knowledge. But that wouldn’t be for many years to come.
Deming’s Journey to Profound Knowledge - How Deming Helped Win a War, Altered the Face of Industry, and Holds the Key to Our Future - Part 1 - Chapter 5: Pragmatist
Shewhart imparted one more foundational concept to Ed: the philosophy of pragmatism, what Deming would later call the Theory of Knowledge, the first element in the System of Profound Knowledge. And this uniquely American philosophy began with an unliked and unlikely character who just wanted to measure things. This character was fascinated with a problem that had plagued humanity almost ever since Lucy began making tools: the fundamental challenge of accuracy and standardization in measurement, as well as the tension between the pursuit of perfection and the practical.
Le Grand K. Sounds like the stage name of a French rap artist. It is a stage name of sorts, not for a person but a piece of platinum-iridium sitting in three vacuum-sealed bell jars in an environmentally controlled, underground, triple-locked vault outside of Paris since 1879. France’s version of Fort Knox, if you will. It’s so precious that it’s been “out” only three times: 1899, 1939, and 1988.
This antique oddity is dense, twice as dense as lead. The golf ball–sized cylinder is incredibly strong and will never rust. Despite its hardiness, it’s handled with kid gloves. Well, better, actually: its guardians are too afraid to touch it even using gloves. They use a special tool wrapped in filter paper to avoid even the most minute of scratches. Perish the thought of a fingerprint!
Such care is usually only given to highly valuable works of art, like a Fabergé egg. Not even the British crown jewels get this kind of treatment. Curiously, this unremarkable piece of metal is worth only about $43,000. What gives? Why does this smooth little alloy cylinder warrant so much security and care? What’s its significance? Perhaps at risk of sounding melodramatic, this piece of metal is how we measure the world. Or most of it, anyway.
Starting in 1879, this small metallic chunk has served as the international prototype of the kilogram. It doesn’t weigh a kilogram; rather, a kilogram is whatever it weighs. There are six official copies; those, plus the original, are stored at the headquarters of the International Bureau of Weights and Measures in Saint-Cloud, France. Forty replicas were created in 1884 and distributed to a handful of nations. The US, for example, received two (named K4 and K20). All the official measurements for anything measured in kilograms in each country are calibrated to these copies of Le Grand K. Pharmaceutical scales, aerospace calibrators, surgical equipment, you name it: they are all derived from the same international standard. Every forty or so years, the copies are flown to Saint-Cloud to be compared to Le Grand K and recertified as being exactly one kilogram.
The problem is that a kilogram isn’t always a kilogram. Like my own diets, K20 initially “lost” weight in the recertification of 1948 but by its next weigh-in had gained it all back. (Unlike my diets, the differences were measured in micrograms.) K4, like those I envy, consistently lost weight with every check-in. Some of this ebb and flow had to do with differences in how the weights were cleaned and stored, and even how they absorbed atoms floating in the air.
All jokes aside, this was a serious problem. If I’m flying in a thirty-year-old airplane, I want to know that the calipers the engineers used to design the plane are the exact same size as the ones the maintenance people used to service the plane this morning. We can’t have wandering measurements, and many of our global measurements are derived from the kilogram (such as the newton and ampere) as well as derivatives of those measures (such as the pascal and joule) and those measures’ measures (such as the watt, volt, tesla, and lumen).
Imagine how much worse it was before scientists began to standardize measurements. A “foot” was the length of a human foot. Can you imagine King James—LeBron, that is—asking a cobbler for a pair of shoes one foot long?
Everyone used different measures. In the ninth century, Charlemagne decided his foot would be the standard foot everyone in his empire should use. In the twelfth century, King Henry I declared a foot to be one-third the length of his own arm. His arm was thirty-six inches; thus, the twelve-inch foot still used by the US, Myanmar, and Liberia, the only countries adhering to the imperial system. In the thirteenth century, King Edward II decided that three grains of barley, or a “barleycorn,” would be used for shoe size measurement, with thirty-six barleycorns equaling one foot.
So many local and regional measures were used that at one point there were something like a quarter-million different measures of length, weight, etc. A pound of lead in one part of the world could be lighter than a pound of lead in another.
The Metre Convention on May 20, 1875, established the metric system, including the kilogram (cue the first Le Grand K) and the meter. The thinkers behind it wanted a system of measurements “for all times; for all people.” The Treaty of the Meter defined a meter as one ten-millionth of the distance between the North Pole and the equator. Seeing as how that was a somewhat inconvenient thing to measure, the more practically minded thinkers agreed on a handy substitute: a meter would be defined as the cord length needed for a clock’s pendulum to travel one swing per second. This was about as close to a universal constant as they could agree on.
But no matter how hard you try to get the perfect measurement, there’s always going to be some troublemaker who comes along and picks holes in it.
In this case, Charles Sanders Peirce. C. S. Peirce was a bona fide member of the Boston Brahmins, the elite of society. It was said that before immigrating to America, the Brahmins sent their servants ahead on the Mayflower to prepare the summer cottage—a blue blood in every sense of the term. He was, however, brilliant.
He was particularly fascinated by weather (he would be employed off and on with the US Weather Service) and precision measurements—pendulums, in particular. He recognized early on that the length of pendulum needed to swing a two-second cycle varied from place to place, depending on local variations in the Earth’s gravity.
He became obsessed with improving pendulums’ precision. But if pendulums everywhere in the world needed to be of slightly different lengths for a uniform cycle, then a meter in Britain wouldn’t equal a meter in Boston. Not precisely. In 1872, Peirce, along with the famous Supreme Court Justice Oliver Wendell Holmes and others, founded the Metaphysical Club to discuss philosophy. These budding philosophers rejected the deterministic worldview of their Enlightenment-minded European counterparts. Peirce called their new idea pragmatism.
A proponent of Peirce’s, a man by the name of C. I. Lewis (not to be confused with C. S. Lewis of Narnian fame), authored a book called Mind and the World-Order. The ideas it contained were fundamental to Shewhart’s work. European philosophy was rooted in the thinking of Aristotle and the Enlightenment: you could know something without needing evidence to prove it. Descartes wrote, “I think; therefore, I am.” He knew something without needing any kind of outside validation or evidence—a priori knowledge. Another example would be knowing that one plus one equals two. Philosophers don’t need to run an experiment to know the answer is two; they just do.
A posteriori knowledge is when you know something because, and only because, you have the evidence to prove it. A priori thinking would say, “Add a gallon of milk to a gallon of milk and you’ll have two gallons of milk.” Peirce’s new branch of philosophy would say, “I know I have two gallons of milk because I added one gallon of milk to a gallon I already had.” One theorizes; the other experiments and observes.
A priori thinkers believed they could reason their way out of anything inside their own heads; a posteriori thinkers, or “pragmatists,” believed they could reason their way through something only by doing it. Put another way: They believed experience was the best teacher.
With his pendulums, for example, Peirce realized it was possible to create a perfectly precise pendulum. Theoretically. Practically speaking, there was a point where creating an ever-better pendulum simply wasn’t worth it.
Mathematically, he had reached a point of diminishing returns: investing more time and money simply didn’t make any sense. Philosophically, he reasoned that using a pendulum’s swing as the basic unit of measurement simply wasn’t practical.
That’s why in 1877, just two years after the world’s top minds agreed on using grandfather clocks as the basis for all physical measurements of distance in the universe, Peirce wrote that the meter should be tied to an unalterable, absolute unit of measurement: a certain number of wavelengths of light at a certain frequency.
His ideas were taken up and improved upon by Albert Michelson, who would win the 1907 Nobel Prize in Physics for measuring the prototype of a meter to within one-tenth of a wavelength. In 1983, the successors of the Treaty of the Meter redefined the meter in terms of the speed of light. Then in 2019, one hundred and forty-four years to the day after the Metre Convention was signed, they redefined the kilogram from being measured by an expensive paperweight to being a derivative of Planck’s universal constant.
The story of accuracy and standardization is, in reality, a story of pragmatism. At any given time, a standard is a measurement that suffices and that everybody agrees upon. Even Planck’s universal constant is a pragmatic approximation of the true value. The idea of a pragmatic approach to standards coupled with the accuracy of measurement (described in Chapters 3 and 4) is a great example of why Profound Knowledge is so revelatory. If we focused only on the Theory of Variation without applying the Theory of Knowledge, we’d miss the opportunity to improve upon standards (Peirce, for example, finding a better way to measure the meter).
Jazz may be America’s one truly original musical art form. Pragmatism is America’s one truly original contribution to philosophy.
Consider the number pi. We learn the basic concept in high school. It’s simply a circle’s circumference divided by its diameter. When we do the division, however, we quickly discover that it just keeps going and going. Our teachers tell us to round it to two decimal places and simply use 3.14.
A pragmatist says, “Okay, 3.14 is good enough for what we’re doing here—it doesn’t make sense to chase the exact value. We’ve got too many other things to do.” While the determinist would spend his life calculating ever more digits of pi, the pragmatist says, “I can’t spend my life doing a math problem—I’ve got real problems to solve. Two decimal places is close enough.”
While this simple example sounds like common sense, most people are still stuck in the absolute deterministic approach. They believe that perfection can—and should!—be reached.
Pragmatists begin with observations and empirical data—in other words, hard evidence—then work their way backward. This might sound simple, but it’s actually quite rare for people to think like this. Most rely on “common knowledge,” knee-jerk reactions, and going along with “the way we’ve always done it around here.”
In Taylor’s and Ford’s time, quality inspection was essentially a question of whether a product was close enough to perfect to pass. But Shewhart, bringing a pragmatic outlook, asked, why wasn’t it perfect? In so many words, he said, “Look, perfection is an illusion. We’ll never reach it. But if we use a posteriori thinking, we can systematically improve how we manage what we manufacture.”
Let me provide a modern example of the perfection mentality. There was a time when the owners and managers of banks, insurance companies, and major retailers wanted their websites to have 100% uptime; they didn’t want their websites to be down. Ever. That may have been the goal, but the reality was too chaotic to adhere to corporate policy. Well, if the suits couldn’t have 100%, how about 99%? Or 99.9%? Could the IT guys get it up to 99.99%? Of course, five nines would be even better! Just how many extra nines could the IT department get to? This went on for years (and still goes on today).
Then Google published a book called Site Reliability Engineering. It shed light on how Google had grown by gathering data from an exponential explosion in the number of websites in the world. Back when Yahoo! launched, there were about three thousand sites. Right now, that number is in the ten figures. How did Google manage its systems to collect all that data? They realized that absolute reliability was impossible and abandoned the goal of 100% uptime. Instead, they asked themselves two simple questions:
- How much more money would we make with an extra 9?
- Does adding that extra 9 cost more than we would make?
Because Google makes its money primarily from advertising, it could calculate how much money it lost during a web outage down to a fraction of a cent. As such, they could calculate the company’s pragmatic limit. For the sake of simplicity, let’s say Google makes $10 million a month from selling ads. That’s roughly $240 a minute. For this example, let’s further say they have 99.9% reliability, meaning their website is down for only 43 minutes a month. That means they lose $10,320 a month.
A new manager takes over and says that’s not good enough. He needs to brag to his golfing buddies that his systems have 99.99% reliability. That would equal only 4.3 minutes of downtime a month, cutting the monthly loss to about $1,032. So, how much would that extra nine cost? If going from 99.9% to 99.99% uptime cost only $10,000 a month, the roughly $9,300 in recovered revenue would nearly pay for it. In that case, Google might well invest the time and resources.
However, if the cost of that extra nine was $100,000 a month, they would lose more than $90,000 a month on the trade. In this scenario, Google would not pursue 99.99% reliability. It’s simply not economical to make their website more reliable than it already is.
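Here’s a hedged sketch of that “extra nine” arithmetic, using the invented figures from the example above (including the rounded $240-per-minute number):

```python
# The "extra nine" arithmetic, with the invented figures from above.
MINUTES_PER_MONTH = 30 * 24 * 60   # 43,200 minutes in a 30-day month
REVENUE_PER_MINUTE = 240           # rounded from $10M/month in ad revenue

def monthly_loss(availability: float) -> float:
    """Ad revenue lost to downtime at a given availability (e.g., 0.999)."""
    downtime_minutes = MINUTES_PER_MONTH * (1 - availability)
    return downtime_minutes * REVENUE_PER_MINUTE

recovered = monthly_loss(0.999) - monthly_loss(0.9999)  # value of the extra 9
for cost_of_extra_nine in (10_000, 100_000):             # invented costs
    net = recovered - cost_of_extra_nine
    print(f"extra nine costing ${cost_of_extra_nine:,}/mo -> net ${net:,.0f}/mo")
```

With these made-up numbers, the first extra nine roughly breaks even and the second is a clear loss, which is exactly the pragmatic limit this chapter keeps circling back to.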
Peirce realized that creating the perfect pendulum was a never-ending quest. Google realized that creating a perfect service was a never-ending goal. Shewhart realized that creating a perfect manufacturing process was a never-ending process.
You have to find the pragmatic limits.
Let’s go back to C. I. Lewis’s two types of knowledge: a priori (drawing your conclusions beforehand) and a posteriori (drawing conclusions after the fact). Pragmatists dismiss a priori thinkers. In their view, you can’t know anything without starting first with some evidence.
This ties neatly into non-determinism: You can’t be certain the apple will hit the ground until you drop it and it hits the ground. Pragmatists wouldn’t assume that Schrödinger’s cat was dead or alive at all; they’d simply open the box. Let’s look at another instance of a priori thinking with a typical survey in my IT community of DevOps. An annual survey might send out a multiple-choice question such as, “On average, how often do you deliver software?”
The choices might be:
- once a year
- once a month
- once a week
- once a day
- once an hour
Before getting the answers back, the analysts behind the survey have already concluded that organizations delivering once an hour or once a day should be categorized as “high performers.” Those delivering once a week or once a month are “medium performers.” Those delivering once a year are “low performers.”
The analysts draw their conclusions before even collecting the data. That’s a priori thinking.
However, a “low-performing” team might deploy only once a year because that’s the only window when their space probe is within line of sight. That doesn’t make them low performing. You simply can’t assume you know the answers ahead of time.
A pragmatist, on the other hand, might ask the same question: “On average, how often do you deliver software?” But instead of predetermining the responses, they might engage in a question-and-answer type of feedback:
- Respondent 1: “Every day, but it’s hard to do on Mondays and Fridays.”
- Analyst: “Why is it hard on those days?”
- Respondent 2: “Whenever it’s ready to be shipped.”
- Analyst: “Do you need approval to deliver it?”
- Respondent 3: “Only when my manager tells me I can.”
- Analyst: “And when is that?”
The pragmatic analyst would collect the data and then draw conclusions about which software teams were high, medium, or low performers. That is, a posteriori, or after the fact.
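As a rough illustration, here’s how the two mindsets look in code. The responses, thresholds, and tier names are invented for this sketch, not taken from any real survey:

```python
from statistics import quantiles

# Hypothetical survey responses: deployments per month for ten teams.
deploys_per_month = [30, 22, 4, 1, 0.08, 60, 12, 0.08, 45, 2]

# A priori: the buckets were fixed before a single response came back.
def a_priori_tier(deploys):
    if deploys >= 30:   # once a day or more -> "high performer"
        return "high"
    if deploys >= 1:    # weekly to monthly  -> "medium performer"
        return "medium"
    return "low"        # about once a year  -> "low performer"

# A posteriori: the cut points are derived from the data actually observed.
low_cut, high_cut = quantiles(deploys_per_month, n=3)  # tertile boundaries

def a_posteriori_tier(deploys):
    if deploys > high_cut:
        return "high"
    if deploys > low_cut:
        return "medium"
    return "low"

for d in deploys_per_month:
    print(f"{d:>6} deploys/month: a priori {a_priori_tier(d)}, "
          f"a posteriori {a_posteriori_tier(d)}")
```

Even the a posteriori version only re-derives the buckets from the data that came back; a true pragmatist would also ask the follow-up questions above before filing the once-a-year space-probe team under “low performer.”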
When Shewhart introduced Deming to the philosophy of pragmatism in 1927, the young protégé was amply prepared to receive it. Non-determinism had shown Ed that what “everyone” knew and accepted was either more nuanced or simply wrong. The only reliable source of knowledge came from empirical evidence. Pragmatism simply made sense. Ever thereafter, what Ed taught never came from tradition or hearsay but from hard facts and careful study.
The Shewhart Cycle & the Deming Wheel
Walter Shewhart used the philosophy of pragmatism to completely rethink manufacturing. Most know the result of this today as the PDSA cycle (plan, do, study, act), what Ed called the Shewhart cycle. It has become the template to improve virtually every type of system or process:
- First, gather evidence to create a hypothesis: What needs to change?
- Second, make the change.
- Third, review what happened: Is the process better or worse? Why?
- Last, decide where to go from here: Revert to before? Iterate further?
By continually gathering data, you can continually improve your process. My software developer, Bob, didn’t have this mentality. If he had, he would have experimented with his process of getting to work. Perhaps he could have planned to catch an earlier train. He could have experimented with taking his breakfast to go instead of buying it at the train station. Maybe by laying his clothes out the night before, he’d be able to leave his apartment sooner. He could have continually experimented with all these variables to see if they helped shave time off his commute so he could get to work closer to 8:00 a.m. instead of 8:15 a.m.
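For the programmers reading along, here’s a minimal sketch of the loop Bob never ran. The changes and their effects are invented for illustration; what matters is the shape of the cycle: plan a change, do it, study the measurement, act on the result.

```python
import random

BASELINE = 75  # Bob's usual door-to-desk commute, in minutes

# Invented effects of each change, in minutes saved (Bob doesn't know these).
EFFECTS = {
    "catch the earlier train": -10,
    "take breakfast to go": -5,
    "lay clothes out the night before": -3,
}

kept_changes = []

def run_for_a_week(changes):
    """Do: live with these changes for a week and measure the commute.
    (Simulated here; Bob would use a watch and a notebook.)"""
    noise = random.uniform(-2, 2)  # commutes vary day to day
    return BASELINE + sum(EFFECTS[c] for c in changes) + noise

current = run_for_a_week(kept_changes)
for change in EFFECTS:
    # Plan: hypothesize that adding this change shortens the commute.
    trial = run_for_a_week(kept_changes + [change])  # Do
    if trial < current:                              # Study: better or worse?
        kept_changes.append(change)                  # Act: keep the change
        current = trial
    # Otherwise, Act: revert and try the next idea.

print(f"kept: {kept_changes}; commute is now about {current:.0f} minutes")
```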
Now, think of this on the scale of mass-manufacturing. There were one hundred thousand different parts and components to a Hawthorne Works telephone assemblage. Shewhart formalized a way to continually make incremental changes. Perhaps the managers found that replacing a machine’s rubber gasket every month instead of every other month reduced the number of no-gos produced. Or maybe a certain component made on a certain line got slightly misaligned over time and needed to be checked more frequently than on other lines.
Think about the shift in mentality. Before PDSA, the managers at Hawthorne Works would look at the huge pile of scrap at the end of the day and say, “It is what it is. Sure wish we could find better workers.” With PDSA, managers had a formal tool to help them track bad quality, make a change, and see if it improved product quality. They had a way to make Hawthorne Works better and better over time. They no longer made changes based on gut instinct; they had actual numbers to guide them.
But Shewhart thought only in terms of manufacturing.
It would be Deming who saw the bigger picture.