4.5 How to live with systems
In her essay “Dancing with Systems”, Donella Meadows, whom we met in the "Introduction to Planetary Well-being" course, shares her experiences and lessons learned as a systems scientist. She warns against the naïve idea, which she herself held at first, that systems analysis and modelling allow systems to be predicted and controlled. As young researchers, she and her fellow scientists overestimated their ability to change the world. They did not intend to mislead others, however; they truly hoped and believed that systems thinking would enable them to make systems work.
But self-organizing, nonlinear, feedback systems are inherently unpredictable. They are not controllable. They are understandable only in the most general way. The goal of foreseeing the future exactly and preparing for it perfectly is unrealizable. The idea of making a complex system do just what you want it to do can be achieved only temporarily, at best. We can never fully understand our world, not in the way our reductionistic science has led us to expect. Our science itself, from quantum theory to the mathematics of chaos, leads us into irreducible uncertainty. For any objective other than the most trivial, we can’t optimize; we don’t even know what to optimize. We can’t keep track of everything. We can’t find a proper, sustainable relationship to nature, each other, or the institutions we create, if we try to do it from the role of omniscient conqueror.
For those who stake their identity on the role of omniscient conqueror, the uncertainty exposed by systems thinking is hard to take. If you can’t understand, predict, and control, what is there to do?
Systems thinking leads to another conclusion, however–waiting, shining, obvious as soon as we stop being blinded by the illusion of control. It says that there is plenty to do, of a different sort of “doing.” The future can’t be predicted, but it can be envisioned and brought lovingly into being. Systems can’t be controlled, but they can be designed and redesigned. We can’t surge forward with certainty into a world of no surprises, but we can expect surprises and learn from them and even profit from them. We can’t impose our will upon a system. We can listen to what the system tells us, and discover how its properties and our values can work together to bring forth something much better than could ever be produced by our will alone.
We can’t control systems or figure them out. But we can dance with them!
I already knew that, in a way, before I began to study systems. I had learned about dancing with great powers from whitewater kayaking, from gardening, from playing music, from skiing. All those endeavors require one to stay wide-awake, pay close attention, participate flat out, and respond to feedback. It had never occurred to me that those same requirements might apply to intellectual work, to management, to government, to getting along with people.
But there it was, the message emerging from every computer model we made. Living successfully in a world of systems requires more of us than our ability to calculate. It requires our full humanity–our rationality, our ability to sort out truth from falsehood, our intuition, our compassion, our vision, and our morality.
Meadows compiled a list of the most important lessons of "systems wisdom" that she and others who have worked with systems for a long time have consciously or unconsciously adopted.
1. Get the beat.
Before you disturb the system in any way, watch how it behaves. If it’s a piece of music or a whitewater rapid or a fluctuation in a commodity price, study its beat. If it’s a social system, watch it work. Learn its history. Ask people who’ve been around a long time to tell you what has happened. If possible, find or make a time graph of actual data from the system. People’s memories are not always reliable when it comes to timing.
Starting with the behavior of the system forces you to focus on facts, not theories. It keeps you from falling too quickly into your own beliefs or misconceptions, or those of others. It’s amazing how many misconceptions there can be. People will swear that rainfall is decreasing, say, but when you look at the data, you find that what is really happening is that variability is increasing–the droughts are deeper, but the floods are greater too. [- - -]
Starting with the behavior of the system directs one’s thoughts to dynamic, not static, analysis–not only to “what’s wrong?” but also to “how did we get there?” and “what behavior modes are possible?” and “if we don’t change direction, where are we going to end up?”
And finally, starting with history discourages the common and distracting tendency we all have to define a problem not by the system’s actual behavior, but by the lack of our favorite solution.
Examples of defining the problem as the lack of a preferred solution could be "The problem is that more electric cars are needed" or "The problem is how to attract more people to the city".
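Meadows' rainfall example can be checked directly against data. The sketch below uses invented illustrative numbers (not real measurements) to show how comparing the mean and the variability of the two halves of a series can separate "rainfall is decreasing" from "variability is increasing":

```python
# Hypothetical annual rainfall series in mm -- illustrative data, not real.
rainfall = [800, 620, 950, 560, 1020, 500, 1100, 450, 1180, 400]

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

first, second = rainfall[:5], rainfall[5:]
# The mean barely moves, but the spread grows sharply: deeper droughts
# AND bigger floods, just as Meadows describes.
print("mean:", mean(first), "->", mean(second))
print("variance:", round(variance(first)), "->", round(variance(second)))
```

Looking only at the dry years here would "prove" that rainfall is collapsing; the summary statistics tell a different story.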
2. Listen to the wisdom of the system.
Aid and encourage the forces and structures that help the system run itself. Don’t be an unthinking intervener and destroy the system’s own self-maintenance capacities. Before you charge in to make things better, pay attention to the value of what’s already there.
For example, small local entrepreneurs may have ready-made networks and practices that, if given the right support, can help them create a thriving business. Job creation does not necessarily require large industrial investments that may destroy existing businesses.
3. Expose your mental models to the open air.
Remember, always, that everything you know, and everything everyone knows, is only a model. Get your model out there where it can be shot at. Invite others to challenge your assumptions and add their own. Instead of becoming a champion for one possible explanation or hypothesis or model, collect as many as possible. Consider all of them plausible until you find some evidence that causes you to rule one out. That way you will be emotionally able to see the evidence that rules out an assumption with which you might have confused your own identity.
You don’t have to put forth your mental model with diagrams and equations, though that’s a good discipline. You can do it with words or lists or pictures or arrows showing what you think is connected to what. The more you do that, in any form, the clearer your thinking will become, the faster you will admit your uncertainties and correct your mistakes, and the more flexible you will learn to be. Mental flexibility–the willingness to redraw boundaries, to notice that a system has shifted into a new mode, to see how to redesign structure — is a necessity when you live in a world of flexible systems.
4. Stay humble. Stay a learner.
Even the best scientists do not fully understand the world and how it works.
The thing to do, when you don’t know, is not to bluff and not to freeze, but to learn. The way you learn is by experiment–or, as Buckminster Fuller put it, by trial and error, error, error. In a world of complex systems it is not appropriate to charge forward with rigid, undeviating directives. “Stay the course” is only a good idea if you’re sure you’re on course. Pretending you’re in control even when you aren’t is a recipe not only for mistakes, but for not learning from mistakes. What’s appropriate when you’re learning is small steps, constant monitoring, and a willingness to change course as you find out more about where it’s leading.
That’s hard. It means making mistakes and, worse, admitting them. It means what psychologist Don Michael calls “error-embracing.” It takes a lot of courage to embrace your errors.
5. Honor and protect information.
A decision maker can’t respond to information he or she doesn’t have, can’t respond accurately to information that is inaccurate, can’t respond in a timely way to information that is late. I would guess that 99 percent of what goes wrong in systems goes wrong because of faulty or missing information.
If I could, I would add an Eleventh Commandment: Thou shalt not distort, delay, or sequester information. You can drive a system crazy by muddying its information streams. You can make a system work better with surprising ease if you can give it more timely, more accurate, more complete information.
Meadows gives the example of the US Toxics Release Inventory, a reporting requirement that made chemical releases from industrial plants public and led local newspapers to publish lists of the biggest polluters in the area. As a result, chemical emissions nationwide were reduced by 40% in two years, without any litigation or new demands for emission reductions.
6. Locate responsibility in the system.
Look for the ways the system creates its own behavior. Do pay attention to the triggering events, the outside influences that bring forth one kind of behavior from the system rather than another. Sometimes those outside events can be controlled (as in reducing the pathogens in drinking water to keep down the incidence of infectious disease). But sometimes they can’t. And sometimes blaming or trying to control the outside influence blinds one to the easier task of increasing responsibility within the system.
“Intrinsic responsibility” means that the system is designed to send feedback about the consequences of decision-making directly and quickly and compellingly to the decision-makers.
Increasing intrinsic responsibility could mean, for example, individual apartment water meters, where each apartment pays directly for the hot water it uses, or requiring an industrial plant to take its service water from the downstream side of its own waste water pipe.
7. Make feedback policies for feedback systems.
You can imagine why a dynamic, self-adjusting system cannot be governed by a static, unbending policy. It’s easier, more effective, and usually much cheaper to design policies that change depending on the state of the system. Especially where there are great uncertainties, the best policies not only contain feedback loops, but meta-feedback loops–loops that alter, correct, and expand loops. These are policies that design learning into the management process.
An example of this could be the EU's planned carbon tariffs, which, if implemented, would reduce the competitive advantage of companies operating outside the EU carbon trading scheme, thereby supporting companies operating within the EU and acting as an incentive for stricter emissions policies in non-EU countries.
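The difference between a static policy and a feedback policy can be sketched in a toy simulation. Everything here is assumed for illustration: the function names, the linear response of emissions to a tax, and the target and gain values. It is not a model of any real carbon-pricing scheme.

```python
# Toy sketch (all names and numbers assumed): a fixed policy vs. one that
# responds to the state of the system it is trying to steer.

def static_policy(emissions, tax):
    """Return the same tax no matter what the system does."""
    return tax

def feedback_policy(emissions, tax, target, gain=0.1):
    """Raise the tax when emissions exceed the target, lower it when below."""
    return max(0.0, tax + gain * (emissions - target))

def simulate(use_feedback, steps=50, target=60.0):
    emissions, tax = 100.0, 0.0
    for _ in range(steps):
        if use_feedback:
            tax = feedback_policy(emissions, tax, target)
        else:
            tax = static_policy(emissions, tax)
        # Toy response: emissions fall linearly as the tax rises.
        emissions = 100.0 - 2.0 * tax
    return emissions

print(simulate(use_feedback=False))  # stuck at 100.0
print(simulate(use_feedback=True))   # settles near the 60.0 target
```

A meta-feedback loop in Meadows' sense would go one level further, for example adjusting the `gain` itself when the policy is seen to over- or under-correct, so that the management process learns as well as reacts.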
8. Pay attention to what is important, not just what is quantifiable.
Our culture, obsessed with numbers, has given us the idea that what we can measure is more important than what we can’t measure. You can look around and make up your own mind about whether quantity or quality is the outstanding characteristic of the world in which you live.
If something is ugly, say so. If it is tacky, inappropriate, out of proportion, unsustainable, morally degrading, ecologically impoverishing, or humanly demeaning, don’t let it pass. Don’t be stopped by the “if you can’t define it and measure it, I don’t have to pay attention to it” ploy. No one can precisely define or measure justice, democracy, security, freedom, truth, or love. No one can precisely define or measure any value. But if no one speaks up for them, if systems aren’t designed to produce them, if we don’t speak about them and point toward their presence or absence, they will cease to exist.
9. Go for the good of the whole.
Don’t maximize parts of systems or subsystems while ignoring the whole. As Kenneth Boulding once said, “Don’t go to great trouble to optimize something that never should be done at all.” Aim to enhance total systems properties, such as creativity, stability, diversity, resilience, and sustainability–whether they are easily measured or not.
As you think about a system, spend part of your time from a vantage point that lets you see the whole system, not just the problem that may have drawn you to focus on the system to begin with. And realize that, especially in the short term, changes for the good of the whole may sometimes seem to be counter to the interests of a part of the system. It helps to remember that the parts of a system cannot survive without the whole. The long-term interests of your liver require the long-term health of your body, and the long-term interests of sawmills require the long-term health of forests.
10. Expand time horizons.
The official time horizon of industrial society doesn’t extend beyond what will happen after the next election or beyond the payback period of current investments. The time horizon of most families still extends farther than that–through the lifetimes of children or grandchildren. Many Native American cultures actively spoke of and considered in their decisions the effects upon the seventh generation to come. The longer the operant time horizon, the better the chances for survival.
In the strict systems sense there is no long-term/short-term distinction. Phenomena at different time-scales are nested within each other. Actions taken now have some immediate effects and some that radiate out for decades to come. We experience now the consequences of actions set in motion yesterday and decades ago and centuries ago.
When you’re walking along a tricky, curving, unknown, surprising, obstacle-strewn path, you’d be a fool to keep your head down and look just at the next step in front of you. You’d be equally a fool just to peer far ahead and never notice what’s immediately under your feet. You need to be watching both the short and the long term–the whole system.
11. Expand thought horizons.
Defy the disciplines. In spite of what you majored in, or what the textbooks say, or what you think you’re an expert at, follow a system wherever it leads. It will be sure to lead across traditional disciplinary lines. To understand that system, you will have to be able to learn from–while not being limited by–economists and chemists and psychologists and theologians. You will have to penetrate their jargons, integrate what they tell you, recognize what they can honestly see through their particular lenses, and discard the distortions that come from the narrowness and incompleteness of their lenses. They won’t make it easy for you.
Seeing systems whole requires more than being “interdisciplinary,” if that word means, as it usually does, putting together people from different disciplines and letting them talk past each other. Interdisciplinary communication works only if there is a real problem to be solved, and if the representatives from the various disciplines are more committed to solving the problem than to being academically correct. They will have to go into learning mode, to admit ignorance and be willing to be taught, by each other and by the system.
It can be done. It’s very exciting when it happens.
12. Expand the boundary of caring.
Living successfully in a world of complex systems means expanding not only time horizons and thought horizons; above all it means expanding the horizons of caring. There are moral reasons for doing that, of course. And if moral arguments are not sufficient, then systems thinking provides the practical reasons to back up the moral ones. The real system is interconnected. No part of the human race is separate either from other human beings or from the global ecosystem. It will not be possible in this integrated world for your heart to succeed if your lungs fail, or for your company to succeed if your workers fail, or for the rich in Los Angeles to succeed if the poor in Los Angeles fail, or for Europe to succeed if Africa fails, or for the global economy to succeed if the global environment fails.
As with everything else about systems, most people already know about the interconnections that make moral and practical rules turn out to be the same rules. They just have to bring themselves to believe that which they know.
13. Celebrate complexity.
Let’s face it, the universe is messy. It is nonlinear, turbulent and chaotic. It is dynamic. It spends its time in transient behavior on its way to somewhere else, not in mathematically neat equilibria. It self-organizes and evolves. It creates diversity, not uniformity. That’s what makes the world interesting, that’s what makes it beautiful, and that’s what makes it work.
There’s something within the human mind that is attracted to straight lines and not curves, to whole numbers and not fractions, to uniformity and not diversity, and to certainties and not mystery. But there is something else within us that has the opposite set of tendencies, since we ourselves evolved out of and are shaped by and structured as complex feedback systems. Only a part of us, a part that has emerged recently, designs buildings as boxes with uncompromising straight lines and flat surfaces. Another part of us recognizes instinctively that nature designs in fractals, with intriguing detail on every scale from the microscopic to the macroscopic. That part of us makes Gothic cathedrals and Persian carpets, symphonies and novels, Mardi Gras costumes and artificial intelligence programs, all with embellishments almost as complex as the ones we find in the world around us.
14. Hold fast to the goal of goodness.
Examples of bad human behavior are held up, magnified by the media, affirmed by the culture, as typical. Just what you would expect. After all, we’re only human. The far more numerous examples of human goodness are barely noticed. They are Not News. They are exceptions. Must have been a saint. Can’t expect everyone to behave like that.
And so expectations are lowered. The gap between desired behavior and actual behavior narrows. Fewer actions are taken to affirm and instill ideals. The public discourse is full of cynicism. Public leaders are visibly, unrepentantly, amoral or immoral and are not held to account. Idealism is ridiculed. Statements of moral belief are suspect. It is much easier to talk about hate in public than to talk about love.
If we allow past patterns of behavior to influence what we think is right or desirable, we will erode our goals and let the system slide towards poor performance. To prevent that degradation, we need to hold to the standards we have set, or better still, raise them by setting an example of best practice.
This is quite a list. Systems thinking can only tell us to do these things. It can’t do them for us.
And so we are brought to the gap between understanding and implementation. Systems thinking by itself cannot bridge that gap. But it can lead us to the edge of what analysis can do and then point beyond–to what can and must be done by the human spirit.