Thinking in Systems

Donella Meadows


Highly Recommend

The best resource on systems that I've read. The part on leverage points is especially useful. I ended up with a ton of notes on this one because of how good it was! Once you understand systems, everything becomes a system.


A system is an interconnected set of elements that is coherently organized in a way that achieves something. If you look at that definition closely for a minute, you can see that a system must consist of three kinds of things: elements, interconnections, and a function or purpose.

Some interconnections in systems are actual physical flows, such as the water in the tree’s trunk or the students progressing through a university. Many interconnections are flows of information—signals that go to decision points or action points within a system.

If a frog turns right and catches a fly, and then turns left and catches a fly, and then turns around backward and catches a fly, the purpose of the frog has to do not with turning left or right or backward but with catching flies. If a government proclaims its interest in protecting the environment but allocates little money or effort toward that goal, environmental protection is not, in fact, the government’s purpose. Purposes are deduced from behavior, not from rhetoric or stated goals.

The elements, the parts of systems we are most likely to notice, are often (not always) least important in defining the unique characteristics of the system—unless changing an element also results in changing relationships or purpose.

On top of that, when we do focus on flows, we tend to focus on inflows more easily than on outflows. Therefore, we sometimes miss seeing that we can fill a bathtub not only by increasing the inflow rate, but also by decreasing the outflow rate. Everyone understands that you can prolong the life of an oil-based economy by discovering new oil deposits. It seems to be harder to understand that the same result can be achieved by burning less oil.
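The bathtub point lends itself to a tiny simulation. A minimal sketch (all numbers invented) showing that raising the inflow and cutting the outflow fill the stock equally well:

```python
# Toy stock-and-flow model: a bathtub's water level is a stock that
# changes each step by (inflow - outflow). Numbers are illustrative only.

def simulate(level, inflow, outflow, steps):
    """Advance a stock by (inflow - outflow) each step, never below zero."""
    for _ in range(steps):
        level = max(0.0, level + inflow - outflow)
    return level

base = simulate(level=10, inflow=2, outflow=2, steps=20)      # steady state
more_in = simulate(level=10, inflow=3, outflow=2, steps=20)   # raise inflow
less_out = simulate(level=10, inflow=2, outflow=1, steps=20)  # cut outflow

# Raising inflow and cutting outflow both fill the tub by the same amount.
print(base, more_in, less_out)
```

This is the oil example in miniature: discovering new deposits raises the inflow, burning less oil cuts the outflow, and the stock responds identically.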

Water can’t run out the drain instantly, even if you open the drain all the way.

People often underestimate the inherent momentum of a stock. It takes a long time for populations to grow or stop growing, for wood to accumulate in a forest, for a reservoir to fill up, for a mine to be depleted. An economy cannot build up a large stock of functioning factories and highways and electric plants overnight, even if a lot of money is available.

If you have a sense of the rates of change of stocks, you don’t expect things to happen faster than they can happen. You don’t give up too soon. [examples: changes in health, career]

Balancing feedback loops are goal-seeking or stability-seeking. Each tries to keep a stock at a given value or within a range of values. A balancing feedback loop opposes whatever direction of change is imposed on the system. If you push a stock too far up, a balancing loop will try to pull it back down. If you shove it too far down, a balancing loop will try to bring it back up.

When someone tells you that population growth causes poverty, you’ll ask yourself how poverty may cause population growth. THINK ABOUT THIS: If A causes B, is it possible that B also causes A?

A Stock with Two Competing Balancing Loops—a Thermostat... Now, what happens when these two loops operate at the same time? Assuming that there is sufficient insulation and a properly sized furnace, the heating loop dominates the cooling loop.
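The two competing loops can be sketched directly. A minimal simulation with invented parameters (`furnace_gain` and `leak_rate` are hypothetical) in which the heating loop dominates the heat-loss loop:

```python
# Two competing balancing loops acting on one stock (room temperature):
# the furnace pushes temperature toward the thermostat setting while heat
# leaks out toward the outside temperature. Parameters are invented.

def thermostat(temp, setting=20.0, outside=0.0,
               furnace_gain=0.5, leak_rate=0.1, steps=200):
    for _ in range(steps):
        heating = furnace_gain * max(0.0, setting - temp)  # heating loop
        cooling = leak_rate * (temp - outside)             # heat-loss loop
        temp += heating - cooling
    return temp

final = thermostat(temp=10.0)
# The heating loop dominates, but the leak never fully goes away, so the
# room settles somewhat below the thermostat setting rather than at it.
print(round(final, 2))
```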

The information delivered by a feedback loop can only affect future behavior; it can’t deliver a signal fast enough to correct the behavior that drove the current feedback. A person in the system who makes a decision based on the feedback can’t change the behavior of the system that drove the current feedback; the decisions he or she makes will affect only future behavior. [General principle]

If you’re gearing up your work force to a higher level, you have to hire fast enough to correct for those who quit while you are hiring.

In other words, your mental model of the system needs to include all the important flows, or you will be surprised by the system’s behavior.

What happens when a reinforcing and a balancing loop are both pulling on the same stock? This is one of the most common and important system structures. Among other things, it describes every living population and every economy. A population has a reinforcing loop causing it to grow through its birth rate, and a balancing loop causing it to die off through its death rate. [Questions to ask to determine if the model represents reality:]

  • Are the driving factors likely to unfold this way? (What are birth rate and death rate likely to do?)
  • If they did, would the system react this way? (Do birth and death rates really cause the population stock to behave as we think it will?)
  • What is driving the driving factors? (What affects birth rate? What affects death rate?)
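The questions above can be tried against a toy model. A minimal sketch with hypothetical birth and death rates, showing that whichever loop is stronger dominates the stock's behavior:

```python
# A population stock with a reinforcing loop (births) pulling it up and a
# balancing loop (deaths) pulling it down. Rates are hypothetical.

def project(pop, birth_rate, death_rate, years):
    """Return the yearly history of the population stock."""
    history = [pop]
    for _ in range(years):
        pop += birth_rate * pop - death_rate * pop
        history.append(pop)
    return history

growing = project(1000, birth_rate=0.03, death_rate=0.01, years=10)
shrinking = project(1000, birth_rate=0.01, death_rate=0.03, years=10)

# Same structure, opposite behavior: the stronger loop wins.
print(round(growing[-1]), round(shrinking[-1]))
```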

One of the central insights of systems theory, as central as the observation that systems largely cause their own behavior, is that systems with similar feedback structures produce similar dynamic behaviors, even if the outward appearance of these systems is completely dissimilar.

[Limits-to-growth archetype: Growth in a constrained environment:] But any real physical entity is always surrounded by and exchanging things with its environment. A corporation needs a constant supply of energy and materials and workers and managers and customers. A growing corn crop needs water and nutrients and protection from pests. A population needs food and water and living space, and if it’s a human population, it needs jobs and education and health care and a multitude of other things.

What is not assumed to be constant is the yield of resource per unit of capital [i.e., diminishing returns per unit of capital]. Because this resource is not renewable, as in the case of oil, the stock feeding the extraction flow does not have an input. As the resource is extracted—as an oil well is depleted—the next barrel of oil becomes harder to get.

A quantity growing exponentially toward a constraint or limit reaches that limit in a surprisingly short time.
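The classic illustration is a doubling quantity. A minimal sketch (the pond's capacity is invented) of how late the limit becomes visible:

```python
# A lily patch that doubles daily in a pond that holds 2**30 pads.
# The limit arrives abruptly: the pond is half empty one day before full.

capacity = 2 ** 30
patch = 1
days_to_fill = 0
while patch < capacity:
    patch *= 2
    days_to_fill += 1

print(days_to_fill)      # 30 days to fill the pond...
print(days_to_fill - 1)  # ...yet it was only half full the day before
```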

The same behavior results, by the way, if prices don’t go up but if technology brings operating costs down—as has actually happened, for example, with advanced recovery techniques from oil wells.

Nonrenewable resources are stock-limited [oil in an oilfield]. The entire stock is available at once, and can be extracted at any rate (limited mainly by extraction capital). But since the stock is not renewed, the faster the extraction rate, the shorter the lifetime of the resource.

Renewable resources are flow-limited [fish in an ocean]. They can support extraction or harvest indefinitely, but only at a finite flow rate equal to their regeneration rate. If they are extracted faster than they regenerate, they may eventually be driven below a critical threshold and become, for all practical purposes, nonrenewable.
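The renewable-resource dynamic can be sketched with a logistic regeneration term. All numbers are invented; the point is only the threshold behavior, where harvest below the peak regeneration rate is sustainable and harvest above it collapses the stock:

```python
# A flow-limited renewable resource: a fish stock that regenerates
# logistically. Harvesting faster than it can regenerate drives the
# stock to collapse. All parameter values are illustrative.

def fishery(stock, harvest, growth=0.2, capacity=1000.0, years=50):
    for _ in range(years):
        regen = growth * stock * (1 - stock / capacity)  # logistic regrowth
        stock = max(0.0, stock + regen - harvest)
    return stock

# Peak regeneration here is 50/year (at stock = 500), so:
sustainable = fishery(stock=500.0, harvest=40.0)  # below the limit: persists
overfished = fishery(stock=500.0, harvest=60.0)   # above the limit: collapses
print(round(sustainable), round(overfished))
```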

If the land mechanism as a whole is good, then every part is good, whether we understand it or not. If the biota, in the course of aeons, has built something we like but do not understand, then who but a fool would discard seemingly useless parts? To keep every cog and wheel is the first precaution of intelligent tinkering. —Aldo Leopold, forester

Why do systems work so well? Consider the properties of highly functional systems—machines or human communities or ecosystems—which are familiar to you. Chances are good that you may have observed one of three characteristics: resilience, self-organization, or hierarchy.

For our purposes, the normal dictionary meaning will do: “the ability to bounce or spring back into shape, position, etc., after being pressed or stretched. Elasticity. The ability to recover strength, spirits, good humor, or any other aspect quickly.” Resilience is a measure of a system’s ability to survive and persist within a variable environment. The opposite of resilience is brittleness or rigidity.

Resilience arises from a rich structure of many feedback loops that can work in different ways to restore a system even after a large perturbation.

Resilience is something that may be very hard to see, unless you exceed its limits, overwhelm and damage the balancing loops, and the system structure breaks down. Because resilience may not be obvious without a whole-system view, people often sacrifice resilience for stability, or for productivity, or for some other more immediately recognizable system property.

Systems need to be managed not only for productivity or stability, they also need to be managed for resilience—the ability to recover from perturbation, the ability to restore or repair themselves. [Talebian vibes]

This capacity of a system to make its own structure more complex is called self-organization.

Like resilience, self-organization is often sacrificed for purposes of short-term productivity and stability. Productivity and stability are the usual excuses for turning creative human beings into mechanical adjuncts to production processes.

These conditions that encourage self-organization often can be scary for individuals and threatening to power structures. As a consequence, education systems may restrict the creative powers of children instead of stimulating those powers.

A cell in your liver is a subsystem of an organ, which is a subsystem of you as an organism, and you are a subsystem of a family, an athletic team, a musical group, and so forth. These groups are subsystems of a town or city, and then a nation, and then the whole global socioeconomic system that dwells within the biosphere system. This arrangement of systems and subsystems is called a hierarchy.

If you have a liver disease, for example, a doctor usually can treat it without paying much attention to your heart or your tonsils (to stay on the same hierarchical level) or your personality (to move up a level or two) or the DNA in the nuclei of the liver cells (to move down several levels). There are just enough exceptions to that rule, however, to reinforce the necessity of stepping back to consider the whole hierarchy. Maybe your job exposes you to a chemical that is damaging your liver. Maybe the disease originates in a malfunction of the DNA.

When a subsystem’s goals dominate at the expense of the total system’s goals, the resulting behavior is called suboptimization.

Everything we think we know about the world is a model.

Our models usually have a strong congruence with the world. However, and conversely, our models fall far short of representing the world fully. That is why we make mistakes and why we are regularly surprised.

You can’t navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities and delays. You are likely to mistreat, misdesign, or misread systems if you don’t respect their properties of resilience, self-organization, and hierarchy.

Like the tip of an iceberg rising above the water, events are the most visible aspect of a larger complex—but not always the most important.

When a systems thinker encounters a problem, the first thing he or she does is look for data, time graphs, the history of the system. That’s because long term behavior provides clues to the underlying system structure. And structure is the key to understanding not just what is happening, but why.

Listen to every night’s explanation of why the stock market did what it did. Stocks went up (down) because the U.S. dollar fell (rose), or the prime interest rate rose (fell), or the Democrats won (lost), or one country invaded another (or didn’t). Event-event analysis.

Economic news reports on the national production (flow) of goods and services, the GNP, rather than the total physical capital (stock) of the nation’s factories and farms and businesses that produce those goods and services. But without seeing how stocks affect their related flows through feedback processes, one cannot understand the dynamics of economic systems or the reasons for their behavior.

There’s no reason to expect any flow to bear a stable relationship to any other flow. Flows go up and down, on and off, in all sorts of combinations, in response to stocks, not to other flows.

[Example of above] As the flow of traffic on a highway increases, car speed is affected only slightly over a large range of car density. Eventually, however, small further increases in density produce a rapid drop-off in speed. And when the number of cars on the highway builds up to a certain point, it can result in a traffic jam, and car speed drops to zero.

It’s a great art to remember that boundaries are of our own making, and that they can and should be reconsidered for each new discussion, problem, or purpose.

Rich countries transfer capital or technology to poor ones and wonder why the economies of the receiving countries still don’t develop, never thinking that capital or technology may not be the most limiting factors.

As the economy grows relative to the ecosystem, however, and the limiting factors shift to clean water, clean air, dump space, and acceptable forms of energy and raw materials, the traditional focus on only capital and labor becomes increasingly unhelpful.

Any physical entity with multiple inputs and outputs is surrounded by layers of limits.

Delays are ubiquitous in systems.

When there are long delays in feedback loops, some sort of foresight is essential. To act only when a problem becomes obvious is to miss an important opportunity to solve the problem.

Bounded rationality means that people make quite reasonable decisions based on the information they have. But they don’t have perfect information, especially about more distant parts of the system.

Economic theory as derived from Adam Smith assumes first that homo economicus acts with perfect optimality on complete information, and second that when many of the species homo economicus do that, their actions add up to the best possible outcome for everybody. Neither of these assumptions stands up long against the evidence.

Suppose you are for some reason lifted out of your accustomed place in society and put in the place of someone whose behavior you have never understood. Having been a staunch critic of government, you suddenly become part of government. Or having been a laborer in opposition to management, you become management (or vice versa). In your new position, you experience the information flows, the incentives and disincentives, the goals and discrepancies, the pressures—the bounded rationality—that goes with that position. It’s possible that you retain your memory of how things look from another angle, and that you burst forth with innovations that transform the system, but it’s distinctly unlikely. If you become a manager, you probably will stop seeing labor as a deserving partner in production, and start seeing it as a cost to be minimized. If you become a financier, you probably will overinvest during booms and underinvest during busts, along with all the other financiers. If you become very poor, you will see the short-term rationality, the hope, the opportunity, the necessity of having many children.

Taking out one individual from a position of bounded rationality and putting in another person is not likely to make much difference. Blaming the individual rarely helps create a more desirable outcome. [Super important for designing businesses]

If any one actor gains an advantage and moves the system stock (drug supply) in one direction (enforcement agencies manage to cut drug imports at the border), the others double their efforts to pull it back (street prices go up, addicts have to commit more crimes to buy their daily fixes, higher prices bring more profits, suppliers use the profits to buy planes and boats to evade the border patrols). Together, the countermoves produce a standoff, the stock is not much different from before, and that is not what anybody wants.

The most effective way of dealing with policy resistance is to find a way of aligning the various goals of the subsystems, usually by providing an overarching goal that allows all actors to break out of their bounded rationality... The most familiar examples of this harmonization of goals are mobilizations of economies during wartime, or recovery after war or natural disaster.

Common parking spaces in downtown areas are parceled out by meters, which charge for a space and limit the time it can be occupied. You are not free to park wherever you want for as long as you want, but you have a higher chance of finding a parking space than you would if the meters weren’t there. [Example of mutual coercion]

Some systems not only resist policy and stay in a normal bad state, they keep getting worse. One name for this archetype is “drift to low performance.” Examples include falling market share in a business, eroding quality of service at a hospital, continuously dirtier rivers or air, increased fat in spite of periodic diets, the state of America’s public schools—or my onetime jogging program, which somehow just faded away.

But in this system, there is a distinction between the actual system state and the perceived state. The actor tends to believe bad news more than good news. And to complete this tragic archetype, the desired state of the system is influenced by the perceived state. Standards aren’t absolute.

There are two antidotes to eroding goals. One is to keep standards absolute, regardless of performance. Another is to make goals sensitive to the best performances of the past, instead of the worst.
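The two antidotes can be compared in a toy model. A sketch with invented numbers: when the standard slips toward perceived performance (`erode > 0`), performance ratchets downward; an absolute standard holds it near the goal:

```python
# "Drift to low performance": performance chases the goal against a
# constant downward pressure, while the goal itself erodes toward actual
# performance. All parameter values are invented.

def drift(goal, performance, erode, years=30):
    for _ in range(years):
        performance += 0.5 * (goal - performance)  # try to meet the goal
        performance -= 2.0                         # constant downward pressure
        goal += erode * (performance - goal)       # standard slips toward actuals
    return performance

eroding = drift(goal=100.0, performance=100.0, erode=0.3)
absolute = drift(goal=100.0, performance=100.0, erode=0.0)

# With an eroding goal, performance keeps sliding; with an absolute
# standard, it stabilizes just below the goal.
print(round(eroding, 1), round(absolute, 1))
```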

Using accumulated wealth, privilege, special access, or inside information to create more wealth, privilege, access or information are examples of the archetype called “success to the successful.”

Notice that rule beating produces the appearance of rules being followed. Drivers obey the speed limits, when they’re in the vicinity of a police car. Feed grains are no longer imported into Europe. Development does not proceed where an endangered species is documented as present. The “letter of the law” is met, the spirit of the law is not.

These examples confuse effort with result, one of the most common mistakes in designing systems around the wrong goal. Maybe the worst mistake of this kind has been the adoption of the GNP as the measure of national economic success. The GNP is the gross national product, the money value of the final goods and services produced by the economy. As a measure of human welfare, it has been criticized almost from the moment it was invented. [Andrew Yang's ideas address this]

Leverage Points— Places to Intervene in a System

12. Numbers—Constants and parameters such as subsidies, taxes, standards

Consider the national debt. It may seem like a strange stock; it is a money hole. The rate at which the hole deepens is called the annual deficit. Income from taxes shrinks the hole, government expenditures expand it. Congress and the president spend most of their time arguing about the many, many parameters that increase (spending) and decrease (taxing) the size or depth of the hole. Since those flows are connected to us, the voters, these are politically charged parameters. But, despite all the fireworks, and no matter which party is in charge, the money hole has been deepening for years now, just at different rates.

Putting different hands on the faucets may change the rate at which the faucets turn, but if they’re the same old faucets, plumbed into the same old system, turned according to the same old information and goals and rules, the system behavior isn’t going to change much.

11. Buffers—The sizes of stabilizing stocks relative to their flows

Consider a huge bathtub with slow in- and outflows. Now think about a small one with very fast flows. That’s the difference between a lake and a river. You hear about catastrophic river floods much more often than catastrophic lake floods, because stocks that are big, relative to their flows, are more stable than small ones. In chemistry and other fields, a big, stabilizing stock is known as a buffer.
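The lake-versus-river contrast can be sketched numerically. With invented numbers, the same inflow surge barely moves a big stock but doubles a small, fast-flowing one:

```python
# A buffer is a stock that is big relative to its flows. Hit a lake and a
# river with the same inflow surge and compare the relative rise in level.
# All numbers are illustrative.

def surge_response(stock, normal_inflow, surge, steps=10):
    """Return peak level relative to normal after a sustained inflow surge."""
    outflow_rate = normal_inflow / stock  # at steady state, outflow == inflow
    normal = stock
    peak = stock
    for _ in range(steps):
        stock += (normal_inflow + surge) - outflow_rate * stock
        peak = max(peak, stock)
    return peak / normal

lake = surge_response(stock=1000.0, normal_inflow=10.0, surge=10.0)
river = surge_response(stock=10.0, normal_inflow=10.0, surge=10.0)
print(round(lake, 2), round(river, 2))  # the river's level spikes far more
```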

10. Stock-and-Flow Structures—Physical systems and their nodes of intersection

When the Hungarian road system was laid out so all traffic from one side of the nation to the other had to pass through central Budapest, that determined a lot about air pollution and commuting delays that are not easily fixed by pollution control devices, traffic lights, or speed limits.

Physical structure is crucial in a system, but is rarely a leverage point, because changing it is rarely quick or simple. The leverage point is in proper design in the first place. [If it's a physical system, get it right the first time]

9. Delays—The lengths of time relative to the rates of system changes

For example, it takes several years to build an electric power plant that will likely last thirty years. Those delays make it impossible to build exactly the right number of power plants to supply rapidly changing demand for electricity. Even with immense effort at forecasting, almost every electricity industry in the world experiences long oscillations between overcapacity and undercapacity. A system just can’t respond to short-term changes when it has long term delays. That’s why a massive central-planning system, such as the Soviet Union or General Motors, necessarily functions poorly. [Stay small]
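The power-plant example can be caricatured in a few lines. A minimal sketch (all parameters hypothetical) in which builders respond to today's shortfall but plants take five years to come online, so capacity badly overshoots demand:

```python
# A long delay in a balancing loop: plant construction is started based on
# the current shortfall, but each plant takes `delay` years to come online.
# During the delay the same shortfall keeps triggering new starts, so
# capacity overshoots demand. Parameters are invented.

from collections import deque

def power_grid(demand=100.0, capacity=80.0, delay=5, years=40, gain=1.0):
    pipeline = deque([0.0] * delay)  # plants under construction, by year due
    history = []
    for _ in range(years):
        capacity += pipeline.popleft()               # plants finished this year
        shortfall = demand - capacity
        pipeline.append(gain * max(0.0, shortfall))  # start new plants
        history.append(capacity)
    return history

caps = power_grid()
# Five years of visible shortfall each triggered a full-size response,
# so capacity lands far above the 100-unit demand.
print(max(caps))
```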

I would list delay length as a high leverage point, except for the fact that delays are not often easily changeable.

8. Balancing Feedback Loops—The strength of the feedbacks relative to the impacts they are trying to correct

Some of those loops may be inactive much of the time—like the emergency cooling system in a nuclear power plant, or your ability to sweat or shiver to maintain your body temperature—but their presence is critical to the long term welfare of the system.

The strength of a balancing feedback loop is important relative to the impact it is designed to correct... A thermostat system may work fine on a cold winter day—but open all the windows and its corrective power is no match for the temperature change imposed on the system. Democracy works better without the brainwashing power of centralized mass communications.

7. Reinforcing Feedback Loops—The strength of the gain of driving loops

A balancing feedback loop is self-correcting; a reinforcing feedback loop is self-reinforcing. The more it works, the more it gains power to work some more, driving system behavior in one direction.

6. Information Flows—The structure of who does and does not have access to information

In Chapter Four, we examined the story of the electric meter in a Dutch housing development—in some of the houses the meter was installed in the basement; in others it was installed in the front hall. With no other differences in the houses, electricity consumption was 30 percent lower in the houses where the meter was in the highly visible location in the front hall. I love that story because it’s an example of a high leverage point in the information structure of the system.

Missing information flows is one of the most common causes of system malfunction. Adding or restoring information can be a powerful intervention, usually much easier and cheaper than rebuilding physical infrastructure.

5. Rules—Incentives, punishments, constraints

To demonstrate the power of rules, I like to ask my students to imagine different ones for a college. Suppose the students graded the teachers, or each other. Suppose there were no degrees: You come to college when you want to learn something, and you leave when you’ve learned it. Suppose tenure were awarded to professors according to their ability to solve real world problems, rather than to publish academic papers. Suppose a class got graded as a group, instead of as individuals.

4. Self-Organization—The power to add, change, or evolve system structure

Self-organization means changing any aspect of a system lower on this list—adding completely new physical structures, such as brains or wings or computers—adding new balancing or reinforcing loops, or new rules. The ability to self-organize is the strongest form of system resilience.

Further investigation of self-organizing systems reveals that the divine creator, if there is one, does not have to produce evolutionary miracles. He, she, or it just has to write marvelously clever rules for self-organization. [This is when AI runs out of control]

3. Goals—The purpose or function of the system

If the goal is to bring more and more of the world under the control of one particular central planning system (the empire of Genghis Khan, the Church, the People’s Republic of China, Wal-Mart, Disney), then everything further down the list, physical stocks and flows, feedback loops, information flows, even self-organizing behavior, will be twisted to conform to that goal.

Even people within systems don’t often recognize what whole-system goal they are serving. “To make profits,” most corporations would say, but that’s just a rule, a necessary condition to stay in the game. What is the point of the game? To grow, to increase market share, to bring the world (customers, suppliers, regulators) more and more under the control of the corporation, so that its operations become ever more shielded from uncertainty.

2. Paradigms—The mind-set out of which the system—its goals, structure, rules, delays, parameters—arises

Money measures something real and has real meaning; therefore, people who are paid less are literally worth less. Growth is good. Nature is a stock of resources to be converted to human purposes. Evolution stopped with the emergence of Homo sapiens. One can “own” land. Those are just a few of the paradigmatic assumptions of our current culture, all of which have utterly dumbfounded other cultures, who thought them not the least bit obvious.

1. Transcending Paradigms

There is yet one leverage point that is even higher than changing a paradigm. That is to keep oneself unattached in the arena of paradigms, to stay flexible, to realize that no paradigm is “true,” that every one, including the one that sweetly shapes your own worldview, is a tremendously limited understanding of an immense and amazing universe that is far beyond human comprehension.

Surely there is no power, no control, no understanding, not even a reason for being, much less acting, embodied in the notion that there is no certainty in any worldview.

If no paradigm is right, you can choose whatever one will help to achieve your purpose. If you have no idea where to get a purpose, you can listen to the universe.

It is in this space of mastery over paradigms that people throw off addictions, live in constant joy, bring down empires, get locked up or burned at the stake or crucified or shot, and have impacts that last for millennia.

The real trouble with this world of ours is not that it is an unreasonable world, nor even that it is a reasonable one. The commonest kind of trouble is that it is nearly reasonable, but not quite. Life is not an illogicality; yet it is a trap for logicians. It looks just a little more mathematical and regular than it is. —G. K. Chesterton, 20th century writer

I had learned about dancing with great powers from whitewater kayaking, from gardening, from playing music, from skiing. All those endeavors require one to stay wide awake, pay close attention, participate flat out, and respond to feedback. It had never occurred to me that those same requirements might apply to intellectual work, to management, to government, to getting along with people.

And finally, starting with history discourages the common and distracting tendency we all have to define a problem not by the system’s actual behavior, but by the lack of our favorite solution. (The problem is, we need to find more oil. The problem is, we need to ban abortion. The problem is, we don’t have enough salesmen. The problem is, how can we attract more growth to this town?) Listen to any discussion, in your family or a committee meeting at work or among the pundits in the media, and watch people leap to solutions, usually solutions in “predict, control, or impose your will” mode, without having paid any attention to what the system is doing and why it’s doing it.

Remember, always, that everything you know, and everything everyone knows, is only a model. Get your model out there where it can be viewed.

A society that talks incessantly about “productivity” but that hardly understands, much less uses, the word “resilience” is going to become productive and not resilient.

Our culture, obsessed with numbers, has given us the idea that what we can measure is more important than what we can’t measure.

Don’t maximize parts of systems or subsystems while ignoring the whole. Don’t, as Kenneth Boulding once said, go to great trouble to optimize something that never should be done at all. Aim to enhance total systems properties, such as growth, stability, diversity, resilience, and sustainability—whether they are easily measured or not.

Enjoy reading this?

Join my newsletter! Each week I break down interesting finance and investing topics. I put in hours of research so that you can spend minutes learning. Unsubscribe at any time.
