How Politicians Leverage People's Naive Viewpoints

How do we elect such poor decision-makers to office? As anyone who studies our warrior's rules at this site knows, politicians are our most fertile source of examples of defective strategy. This fact raises the question: how do these people get elected to office in the first place?

The answer is that both political parties have discovered that election depends largely on the popularity of two naive mental models of how the world works. The problem is that while these perspectives get politicians elected, when they are put together, they are destructive to making good decisions about policy.

The first of these two worldviews describes the presumed source of our individual successes and failures. Putting the technical jargon of attribution theory aside, we all tend to attribute our successes to our good decisions and our failures to circumstances beyond our control. For example, when we get promoted, it is because we are good at what we do, but when we are fired, it is because of a bad economy, a bad employer, or a bad boss. Let us call this the "naive theory of attribution." Psychologists describe this as irrational behavior, but our warrior's rules recognize it as the beginning of wisdom. We will explain why in a moment, but for now, let us focus on how politicians use the naive theory of attribution to their advantage.

The naive theory of attribution raises an important question: who controls circumstances beyond our control? In other words, who is the source of our failure? The politician answers this question using a second popular misconception. I will call it the "naive theory of causality." This theory says that events beyond our control arise from the intentional activities of others who control us. If we are fired, it is because those who control the economy, our company, or our boss are acting stupidly and/or selfishly.

Politicians of both parties run on the premise that they will use government to control the people who are creating problems for us. They promise to stop these people from acting stupidly and/or selfishly. The names of the individuals and groups that need to be controlled change over time, but the need for controlling them remains.

What could be wrong with this promise? Why does it fail in practice, where, time after time, government proves unable to control the circumstances that are beyond our individual control?

The answer that politicians give is either 1) we need more government control, that is, the levels we have been using are simply not enough, usually because they do not have enough money, or 2) we need different government control, because those selfish and stupid people have simply found new ways to make problems for us, usually by controlling the opposing party.

The answer from Sun Tzu's rules of strategy is quite different. From our perspective, the naive theory of attribution simply does not go far enough to explain failure, while the naive theory of causality extends that error. The source of both problems is our failure to recognize the two separate realms of control/production and strategy/competition (1.9 Competition and Production).

The naive theory of attribution is not a foolish psychological bias. It is correct in assuming that our success comes from our decisions. It is also correct in assuming that our failure comes from circumstances beyond our control. However, it fails to recognize the difference between two key arenas for our decision-making: areas within our control and areas outside of our control.

Within areas of our control, our decisions create our future directly as a consequence of our actions. In areas outside of our control, our decisions can only influence our chances of success. To increase those chances, we must understand our lack of control, using our warrior's rules to leverage the forces of the environment rather than oppose them (3.2.1 Environmental Dominance). Using our warrior's rules, we will succeed more often, but we will still sometimes fail, because the circumstances with which we are working are truly outside of our control.

If we lose our job, it is because of a bad economy, a bad employer, or a bad boss. However, over time, it is our responsibility to develop positions that make us less and less vulnerable to bad economies, bad employers, and bad bosses. Over time, our individual decisions can provide us more and more protection from the most common swings of fortune.

This brings us to the problems with the naive theory of causality. Yes, in society, circumstances beyond our control result from human decisions and actions, but this does not mean that everything is under human control. This logical, linear thinking is the product of our education, which ignores the nature of the complex systems that arise from the interactions of independent, adaptive agents (1.2.3 Position Complexity).

No agent in these complex environments, not even the agency of coercive government, can control the results of these interactions. Circumstances result from all our interactions in unpredictable and unplanned ways, and no amount of government can change that. Events are an emergent property of interactions, which are inherently too complex to understand. Even a very small complex, adaptive system, such as a chessboard, generates more possible games than there are particles in the universe. Real-life systems are, literally, infinitely more complex, more complex than a universe full of computers could predict, much less control.
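The chess comparison can be checked with simple arithmetic. The sketch below uses the standard Shannon-style estimates (roughly 30 legal moves per position over roughly 80 plies per game), which are my assumed figures, not numbers from the text:

```python
# Assumed Shannon-style figures: ~30 legal moves per position,
# ~80 plies (half-moves) in a typical game.
branching_factor = 30
plies_per_game = 80

# Rough size of the chess game tree: one branch per legal move, per ply.
game_tree_size = branching_factor ** plies_per_game

# Commonly cited estimate of particles in the observable universe.
particles_in_universe = 10 ** 80

# The game tree (~10^118) dwarfs the particle count (~10^80).
print(f"chess game tree ~10^{len(str(game_tree_size)) - 1}")
print(game_tree_size > particles_in_universe)
```

Even with much more conservative assumptions, the branching of play overwhelms any physical count, which is the point of the comparison.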

Since complexity increases exponentially with size, the larger the system we try to govern, the less effective government can be. This is a factor both of the span of control attempted (i.e., the federal government is less effective than the states, and large states less effective than small ones) and of the depth of control (i.e., controlling the economic behavior of everyone versus preventing the physical crimes of a few). Such attempts are not only futile but work against the self-balancing nature of these systems, creating much worse situations than they prevent.
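This growth can be illustrated with a toy model of my own devising, not one from the text: if each pair of agents in a system either interacts or does not, the number of possible interaction patterns explodes as the system grows.

```python
# Toy model (an assumption for illustration): n agents, each pair of
# agents either linked or not. A system of n agents has n*(n-1)/2
# possible links, so 2**(n*(n-1)/2) possible interaction patterns.
def interaction_patterns(n: int) -> int:
    links = n * (n - 1) // 2
    return 2 ** links

for n in (5, 10, 20):
    # Growth is far faster than linear: 5 agents already allow 1024
    # patterns; 20 agents allow more than 10^57.
    print(n, interaction_patterns(n))
```

Doubling the number of agents does far more than double the patterns a would-be controller must anticipate, which is why span of control matters so much.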