The World’s Most Sophisticated Swiss Army Knife

August 17, 2011

Think how differently you would have to go about filling your needs if you lived on a desert island rather than in a society of men and women.

“[The] thirst for objective knowledge is one of the most neglected aspects of the thought of people we call ‘primitive,’” observed anthropologist Claude Lévi-Strauss.  Living in a state of nature with your survival under constant threat is a powerful incentive to learn the cause-and-effect relationships of the physical world.  Primitive people have to be part meteorologist, part horticulturalist, and part engineer, as did our early ancestors.

Thanks to civilization, each of us personally no longer needs to acquire that kind of knowledge.  We still live in the physical world, and we still have the same basic needs for nourishment, shelter and health maintenance.  Somebody in our social system needs to have the technical knowledge to grow food, fabricate things and cure what ails us, but we don’t.

That doesn’t relieve us of the need to manipulate the environment to fill our needs, though; it just adds links to the causal chain that concludes with their fulfillment.  If we want to make a hole in the ground, we can pick up a shovel and dig one.  Or we can use a different kind of tool: another human.

You need to know how to use a tool in order to get it to do what you want.  Whoever actually digs the hole will need to have the physical ability and knowledge of basic mechanics to use the shovel, or some other implement capable of doing the job.  We’ll call that implement the first-order tool.

If you want to have a hole dug and are unable or unwilling to do it yourself, you could use a second-order tool, another person, to operate the first-order tool.  Just as the eventual shovel-wielder needs to know how to make that tool do its job, you need to know how to make the second-order tool do its job – pick up the shovel and dig the hole.

The significance of being able to use people as tools – of human agency – to bring about the physical effects that support and enhance our lives is hard to overstate.  It’s almost like traveling to a new universe that replaces our laws of physics with its own cause-and-effect rules.

The instructions for our second-order tool might look like this:


Congratulations on leasing the
Homo sapiens 3000 all-purpose tool!

Used properly, your HS-3000 can build you a comfortable house and keep your pantry perpetually stocked, transport you across continents in a few hours, or fill almost any material or emotional need you may have.

To operate your HS-3000, offer it some commodity or service it values – some physical effect for which you are its HS-3000.  Depending on the task, most units will also accept a tradable proxy for that value (money).

Note: Although all HS-3000s are equipped with the same NeuroCog™ operating system and leave the factory with the same default settings, their associative networks can develop significant differences in the field.  As a result, no two units respond to their controls in precisely the same way.  This is a feature, not a bug, and allows you to select the HS-3000 that gives you the best results.

Disclaimer: Each HS-3000 is an independent agent wholly responsible for ensuring that it is operated in an ethical manner within recommended parameters for approved purposes only.  Some units have more robust self-protection circuits than others.  The defeating of those circuits, and other forms of abuse of the HS-3000 for personal gain, are a matter between the user and his or her conscience.



We have to work with people’s heads to get them to perform actions that benefit us.  It’s impossible to physically force someone to pick up a shovel and dig a hole the way you would apply force to a hammer or a lever.  In order to use humans as tools, we have to get their brains to send a signal to their muscles to move in such a way as to cause the physical effect we seek.

To do that, we need to understand cognition.  We build mental representations of the world that encode our assumptions about its cause-and-effect relationships, and allow us to “sandbox” actions we’re considering – try them out and see what’s likely to happen.

The informal word for mental representation is “belief.”  Whether you believe that Bigfoot is real, that cell phones cause cancer, or that Google is making us stupid, those propositions are slices of the mental models you build to represent the world and the rules by which it operates.  When someone asks (or orders) you to do something, you consult your mental models and calculate what’s likely to happen if you do it, and also if you don’t.

We steer cars by turning a wheel.  We steer people by appealing to and manipulating their beliefs.  To make our human tools work for us, we leverage the entries in their mental lookup tables about what causes lead to what effects.

We could direct their attention to the consequences of failing to do what we want them to do.  That can include undesirable effects that we personally promise to cause (I’ll punch you; I’ll sue you; I’ll withhold sex; I won’t recommend you for promotion).  Or we might invoke other supposed cause-and-effect relationships to get them to do our bidding (Congolese orphans will starve if you don’t; God will hold it against you).

For the purpose of making our second-order tool do the work we’ve assigned it, what matters is not what really will happen as a result of their choice (our threat to punish them could be a bluff; God may not exist, or may not care), but what they believe will happen – what causal rules are coded into their mental models.  If the rule we seek to leverage is already there, so much the better.  If not, we can try to program it in – to convince them that A leads to B.
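
For readers who think in code, here is a toy sketch of that idea (purely illustrative; every name and number below is made up): a second-order tool that decides by consulting its own belief table, so only believed consequences, never actual ones, enter the calculation.

    # Toy model: an agent weighs what it BELIEVES will follow from complying
    # versus refusing.  Whether those beliefs are true never enters the math.
    from dataclasses import dataclass, field

    @dataclass
    class Agent:
        # Believed consequences, scored by how much the agent values them;
        # negative scores are outcomes the agent wants to avoid.
        beliefs: dict = field(default_factory=dict)

        def value_of(self, consequence):
            # A consequence the agent holds no belief about carries no weight.
            return self.beliefs.get(consequence, 0.0)

        def will_comply(self, if_i_do, if_i_dont):
            # Sum the believed payoffs of each branch and pick the better one.
            return (sum(self.value_of(c) for c in if_i_do)
                    >= sum(self.value_of(c) for c in if_i_dont))

    # The promised payment could be real and the threatened lawsuit a bluff,
    # or vice versa; only the entries in the belief table decide the outcome.
    digger = Agent(beliefs={"get paid": 5.0, "hard work": -2.0, "get sued": -10.0})
    print(digger.will_comply(if_i_do=["get paid", "hard work"],
                             if_i_dont=["get sued"]))   # True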

Most of the time, we operate our human tools using a carrot rather than a stick.  In return for manipulating the environment to cause an effect that fills a need of ours, we offer to do the same for an equivalent need of theirs, either directly or by using money as a stand-in for value.  You scratch my back, I’ll scratch yours.

But the equivalence of that exchange depends on how accurately our information processing systems link causes with effects.  For example, we’re fairly confident that putting a liquid made from the fossilized bodies of ancient plants and animals into our cars makes them move, so we generously reward those with the knowledge and ability to extract it.  But we also spend a lot on additives that claim to improve gas mileage, despite repeated scientific tests that find virtually no benefit to them.

Then there’s the herbal and dietary supplements industry, to which we trade tens of billions of dollars’ value of our work annually for health benefits that are either non-existent or so small that they’re overwhelmed by other effects.  The fact is, misattribution and the placebo effect account for much of our evaluation of the effectiveness of the goods for which we swap our labor and skills.  There’s nothing in Coke’s formula that makes us “open happiness” by snapping off the cap; “scrubbing bubbles” is just a marketer’s metaphor to boost sales of a bathroom cleaner whose actual performance is no better than its competitors’.  The word “organic” on product labels may well account for the greatest amount of work in history traded for benefits perceived above and beyond those objectively caused.

It’s not just to make products that we use people as tools, but also to cause other effects we desire.  We rent CEO tools to turn struggling companies around and politician tools to create jobs and defuse world tensions.  Here, causality can get really complicated.  For one thing, we’re using them as nth-order tools in a causal chain: they act on the people who report to them, who in turn manipulate their staffs, and so on, until, if all goes as planned, the effect we seek (solvency, peace) pops out at the end.

And this kind of tool often operates inside a large system in which a variety of forces (some controllable, some not) interact in complex ways to determine the outcome, such that it can be impossible to reliably calculate one person’s contribution.  When causation is complicated, it’s easier for plausible but false explanations to “pass.”

All this doesn’t mean that we can rip each other off at will.  Even though the saying, “The truth will out,” is false as an absolute, we can jigger belief only within certain limits.  Our fondness for citing “The Emperor’s New Clothes,” though, shows that the wiggle zone can be pretty wide, and the fact remains that all we have to do to make our human tools work for us is get them to believe we can create an effect they desire, even if all we’re really providing is a placebo.

Sum all of these causal attributions across a population, embed them in its institutions and practices, and you’ve got a new kind of reality: social reality.  Status, power, the distribution of social rewards and sanctions – it can be argued that all are ultimately based on collective inferences about causality that are only loosely correlated with what really causes what.

That disconnect isn’t going away anytime soon, because it’s a consequence of the cognitive biases, errors and capacity limitations that are our evolutionary legacy.  What’s a second-order tool who doesn’t want to be taken advantage of – or a user of second-order tools who wants to get the most from them – to do?

How well an organism thrives depends on how well it masters its environment’s causal rules.  We live simultaneously in two different worlds, one physical, one social, each with its own rulebook.  Many, if not most, of us are better at knowing and manipulating one of those causal systems than the other.

So we go with our strengths.  If you’re good at knowing the objective (physical) world, you can get the best value for your work in occupations with low tolerances for error in working with its causal rules, such as astronaut, farmer, or computer programmer.  People with good social and persuasive skills can maximize the return on their labor in careers that depend more on – and allow more wiggle room in – the construction and manipulation of beliefs.  Examples include salesperson, coach and elected official.

We’ve come a long way from the African savanna.  Civilization didn’t overturn the laws of physics, but it made their direct effects less important to the quality of most of our lives than the effects that flow from each other’s interpretation of them.  Human agency is a game-changer.
