Western Culture: The Baby and the Bathwater

Since we live in a postmodern world, we often forget the “roots” of Western Culture. Oh, we know lots of incidental details, and sure enough, we have a pretty good idea of how things work.

What we often lack, however, is a clear idea of how we got here.

When I was in university, I had to take a series of general education courses. In GenEd 101, we read excerpts from The Bible, The Merchant of Venice, Plato’s Republic, and all sorts of other good things.

The books we were reading for that course were, as I am sure was obvious to the professors, important source material for Western Culture. What’s interesting to me now is that I somehow remained blissfully unaware of that fact. I understood that these books were important. I just didn’t understand why.

To tell the truth, it wasn’t until I was in my late thirties that I really began to be able to wrap my mind around Western Culture. Sure, I knew that Western Culture is racist (for instance), but I didn’t understand the why of it. I understood that the more abstract the material one studies in academia, the more prestige one gains–but again, I didn’t even know how to ask “why?”

What I had difficulty understanding was that racism, the valuation of the abstract over the concrete, and a whole host of other ideas all had the same root: they could be traced back to Plato, or at least to Neoplatonism–ideas I’d been exposed to in my education.

Playing with Plato

This is not the Plato you are looking for.

In the works of Plato is the idea of the Great Chain of Being. The Great Chain is a hierarchy of all things in the world. “Higher” in the chain is the ideal world, and things are judged by their closeness to that ideal. Highest in the chain is “the One.”

When we use the Great Chain to judge things, and Western Culture often does, we value the abstract over the concrete. The abstract is closer to the ideal, and the ideal is closer to the ultimate ideal–the One, which was represented as God after the popularization of Christianity.

Tickle Me PoMo

What’s interesting to me about the last thirty years is that, as we’ve moved from a Western Culture to a Postmodern Culture, we’ve lost this valuation of the abstract. No longer are those who professionally study the abstract held in high regard. Instead, as a culture, we have moved towards valuing things based on their “utility” in a much more limited sense.

The capital-“T” Truth has moved from being an ideal to something that we value only for its utility.  The idea, and ideal, of pure research has fallen away. Oh, I’m not saying that it doesn’t happen. I am saying that the ways in which it is supported have grown more limited over time.

Kevin Clash and Elmo
No, not ELMO…

These ideas have become popularized, as well. Maybe you’ve heard the saying, “those who can’t do, teach.” It’s one of those catchy sayings that appeals to students everywhere because, let’s face it, students are sometimes working out their own relationships with authority.

But the aphorism, in denigrating the teacher’s role, tosses aside two millennia of Western values. What could be more postmodern than that? It is part of a larger belief: that we live in a new era where things simply are what they are, and everything can be bought and sold.

Once, I would have shaken my head and said that students have always rebelled against teachers, and it’s nothing to worry about, but this attitude has somehow become trendy recently. The standard bearers of our culture, pundits and politicians, talk about how teachers have such an easy life, and how they have such poor standards, and how what they teach is of limited utility.

[Irony alert: These are often the same people who decry the loss of “Western Values.”]

Don’t Turn Back the Clock

Caste System

We don’t need to turn back the clock and start using the Great Chain of Being again in order to redeem (or resuscitate) Western Culture. But we’ve tossed aside one set of values, and seem to be struggling to come up with an alternative.

One possible alternative that we’ve come up with is determining someone’s value as a person by how much money they make. This is in contrast to traditional Western patterns, where we’ve built value through three separate hierarchies.

For example, we’ve been promised that a college education will provide the means for us to join the middle class. That is an example of educational prestige. Yet the promise of a college education doesn’t ring as true as it once did.

Yes, a college education is often a necessary step for those hoping to enter the middle class, but in recent years the middle class has shrunk while the number of college degrees granted has risen. Just do the math, and it’s easy to see that the middle class is neither growing nor prospering, while the number of people getting college educations is rising, as is the cost.

Apparently, Western Culture has decided it doesn’t like the pattern described in Max Weber’s Three-Component Theory of Stratification anymore. We’re moving toward one rationalized hierarchy; instead of wealth (cash), prestige (education), and power (force), we decided that prestige and power could be merged with wealth if we simply paid people according to their social status.

While seemingly more “efficient” and “fair,” this has had some nasty side-effects. The previous tripartite system described by Weber, with its multiple paths of “success,” served to prevent any one of these hierarchies from dominating.

By attempting to merge the three social-status systems into one, we’ve managed to increase the amount of distance between the top and bottom. Additionally, we’ve limited the number of metaphorical ladders one might climb when attempting to raise social status.

New Western Values

Culture isn’t some abstract list of values, it’s a series of practices passed from generation to generation, and tested over time using large-scale feedback loops. In this period of intensive cultural change, the “new” values we hold haven’t had time to be tested.

The “old” values have been tested, and we’ve found it necessary to discard or modify many of them. All we need to do is think about the significant changes in how we view race, gender, religion, and other aspects of culture, or how much the meaning of the word “work” has changed, or even “education.”

The cultural upheaval that we’ve been going through since the 1950s (though in truth, at least since the Industrial Revolution) has been met with our attempts to build (or agree upon) a new core for Western culture. We struggle with this because we feel the new is so different from the old.

We’re laboring under a false assumption. The “core” cannot be divorced from the past, as postmodernism sometimes seems to be. The core we create will, however, need to incorporate a larger world. The past and the present are not at war. They are in deep and dramatic negotiations over our future.


Truth Vs. Trend

Spy Vs. Spy

There is often a difference between what “everyone” believes and the truth. Some social-science models of behavior include assumptions of feedback loops that allow us to quickly modify our behavior to be in tune with the reality of situations.

Feedback loops take time, aren’t 100% accurate, and are subject to manipulation. Even pursuing the truth is (for many people) only as useful as long as it serves other goals. And sometimes pursuing a commonly held untruth is more personally profitable.

In other words, the truth isn’t sexy.

Feedback Loops Take Time

When we believe things that aren’t true, and we either share or act on that information, eventually the world will let us know. That’s a feedback loop. But it’s important to recognize that feedback loops aren’t instantaneous.

To start with a simple, even silly example:

A cup of coffee, black
I take my menko black.

Let’s say I wake up one morning believing that coffee’s not called coffee anymore; it’s called menko. I get up and make myself some menko: no feedback since I don’t even read the label.

Next, I’m talking to some people and I mention that my menko is great, and that I really need my menko in the morning. Sure, these people think I’m weird as heck, but they might not say anything. They might just think, “Smile and nod at the strange man waving a coffee cup.”

Worse, the people I’m talking to might figure it out from context and play along: “Yeah, yeah, I need my menko in the morning, too.” Maybe they even think it’s a brand name, or something.

Finally, I head to a coffee chain and ask for a small decaf menko with soy milk. Maybe, just maybe, the kind stranger behind the counter will let me know that I’m using the wrong word. It’s their business, and it’s in their interest.

Eventually, someone will give me feedback: the word is coffee, not menko. But that takes time, and it takes someone willing to make the effort to correct my mistake.
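The wait described above can be sketched as a toy simulation. Nothing here comes from the essay itself–the one-in-twenty chance of correction per conversation is an invented parameter–but it illustrates why rare corrective feedback means a long wait before a wrong belief gets fixed:

```python
import random

def interactions_until_corrected(p_correct, rng, max_steps=10_000):
    """Simulate a wrong belief ("menko") held until someone corrects it.

    Each social interaction independently has probability `p_correct`
    of producing corrective feedback. Returns the number of interactions
    that pass before the belief is corrected.
    """
    for step in range(1, max_steps + 1):
        if rng.random() < p_correct:
            return step
    return max_steps  # never corrected within the horizon

rng = random.Random(42)

# If only 1 interaction in 20 yields a correction, the expected wait is
# 1 / 0.05 = 20 interactions (a geometric distribution).
trials = [interactions_until_corrected(0.05, rng) for _ in range(10_000)]
print(sum(trials) / len(trials))  # average wait, close to 20
```

The point isn’t the exact number; it’s that when corrections are socially costly (and so rare), the feedback loop can stay open for a very long time.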

Some Feedback Loops Are Bigger Than Others

On a larger scale, the social feedback loop is pretty important for checking information. Most people, normal people, don’t do independent research on everything. Instead, they look for social feedback on “facts.” As long as our beliefs are common enough to “work” for our purposes, or at least don’t get in the way of our goals, we’re not going to get feedback to correct them.

Bransfield Iceberg
How about some iced menko?

The menko example is pretty silly, and it’s a situation where negative feedback’s going to come pretty quickly. But what about larger questions, like “Is there such a thing as human-caused climate change?”

Climate change isn’t a topic that really fits well with social feedback. All social feedback can tell us is whether our friends believe that the world is getting warmer. If, instead of relying on specialists (you know, the science-y folks with their labs, samples, and data), we subject the information only to social feedback loops, we’re going to have to wait until things have gotten so bad that nearly everyone has a terrible story of climate change that they want to share.

Most people don’t use scientific data in their everyday feedback loops, relying instead on existing connections–social, political, religious, and so forth–to inform their opinions on topics. In most situations, this doesn’t matter all that much. The data we receive socially is good enough, especially as we primarily act socially, anyway.


Sometimes, the acceptable “facts” are at odds with the actual facts. This happens all the time. Let’s say that we work for a company, PerkyCorp, which prides itself on having an awesome corporate culture. It’s always a party and everyone gets along. But somewhere, hidden in the recesses, something is wrong.

“I’m Cassandra, the Archetype of the Bearer of Bad News, and I approve this message.”

While everyone at PerkyCorp is busy getting along awesomely, and getting promoted for being good team players, decisions might be getting made that are taking the company off the rails of competitive success. What? How?

If everyone’s busy getting along, and getting rewarded for it, there’s no advantage for them to close the feedback loop. Now–there are two sides to this:

  1. Being part of any group, from a business to a political party, involves a certain amount of “drinking the Kool-Aid.” As long as things don’t get too bad, the social-feedback loop is going to be the primary method of decision making. If the Kool-Aid tastes awful, those who can smile while they drink it will rise to the top.
  2. People who choose to address burgeoning problems do so at their own peril. Making changes in the group, assigning responsibility, and generally being hierarchical is seriously un-fun.

Solving problems that a group doesn’t know they have doesn’t make someone a devoted employee with the interests of the company at heart (even if they are)–it makes them a Cassandra. Being a Cassandra with data and proof isn’t good enough, either. If PerkyCorp can’t see that the feedback loop’s closing around their collective necks, telling them isn’t going to help.

In short, if PerkyCorp’s groupthink isn’t telling them that there’s a problem, trying to solve the problem isn’t saving the day, it’s harshing their buzz.

Why Is Cultural Anthropology So Hard?

A portrait of a monkey

That’s right. Cultural anthropology, the study of humans across cultures, is hard.

But how can that be? After all, we all know people and interact with them every day. We all understand the people around us, more or less.

Anthropology’s not hard in the same way as math, or medical school, or engineering. Anthropology looks at a different kind of data. It requires a different kind of thinking.

Anthropology’s challenge doesn’t come from its esoteric subject matter. It comes from taking something so familiar that we think we know it, and really laying out what we know and what we don’t.

Anthropology takes the familiar and begins to examine it closely and carefully. It takes those underlying assumptions, like “people always act in a self-interested way,”* and examines them to see if they’re true. More than that, it asks if that’s true of people the world over.

Why Does Anthropology Get a Bad Rap?

Anthropology is a social science that asks tough questions, makes tough calls, and tells people that what they always thought was true is really just true some of the time.

Take the concept of “race” for instance. We’re all taught that “race” is true, right?  Race is “real” and “natural.”**

Well, as it turns out, if you shine the light of science on the darn thing, race turns out to be a cultural construct. Now, that doesn’t make the struggles and injustices people have faced unreal; it just means that our particular idea of race doesn’t have to be true. There are other cultures with other ways of seeing “race,” and there’s variation in our own culture, too.

When people are faced with the idea that their own culture, like all other cultures, is just a system of thought, rather than the bedrock of reality it seems to be, that’s scary–sometimes downright terrifying–on a cognitive level. We use our own cultures to function in the world, and it’s an unpleasant feeling to realize that they’re not True…they’re just “good enough for government work.”

How Should We Apply Anthropology?

Anthropology asks the unaskable, challenges the unchallengeable, and moves us closer to reaching the root of the question, “Who and what are humans?” Where other disciplines might declare the answer to this, anthropology takes those answers and asks, “Is this true in all times and places?”

But we can’t all just spend our lives on the big questions. Training in anthropology gives a certain kind of insight into people. It gives us a more honest view that is at the same time powerful and uncomfortable.

Anthropology, as a discipline, works very hard at using this perspective for the power of “good”–even while recognizing that such “good” is probably a cultural construct itself. The American Anthropological Association (AAA) has put out statements against the use of anthropology in assisting the U.S. military in their work. Further, anthropologists are often leery of using what they (we) know to assist marketers, or otherwise use our perspectives to promote intra-cultural ideals.

Anthropologists often study dis-empowered and under-served populations, and handing the keys to those cultures to people with lots of pull, and widgets to sell, is a tough ethical decision.

At the end of the day, we need to engage culture (even, and especially, our own) so we can try to high-mindedly change it, or even draw a steady paycheck.

That being said, what is the proper role of anthropology?

*This is the assumption of the rational maximizer in economics.

**  As someone trained in anthropology, even writing this sentence pains me. On the other hand, if you believe we live in a post-race world, then please sign  up for Anthropology 101.

Under the Thumb of the Invisible Hand

Adam Smith
Adam Smith (1723-1790)

The metaphor of the “Invisible Hand of the Market” was first introduced by Adam Smith in The Theory of Moral Sentiments (1759). It was further developed in An Inquiry into the Nature and Causes of the Wealth of Nations (1776). It metaphorically attributes a “will” to the market, which seems from the outside to make economic decisions about factors such as pricing.

The “Invisible Hand” metaphor is invoked when people discuss the decisions that seem to be made by market forces. It says that as people acting in the marketplace attempt to maximize their own profits, this has an overall self-regulatory effect.

For example, as a price for a commodity goes up, fewer people will be interested in buying it. This will then bring the price back down. Thus demand will regulate shifts in price. The “Invisible Hand” is meant to describe a “perfect” unregulated market.
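That self-regulating loop can be sketched in a few lines of code. The linear demand curve and the supply level below are invented for illustration–this is a toy model of demand pulling price toward equilibrium, not anything from Smith:

```python
def demand(price):
    """Hypothetical linear demand curve: buyers want less as price rises."""
    return max(0.0, 100.0 - 2.0 * price)

def adjust_price(price, supply=40.0, rate=0.05, steps=200):
    """Naive price adjustment: raise the price when demand exceeds supply,
    lower it when supply exceeds demand."""
    for _ in range(steps):
        excess = demand(price) - supply
        price += rate * excess
    return price

# Demand equals supply when 100 - 2p = 40, i.e. at p = 30.
# Starting too low, the price climbs to that equilibrium on its own.
print(round(adjust_price(price=10.0), 2))  # prints 30.0
```

No one sets the equilibrium price; it emerges from the loop. That emergence, aggregated over many buyers and sellers, is what the “Invisible Hand” metaphor points at.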

Descriptive, not Prescriptive

It’s important to recognize that Adam Smith was a moral philosopher–a philosopher and scientist. He wasn’t telling capitalists what to do, but describing their already existing behaviors. Using Adam Smith’s words to justify questionable economic behavior is therefore a category mistake.

Worse, it’s like using the sixth-grade excuse “everybody’s doing it.” Smith is describing the actions of an unregulated market–not suggesting that it would be really cool if we had one.

In scientific inquiry, some variables need to be held steady so that others can be measured. Metaphorically, Adam Smith held the variable of “human nature” steady by declaring that all people are inherently “rational maximizers”–that all people, under all circumstances, seek to maximize profit.

In Smith’s work, profit is considered the a priori goal of humans: humans serve the market, not the other way around. Yet we know from everyday life that acting in this way is a choice, not the singular, defining human drive. Economists may claim that people are simple and driven by one goal, but marketing experts certainly know better. People make economically irrational choices every day, for a host of reasons.

The Honey Badger of the Market

Because the “Invisible Hand” is an aggregate of human choices, it bears similarity to mob mentality and groupthink. In other words, people in groups will do things that no individual would choose to do openly.

Sometimes we see people (often politicians) suggesting that the “Invisible Hand” will take care of such problems as a poor economic situation. They argue that if we would just free (deregulate) the market, it will create wonderful growth, solve poverty, pay off the national debt, etc.

But here’s the part people either don’t mention, or don’t understand: the free market doesn’t care about individuals. Much like other forms of mob mentality, the free market doesn’t care if people live or die, if they have enough to eat, a place to sleep, or even rule of law.

The perfect “free market” is like the honey badger. It doesn’t care about forms of government, stability, raising the next generation, deforestation, murder, war, or much of anything else. The free market doesn’t care if 10% of the population is wiped out, as long as it doesn’t affect profits. And if it does affect profits, the market will correct–but not until after the people are dead.

The “perfect” free market is predicated on one value: maximizing profits. All other values are seen through that lens. So, when someone gets up and says that a free market will “solve problem x” through deregulation, ask if all of your values are in line with the market’s single driving goal.

Fear of the Dark

Why do we fear the dark?

The fear of the dark is the fear of the unknown. Although we think of it as a primitive–even primal–fear, the true culprit isn’t the monkey-mind, it’s the rational mind.

Weird Tales September 1942
Where do we learn to fear the night?

As humans, we mostly use language to understand the world around us. Using words as tools, we look at situations and match them up to preexisting schemas. Thus, the mind maintains some semblance of control, at least over itself.

But sometimes we’re confronted with darkness. Suddenly, our rational minds lack the contextual clues we normally use to order our thoughts and perceptions. The part of our minds that “keeps it together” finds itself at a loss, beyond its comfort zone.

The mind, confronted with a sudden lack of data, starts to tell us wild stories based on two things: small clues in the environment, and all the “myths” (learned–and sometimes fanciful–cultural schemas) of what can happen in the dark.

“What was that noise?” we ask. The “I don’t know” we get back forces us to rely on our instincts–instincts that are often unfamiliar, rusty, and awkward. We feel fear and a loss of control. That fear then drives our thinking. With a lack of external data, the internal state of fear causes our rational self to come up with explanations for the fear. These dangers, though imagined, increase our fear in a feedback loop.

Two Kinds of Fear

There are two kinds of fear: the fear of the known and the fear of the unknown.

Military dog barking
Sometimes, we have good reasons to be afraid

When we are confronted with a big, angry dog that is snarling, barking, and working itself up to attack us, then that is a known fear. Yes, the dog is scary, but there is something that we can at least try to do about it. We can run. We can fight. We can call for help.

But what if we are confronted by darkness, by nothing? In that case, the rational mind, lacking necessary cues to figure out what the heck is going on, fails us. It spins out of control, looking for ways to explain the feelings of fear. Maybe, it whispers, there is an angry dog, or a monster, or a predatory human…pretty much anything scary. Our schemas fail from lack of data. We are forced to rely on the instinctive parts of the mind.

Cave mouth on a sunny day

When confronted by nothing, which is really what the fear of the dark is, the confounded rational mind dumps us down to the instinctive mind. The rational mind casts about for possible data, but finds little. That’s why we get irrational when confronted by the darkness. We’re suddenly reliant on a part of ourselves that is unfamiliar–that primate that we carry around, but mostly ignore.

Our fear of the dark reminds us that we’re not just a rational mind riding around in a body. We are a combination of mind and body, rational man and instinctive primate, each influencing the other. We might not like it, but that doesn’t make it less true.

Wherefore Anthropology?

What are students supposed to learn in an intro class on culture? This question should be fundamental, but it sometimes seems that it gets lost. Maybe it’s too fundamental?

Bronislaw Malinowski in the Trobriand Islands (1918)

Unlike, say, 150 years ago, we (the bulk of Westerners) don’t spend our lives isolated from the “other” in agrarian communities. Before most of the population moved to cities, we spent our time with like-minded people and married those of similar culture — language, social status, race, and religion.

A bare five generations later, things are now very different. Each and every day, we’re exposed to people who belong to other ethnicities and cultures. We interact with people who have different beliefs. Where difference was once rare and exotic, now it’s incredibly common.

Intro to Anthro

Given all of that, we can hardly say that the purpose of an introductory anthropology class is to expose us to difference. We’re constantly exposed to the “other.” Turn on your TV, open a webpage on the Internet, or just go out on the street and look around. The “other” isn’t somewhere in far-off lands overseas; it’s right here, and always around us.

That doesn’t mean that anthropology is irrelevant; it just needs a new elevator speech. Anthropology doesn’t just teach us that difference exists, any more than math courses only teach us that numbers exist. Anthropology teaches us new ways of thinking about difference. It goes beyond “difference is okay” and actually leads us to new mental processes for dealing with situations that are outside of our own cultural expectations.

In other words, introduction to anthropology, world cultures, and similar courses don’t simply give us information about other cultures. Instead, they model new mental processes. The purpose of these courses isn’t to tell us that some people in the world practice polygamy and other such “strange” practices. Anthropology teaches new ways of thinking about aspects of our own cultures and, by extension, new ways of thinking about people.

But That’s Weird

Anthropology is not just a subject; it’s a discipline. It doesn’t just give us new things to think about. Anthropology teaches us new ways to think about things, or more specifically, people.

When we’re raised, we learn about our own culture in a process called enculturation. We accept what we learn as some kind of fundamental truth — it becomes our baseline of thought. As we get older, and learn more about the world, we see that it’s a much bigger place than we were taught. That’s not surprising, and it’s a normal part of maturation.

Still, when we’re confronted with things that disagree with our initial enculturation, they cause cognitive dissonance. We see something “weird” and think “that’s weird” or “that’s cool.” But that’s usually as far as we go.

What anthropology teaches us to do is to step outside of our own culture for a moment. Without anthropology, we see difference and recognize it, but that’s often the end of the matter. Anthro teaches us to follow the difference and see it in its own context.

As it turns out, the little differences we see aren’t just small, isolated individual things. They are part of whole systems of difference that, taken together, make up entirely different ways of seeing the world — cultures. Anthropology isn’t just about knowing disparate facts about peoples around the world. Anthropology is about being able to understand entirely different ways of thinking.

Cosmopolitan 2.0

Does that mean that each and every student who completes Introduction to Cultural Anthropology should be able to engage in the mental gymnastics necessary to not only think outside of their own culture but also develop mental frameworks for understanding these patterns of thought?

The short answer is “no.” Being able to do that takes years of training and professionalization. So what can we hope students will learn? What’s the first skill that anthropology teaches?

NYC subway riders with newspapers
Difference is one of the fundamental experiences of city life

Since the time of the first cities, people who lived in them were more exposed to others of different cultures than those who lived in small-scale groups. The words “cosmopolitan” and “sophisticated” might seem outdated, but the ideas behind them aren’t — they’re more relevant than ever.

When students finish an Intro to Anthro class, they should be better able to take the differences in the wide world in stride. When confronted by difference, they should be able to think rationally about culture and recognize people’s fundamental humanity.