
Race: Cultural Construct — And Very, Very Real

Anyone who has absorbed the content of an Anthropology 101 class, or even a general education class, knows intellectually that race is a fiction. What we don’t usually take away from these classes is the nature of “fiction.”

We think of “fiction” in the simplest, sixth-grade terms — as “made up.” But as adults, it is easier to recognize that a fiction is not the same as a lie — or, more simplistically still, “a random collection of disparate ideas.”

10,000 Monkeys

Shakespeare
Better than 10,000 monkeys

But Shakespeare was not written by 10,000 monkeys on typewriters. Fiction is not random. In the same way, race does exist — as a part of culture.

What we struggle with about culture, especially our own, is that it is not coterminous with reality. But one thing that we know, as anthropologists, is that culture seems real. From the inside (and humans must by definition operate from the inside) we can’t tell the difference between reality and our perceptions of it.

The “crime” is not that we believe in race, but that we legislated it as if our culture bore the unitary truth of reality. The mistake, then, is not that our culture is “wrong” but that our expectations of culture are crazy.

A Teaching Moment

Saint Francis of Assisi by Jusepe de Ribera
Just another hot-blooded Italian? No, that’s Saint Francis of Assisi.

When I was teaching Anthropology 101, I had a student in one of my classes who felt, very strongly, that his identity as an Italian-American was all the explanation that was necessary for his hot temper. No amount of argument would suffice to show him that what he was doing was embodying and reinforcing his culture’s expectations.

The discussion in class became quite heated, as he would not budge from the belief that his experience was not only valid, but indeed intrinsic. That, of course, is the power of culture.

The power of anthropology is not that we can step outside of culture and look at it. That is, I believe, impossible. We can bend and expand our culture, stretching our mind to extrapolate the nature of culture. But we can never wholly step away from it.

Anthropology: Prescriptive or Descriptive?

Anthropology is an academic discipline, and philosophically a powerful one. It allows us, from the inside, to stretch and turn our culture so that it can look at itself.

But we also know that it is possible, once we have begun these philosophical limbering exercises, to reshape not only ourselves, but culture as a whole. Like all science, anthropology lets us reshape our minds in ways that begin to map the world as it is.

But this idea, that we can reshape our culture to map the world, is based on the underlying cultural belief that culture and reality are the same thing. Believing that everyone should have an understanding of the world that matches an anthropologist’s is just as self-centered as the belief of atheists (to take an example from the news) that everyone should see the world the way they do.

All at once, it brings anthropology to the center and cheapens it. If anthropology is itself a discipline, then it has something to offer the world as it is. We don’t need everyone to be an anthropologist any more than we need everyone to be a doctor.

Instead, anthropology might consider focusing on what the truths we learn mean when we present them without requiring the listener to become an anthropologist.

And if it is impossible for us to teach without leading the listener through every step of the whole process — if we can’t bring truth back for our culture — then that would explain the challenges we face as a discipline.

The difference lies in status. If anthropologists can access cultural truths and these truths are of some use, then we must teach from a rhetorical standpoint of ethos. If our only way of teaching is teaching others how to think, then we are in the business of changing culture, not serving it.

Race is part of Western Culture, and denying its cultural validity is an uphill battle. Instead, we must reach out to those who legislate and make decisions that cross cultural groups. These are the people who need to understand that everything they “know” about “the other” is from only one perspective.


Holidays — The Best Christmas Gift

Santa's Arrival
Sure, we all like the parties and the presents. But sometimes we take the days off for granted.

If we go back in time not too far, just a couple of hundred years, then we can be certain that much of the West was Christian. This wasn’t just a matter of ascription, as religion and the law were strongly intertwined. In other words, it wasn’t that people were Christian, so much as the whole culture was from top to bottom — but especially at the top.

Unsurprisingly, the word “holiday” originally came from the Old English hāligdæg meaning “holy day.” That is, originally the days were all religious observances.

Not only did people “get” the day off, they were required by law and religion to not work. And maybe more interesting, the idea wasn’t so much about partying as performing religious observances.

Days Off

The whole idea of not working on holidays comes from their religious origins. Going back to Christianity, one of the rules is “keeping the Sabbath day holy.” In the Middle Ages, this was extended outward to other days of religious observance. Apparently it was used to good effect by the Church in making wars more difficult.

Since, in many ways, the Church only had power over “holy” matters, it made an end-run around secular authority and would declare certain days holy so that it could make the warlords stop fighting. That is, it would take certain “feast days” into its sphere of control in order to stop, or at least slow, wars.

That, combined with the idea of keeping every Sunday work-free, has come down to us as government-recognized holidays. So we can thank the Church’s interest in preventing useless bloodshed for the days off we now enjoy. Enjoy your days off, and Merry Christmas!

It’s Beginning to Look a Lot like Christmas

The War on Christmas is shorthand for one front in the West’s “culture wars,” which address a number of cultural bones of contention. The central question of this war could be framed: “Is the West primarily Christian?”

If we think about Western Culture as the thread of belief and knowledge that has come down to us through history, stretching from the Greeks and Romans (and the Ancient Egyptians, though they got short shrift in my middle school history books, and probably yours), then the West is Christian.

Historically, the Christians fought a battle for political supremacy in the first handful of centuries CE. They took over the Roman Empire and set the religious and scholarly tone for our culture. In short, they won.

The Roman Christians’ victory in the contest for political influence meant that power resided either in the hands of the Church (or one of the divisions that came after the Protestant Reformation), or in the hands of people at least nominally beholden to the Church. That lasted, in many ways, until the end of the European colonial era.

The West isn’t the West Anymore

Since the end of World War 2, however, something else — something critical — happened to the West. It stopped being the home only of Westerners.

Starting with the Age of Discovery, which began around 1492, there was increased trade and influence passing back and forth between the West and the rest of the world. For Europeans, the world got a lot “larger” as everyday people came into contact with thoughts, ideas, and people from around the globe.

The European colonial era came to a dwindling end around the close of World War 2. What arose in its place was a postmodern world, where the flow of people and ideas became more and more rapid, and where people of non-European descent could “take their place at the table.”

Human Rights Are for (All) Humans

Sure, there were many non-Westerners in America before the mid-1900s. But those groups lacked many fundamental rights. They were not, legally or culturally, seen as equals, and they lacked many of the freedoms enjoyed by nominal Westerners.

For instance, in many cases, non-Westerners in America could not vote, have full citizenship, own land, or marry as they chose. More important for consideration of Christmas, their religions were not recognized.

However, since the second half of the 20th century, a new spirit of religious freedom has enveloped America. It is an era in which the ideas of human rights, which had historically been applied only to white Westerners, are applied more evenly. Non-Western, and non-mainstream Western, populations now have an expectation that the separation of Church and State means something.

The War on Christmas

So it comes down to this: the “War on Christmas” is actually a push to make universal human rights, well, universal. It isn’t a silly argument, though. It’s a full-blown struggle to define “the West.”

Is Western culture just the Christian descendants of an ancient line of thought? Or is it more beholden to these new ideas of equal freedoms for all?

Star Wars Isn’t Racist

Today I went to the San Jose Tech Museum, where I saw the Star Wars exhibit on the last stop of its tour. While it was a very cool exhibit, one thing that I (anthropologically trained) really noticed was the treatment of the “other” in Star Wars, the mythology of my generation.

Science Fiction or Fantasy?

Storm Troopers on Patrol
Storm Troopers on patrol at the Tech Museum of Innovation — San Jose, CA

To understand the impact of Star Wars on our impressionable young minds, we first need to dispense with the idea that the franchise is science fiction at all. Sure, they have laser blasters, robots, radios, and space ships, but it’s really fantasy (…in space!).

How do we know it’s fantasy? Because against the fairly dystopian cyberpunk background, we’ve got good guys in white, bad guys in black, glowing swords, and magic powers blooming left and right. And let me say this clearly: if we have good and evil wizards battling it out in the distant past, it’s not some vision of our future.

Science fiction, as a genre, is one of the ways that our culture tries to make sense of rapid cultural and technological change. We’ve been experiencing this change since the beginning of the scientific era and industrial revolution. But that’s not Star Wars.

Star Wars is about our colonial past, the nature of good and evil, the proper role of mysticism, and the necessity of righteous rebellion against tyranny. In other words, it’s not a projection of our future, but a mythic retelling of our own past.

The Mythic Past as a Window to Our Own Past

The Millennium Falcon
The coolest cat in the universe clearly needs the coolest car…spaceship…whatever.

Back in the 70s, when science was “gonna change the world,” Star Wars gave us a chance to see something that was quite the opposite of the Humanist and spiritually sterile Star Trek. But the differences don’t end there. While Star Trek treats all aliens as foreigners with their own political interests and idiosyncrasies, Star Wars treats them as archetypes of our own culture.

In other words, George Lucas might not be racist at all. But he’s betting the bank that we are. And, historically speaking, that’s a safe bet to make.

Only a hundred years ago, the British Empire ruled a big chunk of the world, “race” was a dominant political-economic creed, and everyone “knew” that religion was true and culture was an a priori category. We’ve come a long way in changing our views about matters of race, religion, and gender. Heck, we’ve come a long way since the 70s.

Defining the Problem

Tusken Raider
“Extremely territorial and xenophobic, Tusken Raiders will attack with very little provocation.” Little provocation after the humans took their planet, that is…

We live in a culture that would be unrecognizable to the people of a century ago. But that doesn’t mean the change is over. We still understand these archetypes, use them in our stories, and to a certain extent keep them as part of our culture.

Just as science fiction plays with the cultural change wrought by technology, Star Wars as science fantasy deals with our own changing culture. The aliens aren’t really meant to be aliens, just people in funny suits with motivations that we can quickly apprehend and use to drive the plot forward.

You know, archetypes. Stereotypes. We can get all jumpy about the way Star Wars uses these cultural shorthands to paint a quick, recognizable picture. But the problem doesn’t lie with the authors. The issue isn’t that Star Wars is racist. The problem is that we are.

Investing in Education

The promise of a college education has long been entrance to the middle class. What we’re learning, however, in a painful and indelicate manner, is that not everyone can be middle class.

By making a college education more and more affordable, what we’ve done instead is make the competition to enter this hallowed group of middle managers fiercer, more vicious, and overall not everything we were promised.

We Pay for Infrastructure through Taxes

Atlas Shrugged
Can we refuse to carry our own weight in our culture? That’s the tragedy of the commons.

It’s always been true that we want to use the infrastructure (sometimes inaccurately described as “services”), but we really don’t want to pay for it. In economics, this is known as the free rider problem. When everyone uses a common resource but no one wants to maintain it, that’s called the tragedy of the commons.

To a certain extent, this tragedy always happens. It is one of the inefficiencies of any semi-capitalist system.

The problem is exacerbated in education because it is only partially treated as infrastructure. Because it is of economic value to users, as individuals, everyone wants to take advantage of it, but no one wants to pay for it. Even more so, no one wants to pay for its general use — just their own.
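To make that free-rider logic concrete, here is a minimal sketch in Python (the numbers are invented purely for illustration, not drawn from any real budget) of what happens to a shared resource when everyone uses it but upkeep is voluntary:

# A toy tragedy-of-the-commons simulation. Every number here is
# made up for illustration; this is a sketch, not a model of any
# real system.

USERS = 100       # people drawing on a shared resource
ROUNDS = 20       # how long we watch the commons
WEAR = 1.0        # damage each user's use does per round
RESTORE = 8.0     # how much one voluntary contribution repairs
COST = 6.0        # private cost of contributing each round

def commons_health(contribution_rate: float) -> float:
    """Health of the commons after ROUNDS rounds of use and upkeep."""
    health = 1000.0
    contributors = round(USERS * contribution_rate)
    for _ in range(ROUNDS):
        health -= USERS * WEAR            # everyone wears it down
        health += contributors * RESTORE  # only some maintain it
    return health

# Each contribution returns RESTORE / USERS = 0.08 of value to the
# contributor, far less than COST = 6.0, so free riding is the
# individually "rational" move, even though the commons collapses
# when most people make it.
for rate in (1.0, 0.5, 0.1):
    print(f"{rate:.0%} contributing -> health {commons_health(rate):,.0f}")

The exact numbers don’t matter; the structure does. As long as the individual cost of maintaining the commons exceeds the individual’s slice of the benefit, everyone’s incentive is to use and not pay, which is exactly the dynamic at work in education.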

Funding Education

The larger question, then, is “to what extent should we make education part of the infrastructure (and therefore fund it)?” Insofar as it is designed to benefit the individual, individuals can pay for it. But insofar as it is a necessary part of our culture, we need to come together and pay for it.

We already do pay for education through taxes up to the high school level. And while we do not pay for education past that level, we do subsidize it. These subsidies take the form of student loans and tax breaks, both for educational institutions and for those paying tuition.

So there is a recognition on some level that this is, effectively, a “capital” investment in our future. In other words, we don’t think that education is a commodity. It is a necessity. Educated people are good for our country, good for our culture.

But there has been a trend, probably at least since the 1980s and maybe much earlier, to move education away from the model of the not-for-profit, and toward a business model. That’s one reason that administrators are so very well compensated for their hard work.

Education As a Capital Investment

Somehow, an idea has grown up that we get educated for our own sakes. It’s some Ayn-Randian, individualistic idea that we invest in ourselves in order to make more money later. While there have always been schools like that, they’ve traditionally been trade schools.

The mission of not-for-profit education was never just to turn out workers capable of functioning in a rapidly changing world. That is, certainly, one of its goals. It is its short-term “value added.”

But education, as the pursuit of knowledge, is more than just a way to train workers for this generation. That larger mission gets lost in the corporation’s economic model; we can best think of education as basic research and development on a cultural level. There is value in every liberal arts major, no matter how much the media snickers.

The Great Recession

One of the big effects of the recent economic downturn, the Great Recession, has been a trend away from education for its own sake. That is a movement away from the tradition of education for the middle class.

A core middle class value for a hundred years or longer has been education for its own sake. The middle class, as the middle managers and specialists of society, has held up the Western ideal that knowledge is power — that, in fact, all knowledge is power.

The price of college first went up in line with its market value. It has continued to rise as it moved from being a crowning achievement to a necessary prerequisite — the ante just to get in the game.

With the cost of a now-necessary college education spiraling upward, people are reasonably concerned with their individual, short-term return on investment. And few are looking at the long-term, cultural costs of this move.

Culture Is a Myth, or If You Prefer, a “Category of Analysis”

Over the past century and a half, as anthropology developed, the field began to understand that we needed a whole lot more objectivity if we were ever really going to understand what culture is.

We’ve spent significant amounts of time trying to develop more objective approaches. And we realized, eventually, that perfect objectivity is impossible in studying humans — because we, too, are humans, and have our own culture.

It all comes back to the original, perhaps unanswerable question:

What Is Culture?

Nuer Boy
Sir E. E. Evans-Pritchard, the famous anthropologist, probably didn’t have one-tenth the idea of what it means to be Nuer as this young boy.

The first, often unspoken truth of anthropology is that other cultures exist. When we look closer at them, however, we find that they don’t! “Culture” is a word that we use to describe things. It’s a tool of analysis.

Back in the old days of anthropology, people believed that cultures were complex but relatively unchanging (and often hierarchical) sets of behavior. Maybe what individuals did changed, but that was often seen to be the effect of outside influence.

Among academics, there was an assumption that outside the West, cultures suffered from some kind of social inertia. Later researchers came to understand that it isn’t that cultures change and thereby make people change. It’s quite the other way around. People make choices; cultures don’t. Cultures are an aggregate of behavior, belief, and interaction with the world. In other words:

Culture Is What People Do

On the face of it, already we know that cultures aren’t static systems of unchanging beliefs. We know, from history as well as from observing the world around us, that cultures change.

Modernization and globalization — words that conjure images of cultures “dying out” in the face of outside impact — aren’t strange monsters that attack from the outside. They are culture itself, under the effects of choices and changing technologies.

In other words, it’s impossible, on the face of it, to accomplish the most obvious goals of anthropology. We can’t sit down and record what it means to be Diné, Indonesian, or Nuer.

Our inability, as Western academics, to write it all down in ways that are both perfectly descriptive and coherent comes from two grim realizations:

1. Culture is just a category of analysis.

Sir E. E. Evans-Pritchard (1902 – 1973)

Anthropology has a hard time even saying what exactly culture is. How can that be? Culture is what people do, and early understandings of culture approximated the things that are true about “people X” that make them different from us. (“Us,” then, was meant in the most Victorian sense possible: white, middle-class people with access to education.)

Cultures, cultural interaction, subcultures, and culture change are all ways of talking about something that is anything but static. Culture encompasses sets of behavior too complex to understand with any one approach. An ethnography can’t help but take a snapshot of a culture: a single moment, from a single perspective.

2. Culture, as a label, is polyvocal.

Wait, poly-what? Polyvocal means “many voices.” What it means here is that when we take a label and apply it to a culture, we’re really naming a bunch of related (and often conflicting!) aspects of behavior.

Canada 2010 Winter Olympics OT celebration
Is this all it means to be Canadian?
In a word, no.

It’s a hard idea, but here’s an example. Let’s say that we want to talk about Canadian culture. What does it mean to be Canadian? Beer, hockey, and politeness? Living in Canada? Can English Canadians and French Canadians both be Canadian?

All of those aspects are sometimes true. But there’s no magic formula for determining a person’s Canadian Quotient (CQ). Being Canadian, or being American, or being Nuer, is an experience of identity that goes beyond words, and is some combination of self-identification and social recognition.

In other words, culture is something that we can try to describe, try to understand, and try to make sense of. But in the end, it’s not a list of adjectives. It’s an experience.

The Good, the Bad, and the Whole

Claude Lévi-Strauss (1908 – 2009)

It struck me today how much we struggle with taking the bad with the good. This isn’t just something that you or I can change by making a decision. It’s something that’s inherent in our view of the world.

We like some things, we dislike others. That is the way of things. And that’s fine, when it’s just you or me. But when we’re looking at whole cultures, it can become something of a problem.

Binary Oppositions

One of the famous, if now outmoded, theories of culture in anthropology is Structuralism. The theory originated in linguistics, but it was the work of Claude Lévi-Strauss that brought it to the forefront of anthropology.

One of the key ideas of Structuralism is that all thought, all concepts, exist not by themselves, but as halves of pairs — a “presence” and an “absence.” These pairs are called binary oppositions. Over and over (the theory argues), we understand the world through pairs.

So, says Structuralism, we only understand good relative to evil. We only understand light relative to its absence, darkness. This is how, they argue, people (and cultures!) understand the world.

But There’s a Hitch

Okay, that’s all fair enough. Even if the theory’s not perfect, it can be a useful tool for understanding some aspects of life and culture. It might not be “the answer” to all things, but it’s an excellent observation.

While cultural theorists might throw it away as imperfect (theorists have a way of doing that), the rest of us can benefit from this observation to understand something more about the world. Sure, it’s not cutting edge; in fact, it’s rather ancient by the standards of social theory.

The rest of us still have something to learn from these binary oppositions. What we can realize is that such ideas as “good” and “bad” or even “like” and “dislike” are insanely complicated. There are physical aspects, psychological aspects, and cultural aspects all tied together in some kind of experience that we call “good.” Or “bad.”

By definition, we want more good, and less bad. If we were 19th century Scottish philosophers, we’d call that “progress.”

But when we say “more good, less bad,” we usually fail to recognize that the good and the bad are often tied together.

Trade-Offs

For instance, it’s common knowledge (among anthropologists and other social scientists) that complex societies, like ours, require social stratification. That means, if we want to keep enjoying things like, oh, technology, education, health care, mass transportation, public roads, books, the Internet, money, public safety, and a host of other little “good” things, it’s going to be impossible to win any war on poverty.

So, we can’t just say “poverty bad!” “End poverty!” actually means “End society!” We’ll just have to assume that it’s a shorthand for “make the distance between the top earners and bottom earners less gross.” And sure, that could very well be a good idea.

In fact, I’d argue, the top earners seem to have fallen into a similar trap. “Money good!” “Power good!” they seem to say. Sure, all the money and power in the world feels good. But such a concentration of “good without bad” is probably unsustainable — or more accurately, will be very, very expensive to maintain.

The point is that there are lots of things that we ‘don’t like.’ What we find, on examination, is that they’re often trade-offs for things that we like a lot. Most of us don’t like work so much that we’d do it for free, after all. But we “make the trade.”

Taking the Bad with the Good

There are a lot of things we don’t like in the world. When I was about ten, I thought it would be great if we just got rid of taxes. My father patiently explained to me all the things that taxes paid for that I used every day.

But there’s a broader application, and a deeper truth, here: we can’t get rid of the bad aspects of anything just by attacking it. We need to either get rid of the good parts too, or we need to improve the whole system.

If every day were an awesome day, we’d quickly become inured to its awesomeness and have to move on to more awesome things — or start complaining. That’s just human.

We’re people. We divide things into good and bad. That doesn’t usually mean that we get to pick and choose between them.