Category Archives: Racism

Race: Cultural Construct — And Very, Very Real

Anyone who has absorbed the content of an Anthropology 101 class, or even a general education class, knows intellectually that race is a fiction. What we don’t usually take away from these classes is the nature of “fiction.”

We think of “fiction” in the simplest, 6th-grade terms — as “made up.” But as adults, it is easier to recognize that a fiction is not the same as a lie — or, even more simplistically, “a random collection of disparate ideas.”

10,000 Monkeys

Better than 10,000 monkeys

But Shakespeare was not written by 10,000 monkeys on typewriters. Fiction is not random. In the same way, race does exist — as a part of culture.

The part of culture that we struggle with, especially our own culture, is that it is not coterminous with reality. But one thing that we know, as anthropologists, is that culture seems real. From the inside (and humans must by definition operate from the inside) we can’t tell the difference between reality and our perceptions of it.

The “crime” is not that we believe in race, but that we legislated it as if we bore the unitary truth of reality. The mistake, then, is not that our culture is “wrong” but that our expectations of culture are crazy.

A Teaching Moment

Saint Francis of Assisi by Jusepe de Ribera
Just another hot-blooded Italian? No, that’s Saint Francis of Assisi.

When I was teaching Anthropology 101, I had a student in one of my classes who felt, very strongly, that his identity as an Italian-American was all the explanation that was necessary for his hot temper. No amount of argument would suffice to show him that what he was doing was embodying and reinforcing his culture’s expectations.

The discussion in class became quite heated, as he would not budge from the belief that his experience was not only valid, but indeed intrinsic. That, of course, is the power of culture.

The power of anthropology is not that we can step outside of culture and look at it. That is, I believe, impossible. We can bend and expand our culture, stretching our mind to extrapolate the nature of culture. But we can never wholly step away from it.

Anthropology: Prescriptive or Descriptive?

Anthropology is an academic discipline, and philosophically a powerful one. It allows us, from the inside, to stretch and turn our culture so that it can look at itself.

But we also know that it is possible, once we have begun these philosophical limbering exercises, to reshape not only ourselves, but culture as a whole. Like all science, anthropology lets us begin to reshape our minds in ways that let us begin to map the world the way it is.

But this idea, that we can reshape our culture to map the world, is based on the underlying cultural belief that culture and reality are the same thing. Believing that everyone should have an understanding of the world that matches an anthropologist’s is just as self-centered as atheists (for an example in the news) who believe that everyone should see the world the way they do.

All at once, it brings anthropology to the center and cheapens it. If anthropology is itself a discipline, then it has something to offer the world as it is. We don’t need everyone to be an anthropologist any more than we need everyone to be a doctor.

Instead, anthropology might consider focusing on what these truths we learn mean when we can present them without leading the listener to be an anthropologist themselves.

And if it is impossible for us to teach without leading the listener through every step of the whole process — if we can’t bring truth back for our culture — then that would explain the challenges we face as a discipline.

The difference lies in status. If anthropologists can access cultural truths and these truths are of some use, then we must teach from a rhetorical standpoint of ethos. If our only way of teaching is teaching others how to think, then we are in the business of changing culture, not serving it.

Race is part of Western Culture, and denying its cultural validity is an uphill battle. Instead, we must reach out to those who legislate and make decisions that cross cultural groups. These are the people who need to understand that everything they “know” about “the other” is from only one perspective.


It’s Beginning to Look a Lot like Christmas

The War on Christmas is a shorthand for one front in the West’s “culture wars,” which address a number of cultural bones of contention. The central question of this war could be framed “Is the West primarily Christian?”

If we think about Western Culture as the thread of belief and knowledge that has come down to us through history, stretching from the Greeks and Romans (and the Ancient Egyptians, though they got short shrift in my middle school history books, and probably yours), then the West is Christian.

Historically, the Christians fought a battle for political supremacy in the first handful of centuries CE. They took over the Roman Empire and set the religious and scholarly tone for our culture. In short, they won.

The victory of the Roman Christians for political influence meant that power resided either in the hands of the Church (or one of the divisions that came after the Protestant Reformation), or in the hands of people at least nominally beholden to the Church. That lasted, in many ways, until the end of the European colonial era.

The West isn’t the West Anymore

Since the end of World War 2, however, something else — something critical — happened to the West. It stopped being the home only of Westerners.

Starting with the Age of Discovery, which began around 1492, there was increased trade and influence passing back and forth between the West and the rest of the world. For Europeans, the world got a lot “larger” as everyday people came into contact with thoughts, ideas, and people from around the globe.

The European colonial era came to a dwindling end around the close of World War 2. What arose in its place was a postmodern world, where the flow of people and ideas became more and more rapid, and where people of non-European descent could “take their place at the table.”

Human Rights Are for (All) Humans

Sure, there were many non-Westerners in America before the mid 1900s. But those groups lacked many fundamental rights. They were not, legally or culturally, seen as equals. They lacked many of the freedoms enjoyed by nominal Westerners.

For instance, in many cases, non-Westerners in America could not vote, have full citizenship, own land, or marry as they chose. More important for consideration of Christmas, their religions were not recognized.

However, since the second half of the 20th century, a new spirit of religious freedom has enveloped America. It’s an era when the ideas of human rights, which had historically been only applied to white Westerners, were applied more evenly. Non-Western, and non-mainstream Western, populations now have an expectation that the separation of Church and State means something.

The War on Christmas

So it comes down to this: the “War on Christmas” is actually a push to make universal human rights, well, universal. It isn’t a silly argument, though. It’s a full-blown struggle to define “the West.”

Is Western culture just the Christian descendants of an ancient line of thought? Or is it more beholden to these new ideas of equal freedoms for all?

Star Wars Isn’t Racist

Today I went to the San Jose Tech Museum, where I saw the Star Wars exhibit on the last stop of its tour. While it was a very cool exhibit, one thing that I (anthropologically trained) really noticed was the treatment of the “other” in Star Wars, the mythology of my generation.

Science Fiction or Fantasy?

Storm Troopers on Patrol
Storm Troopers on patrol at the Tech Museum of Innovation — San Jose, CA

To understand the impact of Star Wars on our impressionable young minds, we first need to dispense with the idea that the franchise is science fiction at all. Sure, it has laser blasters, robots, radios, and space ships, but it’s really fantasy (…in space!).

How do we know it’s fantasy? Because against the fairly dystopian cyberpunk background, we’ve got good guys in white, bad guys in black, glowing swords, and magic powers blooming left and right. And let me say this clearly: if we have good and evil wizards battling it out in the distant past, it’s not some vision of our future.

Science fiction, as a genre, is one of the ways that our culture tries to make sense of rapid cultural and technological change. We’ve been experiencing this change since the beginning of the scientific era and industrial revolution. But that’s not Star Wars.

Star Wars is about our colonial past, the nature of good and evil, the proper role of mysticism, and the necessity of righteous rebellion against tyranny. In other words, it’s not a projection of our future, but a mythic retelling of our own past.

The Mythic Past as a Window to Our Own Past

The Millennium Falcon
The coolest cat in the universe clearly needs the coolest car…spaceship…whatever.

Back in the 70s, when science was “gonna change the world,” Star Wars gave us a chance to see something that was quite the opposite of the Humanist and spiritually sterile Star Trek. But the differences don’t end there. While Star Trek treats all aliens as foreigners with their own political interests and idiosyncrasies, Star Wars treats them as archetypes of our own culture.

In other words, George Lucas might not be racist at all. But he’s betting the bank that we are. And, historically speaking, that’s a safe bet to make.

Only a hundred years ago, the British Empire ruled a big chunk of the world, “race” was a dominant political-economic creed, and everyone “knew” that religion was true and culture was an a priori category. We’ve come a long way in changing our views about matters of race, religion, and gender. Heck, we’ve come a long way since the 70s.

Defining the Problem

Tusken Raider
“Extremely territorial and xenophobic, Tusken Raiders will attack with very little provocation.” Little provocation after the humans took their planet, that is…

We live in a culture that would be unrecognizable to the people of a century ago. But that doesn’t mean the change is over. We still understand these archetypes, use them in our stories, and to a certain extent keep them as part of our culture.

Just as science fiction plays with the cultural change wrought by technology, Star Wars as science fantasy deals with our own changing culture. The aliens aren’t really meant to be aliens, just people in funny suits with motivations that we can quickly apprehend and use to drive the plot forward.

You know, archetypes. Stereotypes. We can get all jumpy about the way Star Wars uses these cultural shorthands to paint a quick, recognizable picture. But the problem doesn’t lie with the authors. The issue isn’t that Star Wars is racist. The problem is that we are.

Culture Is a Myth, or If You Prefer, a “Category of Analysis”

Over the past century and a half, as anthropology developed, the field began to understand that we needed a whole lot more objectivity if we were ever really going to understand what culture is.

We’ve spent significant amounts of time trying to develop more objective approaches. And we realized, eventually, that perfect objectivity is impossible in studying humans — because we, too, are humans, and have our own culture.

It all comes back to the original, perhaps unanswerable question:

What Is Culture?

Nuer Boy
Sir E. E. Evans-Pritchard, the famous anthropologist, probably didn’t have one-tenth the sense of what it means to be Nuer that this young boy did.

The first, often unspoken truth of anthropology is that other cultures exist. When we look closer at them, however, we find that they don’t! “Culture” is a word that we use to describe things. It’s a tool of analysis.

Back in the old days of anthropology, people believed that cultures were complex but relatively unchanging (and often hierarchical) sets of behavior. Maybe what individuals did changed, but that was often seen to be the effect of outside influence.

Among academics, there was an assumption that outside the West, cultures suffered from some kind of social inertia. Later researchers came to understand that it isn’t cultures that change, thus making people change. It’s quite the other way around. People make choices; cultures don’t. Cultures are an aggregate of behavior, belief, and interaction with the world. In other words:

Culture Is What People Do

On the face of it, already we know that cultures aren’t static systems of unchanging beliefs. We know, from history as well as from observing the world around us, that cultures change.

Modernization and globalization — words that conjure images of cultures “dying out” in the face of outside impact — aren’t strange monsters that attack from the outside. They are culture itself, under the effects of choices and changing technologies.

In other words it’s impossible, on the face of it, to accomplish the most obvious goals of anthropology. We can’t sit down and record what it means to be Diné, Indonesian, or Nuer.

Our lack of ability, as Western academics, to write it all down in ways that are both perfectly descriptive and coherent, comes from two grim realizations:

1. Culture is just a category of analysis.

Sir E. E. Evans-Pritchard (1902 – 1973)

Anthropology has a hard time even saying what exactly culture is. How can that be? Culture is what people do, and early understandings of culture approximated the things that are true about “people X” that make them different from us. (“Us,” then, was meant in the most Victorian sense possible: white, middle-class people with access to education.)

Cultures, cultural interaction, subcultures, and culture change are all ways of talking about something that is anything but static. Culture encompasses sets of behavior too complex to understand with any one approach. An ethnography can’t help but take a snapshot of a culture: a single moment, and from a single perspective.

2. Culture, as a label, is polyvocal.

Wait, poly-what? Polyvocal means “many voices.” What it means here is that when we take a label and apply it to a culture, we’re really naming a bunch of related (and often conflicting!) aspects of behavior.

Canada 2010 Winter Olympics OT celebration
Is this all it means to be Canadian?
In a word, no.

It’s a hard idea, but here’s an example. Let’s say that we want to talk about Canadian culture. What does it mean to be Canadian? Beer, hockey, and politeness? Living in Canada? Can English Canadians and French Canadians both be Canadian?

All of those aspects are sometimes true. But there’s no magic formula to determining a person’s Canadian Quotient (CQ). Being Canadian, or being American, or being Nuer, is an experience of identity that goes beyond words, and is some combination of self-identification and social recognition.

In other words, culture is something that we can try to describe, try to understand, and try to make sense of. But in the end, it’s not a list of adjectives. It’s an experience.

Belief, Knowledge, and Culture

Carl Spitzweg's The Alchemist
Why are those who seek knowledge such romantic figures?

I’d like to return briefly to the topic of science, and explore further the difference between “science” as in “what researchers do” and Science (with a capital “S”) as in “what researchers have told us.” The first is a way that we learn, gather data, and test it. The second is the collective wisdom of a certain part of our culture.

What I’m saying is that science is two different things in culture. There’s “science is what scientists do” and “science is what scientists say.” The first category helps build a trove of knowledge that is like nothing the world has seen before. The second category is built from the first, but it has a level of cultural interpretation built into it.

An Example

Science, unlike some other forms of knowledge, isn’t designed to tell us how to guide our lives. For example, one might argue from a genetic perspective that the whole purpose of life is based around perpetuating one’s own genes effectively. That makes sense, right? It’s a useful tool of analysis.

However, living with genetics as the sole guiding principle of one’s life misses two things:

  1. everyone kinda knows this already — it’s why there’s so much sex on TV, and
  2. people are not rational, culture-less beings who live their lives according to decisions they make.

Remember, we’re primates, right? Anyone who truly and wholly tried to live by these genetic assumptions, while holding onto Western notions of culture, ethnicity, and the individual, would be setting themselves up to emulate racist, sex-obsessed sociopaths.

While science might analyze our actions based on genetic closeness between individuals, that doesn’t mean we should use such data prescriptively, to guide our decisions.

But Scientists Say…

Most people in Western culture are not scientists. According to InsideHigherEd, only 17% of college graduates earned their bachelor’s in STEM topics — and that’s just among the portion of the population that attended college at all. For graduate degrees, the number drops to 13%.

And even if we look at the whole group of graduates as “scientists” (and, admittedly, if we only regard STEM topics as “science” — a debatable question right there), it’s hardly a majority of Americans. Most people are just getting by, not engaging in cosmological investigations at any level.

Even more, since scientists are usually specialists in one field or another, a whole lot of scientific knowledge is being propounded by people who have not done the research for themselves. Their belief rests on their acceptance of science’s authority. We don’t test gravity; we take their word for it.

Science: Rhetoric and Dialectic

In other words, most science is taught, used, and discussed outside of research circles; it’s not being shared by field research specialists. That transforms it from science — a way of learning new things and exploring the world — to Science — a way of viewing the world.

Is this a problem? Does that make the proponents of Science somehow wrong? The short answer to that question is “no.”

The longer answer isn’t “yes.” It’s closer to, “oh, now I can see the influence of Western culture on Science.”

Why? Because as soon as scientific knowledge moves from the realm of dialectic (discussion between scientists) to the realm of rhetoric (discussion between scientists and non-scientists), it lacks the surety that comes from the receiver being able to test the assumptions for themselves.

The rhetorical discussion of scientists with non-scientists is unlike the dialectical one between scientists. The point of rhetorical discussion is winning and strengthening social status, rather than synthesizing a more complete truth. Or to put it more simply:

Participants in a dialectical argument are, on a deeper level, working together, while those engaged in rhetorical debate are much more likely to be working at cross purposes.

But to come back to the key point, the scientific method, and associated research activities — science with a small “s” — is (or should be) distinct from Science with a capital “S.”

The corpus of all the knowledge that humanity has learned, mixed in with a number of cultural beliefs that are accepted on Scientific authority, is what we call Science. When we add the big “S,” Science reflects our cultural values.

It is Science, not science, that argues there is no God, or that research will solve all the world’s ills, or that drinking eight glasses of water a day will protect you from harm.

The Promise — and Cost — of a College Education

Higher education has changed shape in the past fifty or so years, and the rate of change only seems to be increasing. To paint with a broad brush, we’re moving from a liberal-arts curriculum to something that looks a lot more like vocational training.

There was a time, not so long ago, that a college education was the ticket into a middle-class life, and this is a promise that many people still believe. It is the carrot held out to get students, young and old, to devote years of their lives to study. But at the same time, as the cost of education has increased, and more and more of the U.S. population takes advantage of the educational system, the focus of college seems to be changing.

College as an Investment

The cost of a four-year education is currently north of $150,000, and the savvy shopper is asking about the return on investment for such an expenditure. On the flip side, many careers that used to be vocational now require a 4-year degree just to get in the door. More and more people are going into debt just to get a toe into the job market.

Lightmatter chimp thinker
Is it the power of education, or the prestige of graduating?

While a college education is often required for a career (or even just to get a job), the pay scales have not adjusted for the increased cost of getting in. And more than ever, what we choose to study is looked at not just as a skill set, but as a label, a defining part of who we are.

College has moved more and more toward vocational training. That’s not the colleges’ choice; it’s been driven by market pressures — both the job market itself and the increasing cost of education. Education, as education and not vocational training, is out of favor with many people. Yet education for its own sake is a very middle-class value. The change in education is just one of many examples of how the middle class is shrinking — not just financially, but culturally as well.

Even when schools are touting their ability to teach us how to think effectively, that’s not what many employers are looking for. In the new era of computer-assisted human resources and quick online application for jobs, having exactly the right credential can get your resume looked at.

Social Class and Education are Linked but Not the Same

The three-component theory of stratification, also known as Weberian Stratification (Max Weber, 1864–1920), says that stratification can be based on wealth, prestige, and power. Educational systems have traditionally been used to grant certain types of prestige. That’s why we call people with Ph.D.s “Doctor” — it’s a way of recognizing their status.

Traditionally, the education system in the U.S. (and elsewhere) has interacted heavily with other prestige systems, including those associated with ascribed status. Or, to put it more bluntly, people were more likely to go further in education if they were rich, white, and male.

That’s changed in recent years, with many previously under-served populations gaining access to college-level education. Problem solved, right? Three troubles have crept up on us in this period of transition.

1) It Was Never Just Education

The first is that while access to higher education has gone up, the value of education traditionally came from its synergy with ascribed statuses. Despite greater and greater access to higher ed, the U.S. still struggles with classism, racism, and sexism.

While our image of the college graduate is of someone with access to the halls of power, educational prestige alone has never been enough to counterbalance the other (less savory) cultural aspects that open doors. The promise of a college education has been overstated precisely because we’re often unwilling to admit the role that social networks play in short-term job applications and long-term career growth.

2) More Graduates, Fewer Careers

While the number of people who, on paper, “should” have access to the middle class is growing, the actual size of the middle class has been shrinking. That means that more and more people are spending their money to get a chance at a middle-class life, but the number of “winners” is going down. Oddly, the cost of education has not (yet) fallen to mirror this glut in the market for college-educated people.

The continued rise in educational cost is partly because it’s “the only game in town.” We have to pay to play this lottery, because there are few routes to the middle class outside of jumping through the right hoops. Sure, maybe the rare person can skip a few steps through native talent, but that’s not the way to bet.

3) Prestige as a Unified System

In our rush to remove racism and sexism from America’s mind, we’ve managed to deeply entrench classism as the defining prestige system — and further, to define class more and more through wealth and access to resources.

This trend (toward using money as a more important measure of a person’s social worth) has led to a deeper institutionalization of class. At the same time, social mobility in America doesn’t live up to our myths about it. In fact, some have argued recently that the new upper class, the 1%, have pulled the ladder up behind themselves.

We sometimes seem to have lost the other prestige systems that used to counterbalance the assumption that “wealth = prestige.” In other words, we need to remember the value of education isn’t only measured in cash at the end of the day. It’s measured in good decisions that benefit the country as a whole.

When we undervalue parallel systems of prestige and rely on money as the only yardstick of status — as exemplified by the idea that the value of an education can be measured by the paycheck it leads to — we’re the ones who end up paying the cost. And that cost is far more than $150,000.

Anthropology: Veni, Vidi, Intellexi

Anthropology is the study of human cultures, but it’s also the study of the “other.”  The “other” used to be pretty hard to find, when most people tried to live within homogeneous groups and travel was dangerous and expensive. Seeking out the “other,” people of other cultures, was something that few people did. But that isn’t nearly as true as it used to be. Now, we see difference just by walking down the street, turning on the TV, or going to work and school.

Seeking the “Other”

Sir James Abbott by B. Baldwin
General Sir James Abbott dressed as an Indian noble

Anthropology was born of the colonial era. The people of Europe, and of European descent, ruled much of the world in far-flung empires. It was more than a cute saying that “the sun never sets on the British Empire.” With these extensive empires came colonial rule, and colonial rule required an understanding of the people who were being ruled.

For much of the colonial era, Christianity and racism shaped government views and policies regarding the people being ruled. But with the rise of science as a way of understanding the world, and Christianity’s loss of complete authority, science had to develop ways of addressing the questions that plagued colonial decision making.

While political science tries to answer such questions as “what is the relationship between the ruler and the ruled?”, anthropology seeks to answer more basic questions, such as “what is the nature of those who belong to other cultures?” We can now say with certainty that culture rules these differences. Yet such knowledge was not always self-evident; it is at least partly a product of the scientific work of anthropology.

Anthropology’s Relevance

Abolition de l'esclavage (1849)
Proclamation of the Abolition of Slavery in the French Colonies, 23rd April 1848

In order to understand the value of anthropology, we need to take a long, hard look at our own culture and its history. We might wave off such ideas as racism and eugenics as old-fashioned, out-of-date, and downright ignorant and evil. But this ignores a couple of basic facts.

First, the history of racism in Western thought goes back pretty far, and is planted pretty deeply. We might be trying to tear it out by the roots, but those roots are intertwined with other ideas. We might note that “race” is tied to social stratification, but we still enjoy a movie where the “noble savage” works alongside the good guy in a lesser role. But he’s a good guy, so that’s okay. Right, Tonto?

Second, many aspects of Western culture are still racist. It is true that, within academia, racist ideas have lost their shine. In fact, being called a racist is pretty damning. But when we talk about the everyday knowledge of Western culture, then we can see that racism is not yet vanquished. It is possible to argue that it never will be, and that racism itself is a part of Western culture — a temptation to simplify the world that must be struggled against.

The Only Game in Town

Florida Governor Rick Scott (official portrait)
Florida Governor Rick Scott says anthropology is a waste of tax dollars.

To paint anthropology’s detractors with a broad brush (and I recognize that anthropology maybe should be very careful doing that), many of the critiques today come from people who are unwilling to look for meaning and value outside their own culture. Anthropology studies something that many Americans don’t really believe exists — other cultures.

Sometimes it seems like studying other cultures is seen as just one step away from studying parapsychology. Maybe people think they saw the “other” once, and it makes a cool story. But it’s not something for polite conversation.

But at the same time, we’re buffeted on a daily basis by the “other.” Every day we deal with the “other” and, in a democracy, make political decisions and commentary on how we should deal with it.

Today, people from other cultures have significant, daily influence on our lives. The West engages in wars around the world, invests in businesses in far-flung regions, and imports goods made by other cultures who understand our needs all too well.

Further, we live side-by-side with the “other.” Anthropology is no longer a science devoted to exotic places and ways of thought. It is much more basic: a way of understanding our culture’s place in the world. This knowledge isn’t just relevant — it’s critical. It’s a big world out there.