The Collapse of Western Civilization

Primate with globe
The world is a bigger and more diverse place than Western Civilization ever imagined.

We can joke that this event or that is a herald of the collapse of Western Civilization, but from an outside perspective, it collapsed some time ago. The “end” of Western Civ happened more or less with the end of World War II and the following dismantling of the overt colonialism that had in many ways supported Europe and America since the 1490s.

It was not a loss of power that made Western Culture collapse. It was actually the rise of near-instant communication. It was that we could no longer live in an insulated society, knowing with devout certainty that we had the capital-“T” Truth.

One of the hallmarks of Western Civilization has been its belief that there is “one true way” and that it is, and has always been, best reflected by our own culture. In other words, such recent ideas as multiculturalism, ecumenical religion, and racial harmony fly directly in the face of the traditional Western lessons passed on from generation to generation.

Science and Authority

We discovered, in science, a relatively unbiased way of getting at truth. Now, we could argue that science isn’t perfect in this, but compared to more authoritarian structures of knowing (including anything heavily influenced, philosophically, by Neoplatonism or monotheism), it’s downright awesome.

Unlike such single-authority philosophies, science can and does tell us things we don’t want to hear. Science, at its best, teaches us how to be authorities, not just how to submit to them. Of course, that means we can get on the wrong track sometimes. Mistakes happen. When we take science and use it for questionable ends (take eugenics, for example), the scientific method eventually gets us back on track.

But Science Isn’t Culture

What ended Western Civilization was a combination of several factors:

  • a belief that there is one truth for all people, times, and places,
  • that we have an obligation to seek out that truth, and
  • that (most key!) our culture should be coterminous with that truth.

When all these things ran up against the stupendous variety of the world, it shook Western Civilization to its foundations. Western culture still exists and even flourishes, but it would be more accurate to think of it as Western “cultures.” We’ve replaced our assumption of a hierarchical system with multiple systems that cooperate and compete.

We have moved our culture closer to science’s view of the world, and decided that if the world is complex and open-ended, then our culture must be, too. This is an experiment probably just as grand as the one called “democracy.” I look forward to watching it as it unfolds.

The Promise — and Cost — of a College Education

Higher education has changed shape in the past fifty or so years, and the rate of change only seems to be increasing. To paint with a broad brush, we’re moving from a liberal-arts curriculum to something that looks a lot more like vocational training.

There was a time, not so long ago, when a college education was the ticket into a middle-class life, and this is a promise that many people still believe. It is the carrot held out to get students, young and old, to devote years of their lives to study. But at the same time, as the cost of education has increased, and more and more of the U.S. population takes advantage of the educational system, the focus of college seems to be changing.

College as an Investment

The cost of a four-year education is currently north of $150,000, and the savvy shopper is asking about the return on investment for such an expenditure. On the flip side, many careers that used to be vocational now require a four-year degree just to get in the door. More and more people are going into debt just to get a toe into the job market.
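To make the “return on investment” framing concrete, here is a minimal back-of-the-envelope sketch. The $150,000 sticker price is the figure above; the wage premium and foregone earnings are purely hypothetical numbers for illustration, not data.

    # A rough sketch of the payback-period arithmetic behind "college as an investment."
    # The $150,000 figure comes from the post; the wage premium and foregone earnings
    # below are hypothetical placeholders, not real data.

    def payback_years(tuition, annual_wage_premium, foregone_earnings=0.0):
        """Years of the assumed wage premium needed to recoup the up-front cost."""
        total_cost = tuition + foregone_earnings
        return total_cost / annual_wage_premium

    # Example: $150k in costs, a hypothetical $12k/year earnings bump over a
    # non-degree job, plus $60k in (hypothetical) earnings foregone while studying.
    print(payback_years(150_000, 12_000, foregone_earnings=60_000))  # 17.5 years

Under these made-up numbers, the degree takes well over a decade to pay for itself, which is exactly the kind of arithmetic the savvy shopper is doing.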

Lightmatter chimp thinker
Is it the power of education, or the prestige of graduating?

While a college education is often required for a career (or even just to get a job), the pay scales have not adjusted for the increased cost of getting in. And more than ever, what we choose to study is looked at not just as a skill set, but as a label, a defining part of who we are.

College has moved more and more toward vocational training. That’s not the colleges’ choice; it’s been driven by market pressures — both the job market itself and the increasing cost of education. Education, as education and not vocational training, is out of favor with many people. Yet education for its own sake is a very middle-class value. The change in education is just one of many examples of how the middle class is shrinking — not just financially, but culturally as well.

Even when schools are touting their ability to teach us how to think effectively, that’s not what many employers are looking for. In the new era of computer-assisted human resources and quick online job applications, having exactly the right credential can get your resume looked at.

Social Class and Education are Linked but Not the Same

The three-component theory of stratification, also known as Weberian Stratification (Max Weber 1864 – 1920), says that stratification can be based on wealth, prestige, and power. Educational systems have traditionally been used to grant certain types of prestige. That’s why we call people with Ph.D.s “Doctor” — it’s a way of recognizing their status.

Traditionally, the education system in the U.S. (and elsewhere) has interacted heavily with other prestige systems, including those associated with ascribed status. Or, to put it more bluntly, people were more likely to go further in education if they were rich, white, and male.

That’s changed in recent years, with many previously under-served populations gaining access to college-level education. Problem solved, right? Three troubles have crept up on us in this period of transition.

1) It Was Never Just Education

The first is that while access to higher education has gone up, its value traditionally came from its synergy with ascribed statuses. But despite greater and greater access to higher ed, the U.S. still struggles with classism, racism, and sexism.

While our image of the college graduate is of someone with access to the halls of power, educational prestige alone has never been enough to counterbalance the other (less savory) cultural aspects that open doors. The promise of a college education has been overstated precisely because we’re often unwilling to admit the role that social networks play in short-term job applications and long-term career growth.

2) More Graduates, Fewer Careers

While the number of people who, on paper, “should” have access to the middle class is growing, the actual size of the middle class has been shrinking. That means that more and more people are spending their money to get a chance at a middle-class life, but the number of “winners” is going down. Oddly, the cost of education has not (yet) fallen to mirror this glut in the market for college-educated people.

The continued rise in educational cost is partly because it’s “the only game in town.” We have to pay to play this lottery, because there are few routes to the middle class outside of jumping through the right hoops. Sure, maybe the rare person can skip a few steps through native talent, but that’s not the way to bet.

3) Prestige as a Unified System

In our rush to remove racism and sexism from America’s mind, we’ve managed to deeply institute classism as the defining prestige system — and further, to define class more and more through wealth and access to resources.

This trend (toward using money as a more important measure of a person’s social worth) has led to a deeper institutionalization of class. At the same time, social mobility in America doesn’t live up to our myths about it. In fact, some have argued recently that the new upper class, the 1%, have pulled the ladder up behind themselves.

We sometimes seem to have lost the other prestige systems that used to counterbalance the assumption that “wealth = prestige.” In other words, we need to remember that the value of education isn’t only measured in cash at the end of the day. It’s measured in good decisions that benefit the country as a whole.

When we undervalue parallel systems of prestige and rely on money as the only yardstick of status — as exemplified by the idea that the value of an education can be measured by the paycheck it leads to — we’re the ones who end up paying the cost. And that cost is far more than $150,000.

The Appeal of the Other

I’d like to look at three possible explanations for why the “other” is so appealing to some people. There are more possible explanations, but I’m going to focus on just these.

The Biological Reductionist

Gene Simmons
This Is Not the “Gene” You Were Looking For

The biological explanation is so simple, it’s cheesy. The argument is that people are interested in other “cultures” because of a biological, genetic drive to seek out, and breed with, populations that are very dissimilar from their own.

Some say the offspring generated by breeding between genetically dissimilar groups are on average more resilient. Additionally, they may avoid genetic problems endemic to one group or the other.

While arguably true (or partly true) in some cases, this is simple biological reductionism. To claim that world travel is driven by the perceived advantage of genetic diversity is based on the underlying assumption that we’re pre-programmed for behaviors that increase the viability and competitiveness of our offspring. It further comes from the idea that we’re controlled by biological drives.

On the one hand, maybe there’s something to that. But on the other hand, the world is a complicated place, and people are complicated as well. This kind of biological reductionism was made popular by the book The Selfish Gene (1976). Like Darwin’s Origin of Species, the ideas of the book have moved from science into popular thought, and are sometimes used to perpetuate a much more cultural selfishness.

In plain words, in the hands of anyone but a geneticist, the idea of the “selfish” gene is probably a justification rather than a reason.

The Social Scientist

Cosmopolitan
Social scientists are a cosmopolitan crowd.

Today, most people live in hierarchical societies. In fact, if you’re reading this, then you have access to the Internet, and I can guess with near certainty that you live in one, too. In these societies, not everyone gets to be on top, or even comfortable where they are. Sometimes, those who travel out to other cultures might be looking for places they’ll fit better.

In order to understand this one, we need to think about how population diversity interacts with cultural diversity. Imagine you’ve got a talent for something — say numbers, for example. You could stay in your hometown in Nowhere, Illinois and be the local math teacher. Or you might move to New York City and work as a business analyst on Wall Street. One way, you’ll have status as a teacher; the other way, you’ll have access to more resources — you’ll almost certainly get paid more.

In this era of instant communication and near-instant travel, there’s not much difference between moving to New York and moving to Tokyo or Seoul — except for the paperwork and the language. Especially in the American middle class, moving for work is an accepted tradition, and moving around the world doesn’t have the impact it once did. Maybe we just seek out the “other” in a search for opportunities that fit our skills.

The Cultural Aficionado

Palestinian Cultural Mural Honoring Dr. Edward Said
Sometimes, what we see in other cultures has more to do with our own desires and wants. Thank you, Dr. Said, for writing Orientalism (1978).

There’s a third possible reason for striking out to explore the “other.” This one is much less “deep” than the two previous explanations. It could be as simple as believing, as part of your own culture, that certain other cultures are “cool.” Travel could be a way of rising in status in your own culture, too.

Someone’s motivation to know the “other” could be as simple as their own culture putting a value on certain other cultures. In the past, Korean scholars and monks would go to Imperial China to study, since the education they found there raised their status back home.

Closer to home, I studied Japan in the ’80s, when its economic power made it of interest to the U.S. It never even occurred to me at the time to ask why I was interested. But it wasn’t just me. This is when we started importing such household words as katana, ninja, and sushi.

There is a similar interest today in both China and Korea, as they come to wield greater economic power. Ideas associated with East Asian culture happen to be hot property at the moment, and that probably has a lot to do with the region’s growing influence. Have you ever watched Boys Over Flowers? Been to an acupuncturist? Heck, have you ever played Dynasty Warriors? Or watched Mulan?

I know that many of my readers have experience living in a number of countries, so please comment below with your responses, explanations, and amusing anecdotes!

Rationality Isn’t Rational

Chimpanzee - Leipzig zoo
“Hey, who’re you calling irrational?”

As I’ve discussed before in this blog, our Western image of humans as rational is very much culturally bound. It’s strange, really, that calling someone “irrational” is such a slur. Logic itself, held so widely as a source of strength, shudders under the weight of our everyday lives. Despite our assumptions, values, and best intentions, we act irrationally (without reason) much of the time in daily life.

Why do we say, “Bless you” when someone sneezes? Is the devil really going to fly up their nose? Why do we shake hands? Are we really making certain they’re unarmed?

Are these really “rational” behaviors? No, they’re social rituals, cultural artifacts that don’t make “rational” sense, but have meaning nonetheless. Linguistically, they have pragmatic meaning. Culturally, they have ritual meaning.

When we’re doing what we think are rational things, it’s not usually rationality or logic that is informing our actions. It’s culture. Culture is filled with little rituals that make all the difference in terms of social interaction.

Don’t get me wrong; just because something’s “ritual,” that doesn’t make it “untrue.” Rituals — in an anthropological sense — are social conventions, and they allow us to interact by giving us patterns of action that are both predictable and easy to routinize. Imagine driving on the road if we didn’t all agree that we should drive on the right (or left!). It would work, but it would be inefficient and dangerous.

When we say that someone’s acting “irrationally,” it doesn’t usually mean “illogically” so much as “not in a way that we expect.” That doesn’t make it any less strange, but we might as well call it what it is. “Irrationality” isn’t a logical judgement — it’s a social judgement. We’re placing someone’s behavior against a matrix of cultural expectations and finding it wanting.

In other words, we’re not using “logic” as a process of thinking. We’re using it as a social value. Can logic and rationality really be a dispassionate process and a cultural value at the same time? That’s a tough question, and the answer is probably “no.”

Logic — What It Is, and What It Isn’t

Vulcan CosPlayer
Logic does not transcend cultures.
(CosPlay is another matter.)

If logic were truly a purely rational way of thinking, then it would be spectacularly powerful, and people would study philosophy as avidly and pragmatically as they study computer science or accounting.

Don’t laugh! There was a day, not too long ago, when this was true. When we lived our lives (mostly) in one culture, the ability to think “logically” was much more useful. When our everyday world was less “messy” and overarching values were hierarchically determined, “logic” had more meaning.

The problem with logic is the same problem that exists for all modes of thought. Logic is subject to the GIGO (garbage in, garbage out) principle. Even if we think logically about things, many of our basic perceptions of the world are based on irrational assumptions and basic drives.

  • I might be rational about buying a car, but that doesn’t mean I’ve rationally examined my need for one.
  • And even if I do, that doesn’t mean I’ve rationally examined all aspects of its impact.
  • And if I decide I don’t need one, then I’ve decided that the environment is more important than any benefit I might derive.

Logic, by itself, doesn’t get us much of anywhere. What’s worse, anthropology — which allows us to look at the world cross-culturally — tells us that pretty much everything we think about is culturally bound.

Contrary to popular belief, logic and rationality don’t transcend culture. Because our assumptions about the world are culturally bound, anything we apply logic to is necessarily bound by these same constraints.

That doesn’t make us helpless, and it doesn’t make logic useless. We can work against our constraints and try to get beyond them. Logic is incredibly useful for making certain that our thoughts all agree with one another. Logic and rationality are powerful tools.

Rationality Isn’t Rational (But It’s Still Pragmatic)

If our cultural beliefs about logic and rationality aren’t everything they’re cracked up to be, then what’s the point of studying? Just about every discipline is about learning to think clearly about certain matters.

Mathematicians learn to think clearly about numbers and data. Chemists learn how atoms, molecules, and compounds act and interact. Lawyers learn to think about law and precedent. Anthropologists learn to think clearly about culture.

Die Chemiker
Understanding the whole of the world is too much for any one person.

Every field of study trains the mind to think in new ways. Being a chemist isn’t a matter of memorizing the periodic table and then going on to learn more and more details of the chemical world through rote memorization. It’s about learning whole new ways of thinking — of forcing neurons to line up and march together to get certain results.

These areas of study are called “disciplines” for a reason. That’s what they are: disciplined ways of thinking about certain aspects of the world.

We might think marketing is crazy, but marketers think far more rationally about the buying and selling habits of people than mathematicians ever will — or they’d be very bad at their jobs. The behaviors that marketers study and try to modify aren’t rational, but that’s a different matter.

People in general aren’t rational about buying and selling. But the people who are, we call “investors.” Investing is a discipline with its own rules and assumptions about the world, with its own rationality. Those who study what investors do are called “economists.”

What we can learn from this is that there is no one “rationality” — rationality isn’t just one thing. The world is far too large and diverse for any one of us to know things with perfect rationality, but that doesn’t mean we shouldn’t study it.

It does, however, mean that we should be wary of making purely “rational” decisions. And doubly wary when we’re convinced that we’ve managed to nail down even a piece of the truth.

Happy Father’s Day

"I pappas famn" by Severin Nilson
Happy Father’s Day, everyone!

For human primates, the two instinctive markers of status are age and gender. Even when humans lived in bands, or small kinship groups, and didn’t have any role specialization, they still took into account age and gender as markers of status.

For an anthropologist, Father’s Day is a special day, since it is when our own culture recognizes the very people whose status is marked specifically by age and gender. Kinship is a powerful organizational model, and one that we often undervalue in our own cultural myths.

We amuse ourselves by telling stories about the harried father, the rebellious child, and so forth. But today is the day that we recognize the role’s importance instead of challenging it. We take a moment to focus not on the drama, but on the years of (often dreary) hard work.

Let’s not forget that kinship is an organizing principle for all humans. Be sure to wish that someone special a “Happy Father’s Day!”

Facebook, Meetings, and Social Grooming

The Nomads - from Simple Life
Social relations and social capital need to be maintained.

How many people can you connect with? According to the British anthropologist Robin Dunbar, the total number of people you can maintain stable social relationships with varies from about 100 to 230.

We might brag that we have 400, 500, or even 1,000 Facebook “friends,” but as we go past that magic number (usually cited as around 150), the quality of connections necessarily degrades. When we get into these higher numbers, we’ve changed the meaning of the word “friend.”

Friendship Has Limits

Dunbar suggests that this cognitive limit is based on the size of the human neocortex. This limit places a boundary on the basic hunter-gatherer unit that has been the backbone of social relations for most of our time since we became human — the band.

Hunter-gatherer's camp at Irish National Heritage Park
Humans didn’t always live in cities of millions.

Bands were, and are, usually grouped at about 30-50 individuals, though they might range higher in situations when people need to work together, such as in periods of high environmental pressure. These groups were (and are) most often linked and organized through kinship relations.

Neolithic farming villages were more likely than bands to bump up against the limit of Dunbar’s number. As it turns out, groups of 150 people need to spend a fair amount of time working on just staying together. Those groups that expand beyond these numbers seem to break apart under their own social weight. For those looking to cross that line, the keywords are “hierarchy” and “delegation.”

Robin Dunbar, who proposed the number, speculated that in a group of about 150, approximately 42% of the group’s time would need to be spent on social activities that would keep it bonded. The higher the number, the more time needs to be spent maintaining group cohesion. That makes sense, right?
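As a rough illustration of how that overhead might scale, here is a minimal sketch. The only figure taken from the text is Dunbar’s estimate of roughly 42% at a group size of 150; the linear scaling below is an illustrative assumption, not Dunbar’s actual model.

    # A minimal sketch of how social-maintenance time might grow with group size.
    # Anchor from the text: ~42% of time at a group of 150 (Dunbar's estimate).
    # The linear scaling is an illustrative assumption, not Dunbar's regression.

    DUNBAR_GROUP = 150
    DUNBAR_SOCIAL_FRACTION = 0.42

    def social_time_fraction(group_size):
        """Fraction of time spent on bonding, assuming it grows linearly with group size."""
        return DUNBAR_SOCIAL_FRACTION * group_size / DUNBAR_GROUP

    for size in (30, 50, 150, 230):
        print(f"{size:>3} people -> {social_time_fraction(size):.0%} of time on social grooming")
    # 30 -> 8%, 50 -> 14%, 150 -> 42%, 230 -> 64%

Even under this toy assumption, a band of 30 or 50 gets by on a small slice of social upkeep, while a group pushing past Dunbar’s number spends more than half its time just holding itself together.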

Meetings are Social Grooming

Plenary session at the 2005 Horasis Global China Business Meeting
Meetings are work. Social cohesion is as critical to a company as any business objective.

Dunbar’s number is of interest because it explains a couple of features of modern life that otherwise seem rather mysterious. How many times have we sat in departmental meetings and wondered if they’re really needed? “Nothing seems to ever get done,” we complain. But without those meetings, even less gets done.

Dunbar’s number gives us a clue why, in any large organization, we need to spend so much time in those meetings instead of doing “productive” work. The meetings are productive work — they help a bunch of fractious primates all keep moving in the same direction.

Our idea of social activity as something that is “not productive” is probably based on faulty models of human behavior and interaction. If we step back and realize just how foreign working in an office and doing a repetitive mental job is for your average primate, then we can get a grasp on the idea that meetings are necessary. Meetings have to be there to reinforce what amounts to a series of “unnatural” behaviors, behaviors that run counter to our biological heritage as hunter-gatherers.

Social (Grooming) Media

Social media -- Emerging technologies
How to engage in social grooming from the comfort of your own office.

Meetings that seem to eat our time have long been a complaint. But there’s a new time eater: social media. Dunbar’s number can help us understand why we “need” to spend so much time on Facebook and Twitter (and, I suppose, WordPress). When we’re working in groups that are at or beyond the cognitive limits of Dunbar’s number, we need to spend significant amounts of time on our “social grooming” behaviors.

Social grooming doesn’t mean picking the lice out of our friends’ hair, though it does have the same effect. Communicating back and forth, sharing information that makes us seem more human, sending messages that don’t inform so much as share: these are all activities that build familiarity and trust. They let us all reinforce our membership in groups, as well as our positions relative to other members.

On the one hand, “small talk” isn’t some terrible waste of time. It’s our way of connecting with other people, forming a basis for effective communication, and building social capital. Social connections take time, but that doesn’t make them a waste of time.

On the other hand, if you’re looking to increase your “productive” work, then you might examine the size of the social network you’re maintaining. You might want to ask yourself: who’s really in your band?

The Useful Myth of Progress

During an intro to cultural anthropology class, I mentioned (I thought offhandedly) that progress itself is a cultural myth. One of the class’s best students looked startled and concerned, and raised his hand. “No, it’s not,” he said firmly, and with complete conviction.

It is an incredibly difficult thing for people of any culture to have their most cherished values challenged. And progress is one of the key values of Western culture.

Things Just Keep Getting Better

For a moment, let’s think of Western culture as a book. That book, like books of any genre, has an underlying theme that tells us a lot about who the winners will be, who the losers will be, and what it all means.

For modern Western Culture, the theme could well be “progress.”

The Unknown by John Charles Dollman
“Progress” is the value that allows us to accept that the future is unknown and that new ways of living must be found.

“Progress” is the idea that things just keep getting better and better. It has several corollaries, such as the idea that all “improvements” are somehow inherently good, and that greater efficiency is somehow a good in and of itself.

The schema of “progress” mixes together several ideas in what amounts to a value judgement. It takes the idea of complexity, and attaches it to “the good.” We believe that the complex is somehow better than the simple.

Why Progress?

There’s a good reason that we value progress. The last 10,000 years or more have seen a steady increase in both the human population and its density. Throughout that time, change has been a constant.

From the time of the first cities (around 7,500 BCE), after the advent of farming and the Neolithic revolution, humans have lived in higher population densities than ever before.

Despite being social animals, we primates aren’t given to getting along in large numbers. We have had to develop cultural systems to manage the problems of putting so many territorial primates together. The developments in culture that came with urbanization — stratification, rule of law, etc. — required, and allowed for, ever greater cultural complexity.

With the ever-increasing human population, increased innovation, including complexity in technology and other aspects of culture, becomes necessary to maintain relative peace.

Mythologizing Progress

Painting of Eridu, one of the first cities
Progress supports increased population, which in turn requires new methods of organization and management.

As population density increases, populations have to spend more and more effort not only on resource management, but also on some kind of cultural “research and development.” Otherwise, conflict breaks out as competing groups let loose under the increased pressure. Ever-increasing population density means ever-increasing complexity in human methods of management.

Even such recent inventions as the Internet, seen through this lens, are tools for allowing increasing numbers of primates to somehow manage in a changing and increasingly stressful (in a primate sense) world.

“Progress” is required to maintain some kind of counterbalance to increasing biological pressures. More than that, our culture values “progress” as an idea. We invest in it. That investment is what allows us to maintain and develop the complex cultural systems that keep us relatively fed, healthy, and safe.

The idea that “progress is always good” is tied to the underlying assumption that human population density will always increase. As long as the population keeps going up, progress is a necessity.

Progress — increased complexity and efficiency — is necessary, but that doesn’t make it perfect. The ways that we understand progress are cultural. “Progress is good” is a statement of values, values shaped by specific realities and tied to the underlying assumptions of Western culture. Our ideas of progress determine winners and losers just as much as any war.

“Progress” is one of our bedrock assumptions about the world, and it is a needed one in these times. Yet despite its utility, it is cultural, an idea. Increased complexity is necessary, but that doesn’t mean everything labeled “progress” is inherently good.