Investing in Education

The promise of a college education has long been entrance to the middle class. What we’re learning, however, in a painful and indelicate manner, is that not everyone can be middle class.

By making a college education more and more affordable, what we’ve done instead is intensify the competition to enter this hallowed group of middle managers: the contest has become more vicious and, overall, not everything we were promised.

We Pay for Infrastructure through Taxes

[Image: Atlas Shrugged. Caption: Can we refuse to carry our own weight in our culture? That’s the tragedy of the commons.]

It’s always been true that we want to use the infrastructure (sometimes inaccurately described as “services”), but we really don’t want to pay for it. In economics, this is known as the free rider problem. When everyone uses a common resource but no one wants to maintain it, that’s called the tragedy of the commons.

To a certain extent, this tragedy always happens. It is one of the inefficiencies of any semi-capitalist system.
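
To make the dynamic concrete, here is a minimal sketch in Python of the tragedy of the commons. All of the numbers (the size of the resource, the draw per user, the share of users who volunteer to pay for upkeep) are invented parameters for illustration, not an economic model of real infrastructure.

```python
# Toy model: everyone draws value from a shared resource, but maintenance
# is voluntary. All parameter values are invented for illustration.

def simulate(users=100, rounds=20, contributors=0.2,
             use_per_user=1.0, upkeep_per_contributor=3.0):
    """Return the round at which the commons is exhausted, or None."""
    stock = 100.0  # starting stock of the shared resource
    for r in range(rounds):
        drawn = users * use_per_user                                 # everyone uses it
        maintained = users * contributors * upkeep_per_contributor   # only some pay
        stock = max(0.0, stock - drawn + maintained)
        if stock == 0.0:
            return r + 1
    return None

print(simulate(contributors=0.2))  # 3: the commons collapses in three rounds
print(simulate(contributors=0.5))  # None: enough upkeep, the resource survives
```

The point of the toy is only the asymmetry: each user’s draw is individually rational, but unless enough people pay for upkeep, the shared stock goes to zero for everyone.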

The problem is exacerbated in education because it is only partially treated as infrastructure. Because it is of economic value to users as individuals, everyone wants to take advantage of it, but no one wants to pay for it. More to the point, no one wants to pay for its general use — just their own.

Funding Education

The larger question, then, is “to what extent should we make education part of the infrastructure (and therefore fund it)?” Insofar as it is designed to benefit the individual, individuals can pay for it. But insofar as it is a necessary part of our culture, we need to come together and pay for it.

We already do pay for education through taxes up to the high school level. And while we do not pay for education past that level, we do subsidize it. These subsidies take the form of student loans and of tax breaks, both for educational institutions and for those paying tuition.

So there is a recognition on some level that this is, effectively, a “capital” investment in our future. In other words, we don’t think that education is a commodity. It is a necessity. Educated people are good for our country, good for our culture.

But there has been a trend, probably at least since the 1980s and maybe much earlier, to move education away from the model of the not-for-profit, and toward a business model. That’s one reason that administrators are so very well compensated for their hard work.

Education As a Capital Investment

Somehow, an idea has grown up that we get educated for our own sakes. It’s some Ayn Randian, individualistic idea that we invest in ourselves in order to make more money later. While there have always been schools like that, they’ve traditionally been trade schools.

The mission of not-for-profit education was never just to turn out workers capable of functioning in a rapidly changing world. That is, certainly, one of its goals, and its short-term “value added.”

But education, as the pursuit of knowledge, is more than just a way to train workers for this generation. That larger purpose gets lost in the economic model of the corporation; we can best think of education as basic research and development on a cultural level. There is value in every liberal arts major, no matter how much the media snickers.

The Great Recession

One of the big effects of the recent economic downturn, the Great Recession, has been a trend away from education for its own sake. That is a movement away from a long-standing tradition of the middle class.

A core middle-class value for a hundred years or longer has been education for its own sake. The middle class, as the middle managers and specialists of society, has held up the Western ideal that knowledge is power — that, in fact, all knowledge is power.

The price of college first went up in line with its market value. It has continued to rise as it moved from being a crowning achievement to a necessary prerequisite — the ante just to get in the game.

With the cost of a now-necessary college education spiraling upward, people are reasonably concerned with their individual, short-term return on investment. And few are looking at the long-term, cultural costs of this move.
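
To see the arithmetic behind that individual calculation, here is a back-of-the-envelope sketch in Python. Every figure in it (tuition, foregone wages, the wage premium of a degree) is a placeholder assumption, not real data, and a simple break-even is only one crude way to frame the question.

```python
# Rough individual ROI of a degree. All numbers are invented placeholders.
tuition_and_fees = 4 * 30_000  # assumed total cost of a four-year degree
foregone_wages = 4 * 25_000    # assumed earnings skipped while studying
wage_premium = 15_000          # assumed extra annual pay with the degree

total_cost = tuition_and_fees + foregone_wages
years_to_break_even = total_cost / wage_premium
print(f"Break-even after {years_to_break_even:.1f} years")  # ~14.7 years
```

Note what the calculation leaves out: everything on the cultural side of the ledger, which is exactly the long-term cost that few are looking at.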

Culture Is a Myth, or If You Prefer, a “Category of Analysis”

Over the past century and a half, as anthropology developed, the field began to understand that we needed a whole lot more objectivity if we were ever really going to understand what culture is.

We’ve spent significant amounts of time trying to develop more objective approaches. And we realized, eventually, that perfect objectivity is impossible in studying humans, because we, too, are humans and have our own culture.

It all comes back to the original, perhaps unanswerable question:

What Is Culture?

[Image: A Nuer boy. Caption: Sir E. E. Evans-Pritchard, the famous anthropologist, probably didn’t have one-tenth of this young boy’s understanding of what it means to be Nuer.]

The first, often unspoken truth of anthropology is that other cultures exist. When we look closer at them, however, we find that they don’t! “Culture” is a word that we use to describe things. It’s a tool of analysis.

Back in the old days of anthropology, people believed that cultures were complex but relatively unchanging (and often hierarchical) sets of behavior. Maybe what individuals did changed, but that was often seen to be the effect of outside influence.

Among academics, there was an assumption that outside the West, cultures suffered from some kind of social inertia. Later researchers came to understand that it isn’t that cultures change and thereby make people change. It’s quite the other way around. People make choices; cultures don’t. Cultures are an aggregate of behavior, belief, and interaction with the world. In other words:

Culture Is What People Do

On the face of it, then, we already know that cultures aren’t static systems of unchanging beliefs. We know, from history as well as from observing the world around us, that cultures change.

Modernization and globalization — words that conjure images of cultures “dying out” in the face of outside impact — aren’t strange monsters that attack from the outside. They are culture itself, under the effects of choices and changing technologies.

In other words, it’s impossible, on the face of it, to accomplish the most obvious goals of anthropology. We can’t sit down and record what it means to be Diné, Indonesian, or Nuer.

Our inability, as Western academics, to write it all down in ways that are both perfectly descriptive and coherent comes down to two grim realizations:

1. Culture is just a category of analysis.

[Image: Sir E. E. Evans-Pritchard (1902–1973)]

Anthropology has a hard time even saying what exactly culture is. How can that be? Culture is what people do, and early understandings of culture approximated the things that are true about “people X” that make them different from us. (“Us,” then, was meant in the most Victorian sense possible: white, middle-class people with access to education.)

Cultures, cultural interaction, subcultures, and culture change are all ways of talking about something that is anything but static. Culture encompasses sets of behavior too complex to understand with any one approach. An ethnography can’t help but take a snapshot of a culture: a single moment, and from a single perspective.

2. Culture, as a label, is polyvocal.

Wait, poly-what? Polyvocal means “many voices.” What it means here is that when we take a label and apply it to a culture, we’re really naming a bunch of related (and often conflicting!) aspects of behavior.

[Image: Canada’s 2010 Winter Olympics overtime celebration. Caption: Is this all it means to be Canadian? In a word, no.]

It’s a hard idea, but here’s an example. Let’s say that we want to talk about Canadian culture. What does it mean to be Canadian? Beer, hockey, and politeness? Living in Canada? Can English Canadians and French Canadians both be Canadian?

All of those aspects are sometimes true. But there’s no magic formula for determining a person’s Canadian Quotient (CQ). Being Canadian, or being American, or being Nuer, is an experience of identity that goes beyond words, and is some combination of self-identification and social recognition.

In other words, culture is something that we can try to describe, try to understand, and try to make sense of. But in the end, it’s not a list of adjectives. It’s an experience.

The Good, the Bad, and the Whole

[Image: Claude Lévi-Strauss (1908–2009)]

It struck me today how much we struggle with taking the bad with the good. This isn’t just something that you or I can change by making a decision. It’s something that’s inherent in our view of the world.

We like some things, we dislike others. That is the way of things. And that’s fine, when it’s just you or me. But when we’re looking at whole cultures, it can become something of a problem.

Binary Oppositions

One of the famous, if now outmoded, theories of culture in anthropology is Structuralism. The theory originated in linguistics, but it was the work of Claude Lévi-Strauss that brought it to the forefront of the field.

One of the key ideas of Structuralism is that all thought, all concepts, exist not by themselves, but as halves of pairs — a “presence” and an “absence.” These pairs are called binary oppositions. Over and over (the theory argues), we understand the world through pairs.

So, says Structuralism, we only understand good relative to evil. We only understand light relative to its absence, darkness. This is how, they argue, people (and cultures!) understand the world.

But There’s a Hitch

Okay, that’s all fair enough. Even if the theory’s not perfect, it can be a useful tool for understanding some aspects of life and culture. It might not be “the answer” to all things, but it’s an excellent observation.

While cultural theorists might throw it away as imperfect (theorists have a way of doing that), the rest of us can benefit from this observation to understand something more about the world. Sure, it’s not cutting edge; in fact, it’s rather ancient by the standards of social theory.

The rest of us still have something to learn from these binary oppositions. What we can realize is that such ideas as “good” and “bad” or even “like” and “dislike” are insanely complicated. There are physical aspects, psychological aspects, and cultural aspects all tied together in some kind of experience that we call “good.” Or “bad.”

By definition, we want more good, and less bad. If we were 19th-century Scottish philosophers, we’d call that “progress.”

But when we say “more good, less bad,” we usually fail to recognize that the good and the bad are often tied together.

Trade-Offs

For instance, it’s common knowledge (among anthropologists and other social scientists) that complex societies, like ours, require social stratification. That means, if we want to keep enjoying things like, oh, technology, education, health care, mass transportation, public roads, books, the Internet, money, public safety, and a host of other little “good” things, it’s going to be impossible to win any war on poverty.

So, we can’t just say “poverty bad!” “End poverty!” actually means “End society!” We’ll just have to assume that it’s a shorthand for “make the distance between the top earners and bottom earners less gross.” And sure, that could very well be a good idea.

In fact, I’d argue, the top earners seem to have fallen into a similar trap. “Money good!” “Power good!” they seem to say. Sure, all the money and power in the world feels good. But such a concentration of “good without bad” is probably unsustainable — or more accurately, will be very, very expensive to maintain.

The point is that there are lots of things that we ‘don’t like.’ What we find, on examination, is that they’re often trade-offs for things that we like a lot. Most of us don’t like work so much that we’d do it for free, after all. But we “make the trade.”

Taking the Bad with the Good

There are a lot of things we don’t like in the world. When I was about ten, I thought it would be great if we just got rid of taxes. My father patiently explained to me all the things that taxes paid for that I used every day.

But there’s a broader application, and a deeper truth, here: we can’t get rid of the bad aspects of anything just by attacking it. We need to either get rid of the good parts too, or we need to improve the whole system.

If every day were an awesome day, we’d quickly become inured to its awesomeness and have to move on to more awesome things — or start complaining. That’s just human.

We’re people. We divide things into good and bad. That doesn’t usually mean that we get to pick and choose between them.

The People’s History

When the social sciences were first created, it was at a time when science was believed to be the answer to all things. It was a time when the physical sciences were growing by leaps and bounds, and people — at least the educated ones — knew in their hearts that the scientific method would crack the code of all things.

The idea was to take the methods of science and apply them to the behaviors of people. Anthropology, economics, political science, sociology, and the like were all ways of stepping away from softer approaches to humans, like history, and giving the topic a more rigorous basis.

As it turns out, trying to use scientific rigor on complex systems is a lot harder than it looks. If we’ve learned anything, it’s that people are more complicated than we suspected.

The (Post?)Modern Human

[Image: Homo erectus endocast, Smithsonian Museum of Natural History. Caption: Are you thinking what I’m thinking? Probably not.]

People are complicated. So complicated, in fact, that we mostly understand them (ourselves!) through statistics and trends. Culture itself can only be understood in that way.

When we first started looking at culture, we really wanted to say “culture X believes Y” as if it were some mathematical model. But now we have discovered that there is little in any culture that is truly uncontested.

Even what we once thought were simple cultures of simple people are incredibly complex, with “core” beliefs that are no more than trends. Even if we want to say something like “the Ancient Greeks were polytheistic,” we can say it with only partial certainty.

Complexity Isn’t New

[Image: Plato, from Thomas Stanley’s The History of Philosophy (1655)]

Some of the Ancient Greeks believed that all the “gods” were expressions of one higher source. So, does that make them polytheists? Did they believe the myths we ascribe to them? Well, sort of.

Think of it this way: “do Westerners today believe in evolution?” Or how about “do Christians believe in the creation of the world in seven days?” The answer is “some of them” or maybe even “yeah, they trend that way in some sub-groups.”
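
As a toy illustration of beliefs as trends, here is a minimal Python sketch. The subgroups and survey answers are invented for the example; the point is only that “culture X believes Y” reduces to rates within subgroups, never to a universal.

```python
from collections import Counter

# Invented survey data: (subgroup, holds the belief?) for each person.
people = [
    ("urban", True), ("urban", True), ("urban", False),
    ("rural", True), ("rural", False), ("rural", False),
]

totals, holders = Counter(), Counter()
for group, believes in people:
    totals[group] += 1
    holders[group] += believes  # True counts as 1, False as 0

for group in totals:
    rate = holders[group] / totals[group]
    print(f"{group}: {rate:.0%} hold the belief")  # a trend, not a universal
```

Run it and the most you can honestly say is “67% of the urban subgroup trends that way,” never “this culture believes X.”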

To make things seemingly less complicated, we look at what beliefs are associated with other things, like political power. We choose to narrow what we’re looking at in order to make some kind of sense.

But at the same time, we have to realize that when we narrow down what we look at and see only dominant trends, we do the same thing that we do by choosing to watch only certain news sources in our own culture. Since we only have limited attention, we chop off a chunk of what’s going on, placing abstract boundaries that reflect our own values.

As the postmodernists argue, we can’t pay attention to everything. And choosing what we pay attention to is an act of politics and an act of power.

Rise of the Machines

[Image: Daniel Maclise, Caxton Showing the First Specimen of His Printing to King Edward IV at the Almonry, Westminster. Caption: The printing press revolutionized history by making its creation accessible to all.]

But we can no longer pretend that we’re passive consumers of culture and history. The “common man” has access to more education and wealth than ever before in history.

With the rise of the printing press, mass literacy, and more recently the Internet, we’ve given voice to people and groups that were previously silenced. Where once putting pen to paper was the domain of the elites, now it’s open to nearly everyone.

The cost of publishing has gone down to near-free, and the challenge is getting people’s attention in the resulting cacophony. The question isn’t “can I get published?” anymore. While the gatekeepers of traditional publishing still hold the keys tightly, there are other kingdoms of communication, and some of them aren’t so hard to enter.

Now that the means of communication are open to all, it is harder to cling to the old adage that “history is written by the victors.” History, as always, is written by the literate. But more and more, that’s nearly everyone.

Not only can we see how complicated and varied people are in their decisions and lives, but we’ve become engaged in documenting it in ways that couldn’t even be dreamed of a generation ago. Though we have hardly begun to scratch the surface of it, social media has become “The People’s History.”