Child of the West – A Personal Journey

Anselm Feuerbach, Das Gastmahl des Platon (Plato’s Symposium), 1874
The myth of progress argues that where we’re headed is fixed already by the very nature of the world. But saying that it was “bound to happen” is just a way of dodging responsibility.

When I was an undergraduate, we were required to take “general education” classes. General Education was a series of core curriculum courses that addressed the nature of Western Culture, the basics of at least one other culture, and some of the challenges we faced in the modern era. The first class, GenEd 101, was a survey of the legacy of Western thought.

This course read from the Greek classics, the Bible, and Shakespeare, and honestly it was lost on me. As a freshman, I didn’t really understand what the professors were getting at. I simply didn’t have enough experience with the world to truly understand how these ancient texts had influenced my thinking. I certainly had no clue how much we owed to those who came before.

Out with the Old, In with the New

Worse, like many in our culture, I was a firm believer in the inevitability of progress — a true child of Western culture and scientific thought. I believed that the old should be thrown out so that the new could replace it, unaware that the “new” had not sprung forth in some blaze of truth, but had grown organically from the “old.”

The myth of progress is heavily influenced by 19th century thinkers. Metaphorically, it is the intellectual version of Manifest Destiny. We (as a culture) believe that progress is inevitable and wholly good. We do so with a faith as strong as any medieval Christian’s.

This same type of thought is reflected in every “evolutionary” theory of the world, from E. B. Tylor’s writings that were so influential on early anthropology, to the horrors of “Social Darwinism” that led to the eugenics movement.

The New Myths

I learned in school that philosophy had come down from the Greeks, but I didn’t understand that there was a world out there that wasn’t in line with Greek philosophy. Or if I did, I just believed that my own culture’s ways were superior — without ever really understanding that I’d made that decision.

I was told that Western representative government had come from Greek democracy; I probably even wrote it on the test. But I didn’t “get it” — I didn’t understand why the tradition of democracy was important, because I didn’t really understand that there was a larger world out there, beyond the boundaries of modern Western thought.

It never occurred to me that the West came from somewhere, that it was an intellectual tradition that spanned thousands of years, and that it wasn’t even the only one.

For instance, I “knew” that alchemy was an old, boring superstition, and didn’t get that it was really the progenitor of the world of science that I studied. I understood the “scientific” roots of truth, but not the “cultural” ones. I knew that Isaac Newton, for instance, was a great mathematician and scientist, but I didn’t know that he was also an avid alchemist.

I didn’t understand that the ancient alchemists, the proto-scientists, were science’s forefathers — and that spitting on them for their “foolish” ways was nothing more than childish rebellion.

The More Things Change…

Understanding that “progress” is a value (rather than an inevitable fact) doesn’t mean that all change is bad, either. Feeling that we have to choose between inevitable progress and rank conservatism accepts a false dichotomy in which our own ability to choose is negated by the assumptions we make.

Clearly, the world does change, and every choice we make for ourselves is part of that. There is no inevitable future, so we have a responsibility to make the best one we can. Only by understanding and respecting our roots for what they are can we fully take part in that process.

Monkey See, Monkey Do

Mortarboard and Diploma
Symbols of the West’s educational system — a thinking cap and some writing

When we talk about “education” in the West, we’re usually really talking about one small slice of learning — formal education with its certifications, degrees, and specific areas of study. Yet this method of teaching doesn’t cover all learning. It simply covers one kind of learning, one culturally valued by us.

Western education, by and large, is focused on learning to think abstractly. Even when it addresses “practical” matters (making things and earning money, usually) it follows these same patterns of abstract thought. We study models of the world, and learn to apply them to our experiences. When we study economics or business, we start by learning principles, absorbing a whole cosmology that we can then apply to the world around us.

Nearly all Western education is focused around studying information, and then going on to study the patterns that the information forms. This is not, however, the only way to learn about the world. If we look outside of the Western world, we can also find another pattern of formal education.

Abstract Learning Vs. Experiential Learning

In Ways of Knowing: Experience, Knowledge, and Power among the Dene Tha (1998), Jean-Guy A. Goulet looks at another approach to learning, found among the Dene Tha of northern Canada.

Western learning follows the model of taking an experience and abstracting it, then passing those abstractions on to students. The students must then learn (often on their own) to apply them back to the “real world.” Learning is based on studying abstract commonalities that describe the world.

Especially in the area of religious learning, the Dene Tha do not follow this pattern. Instead, they teach by demonstration and replication. Such a method of study seems very hard at first to people raised in Western culture, because learning this way is itself a skill — one that can be acquired through practice and repetition.

Can you really imagine learning to use a piece of technology without first learning all of the terminology? What’s the first thing we learn about computers? Someone tells us where the on switch is — and by doing so, also informs us that this is called the “on switch” or the “power button” or something like that. They don’t just push the button and show us what happens. It’s so natural to us, we don’t even think about it.

In recent years, we in the West have begun to focus more and more on experiential learning as part of the educational process. For instance, that is part of the importance of the internship. It’s one thing to talk and think about information — it’s another thing entirely to get your hands dirty and get the job done.

Western Examples of Experiential Learning

The Dene Tha pattern of education — relying strongly on observation, repetition, and experiential learning — might seem odd and foreign to us. But if we look at the wider world, this teaching method is not limited to one First Nations group in Canada. In fact, this type of learning can be found around the world.

Perhaps more interestingly, and more importantly, it is found here in the West as well. In any area of study where there is an “apprenticeship,” this method of teaching comes into play; in any internship, we learn by observation.

For some types of learning, the process of abstraction is inefficient. While there might be a hundred little details to learn when we study something as basic as a martial arts punch or a chemical titration, with the Western model we focus on the ten “important” ones, at least at first.

But which abstract rules are the important ones? That ends up being something that the teacher decides — and heaven help you if the parts of the training your teacher focuses on aren’t the ones you actually need to work on.

Repeat after Me

OKIData Microline 320
The printer looked a lot like this one.

At my first real job after college, I was put to work repairing printers and computers. Oh, I knew plenty (abstractly), but I didn’t have the actual experience that the job would require. So my supervisor and my boss got together, and devised a plan (though some might call it hazing, and they’d be right).

I was taken to a client’s office and left in a room with:

  • a screwdriver,
  • a printer,
  • the parts I needed to replace, and
  • an out-of-date instruction and repair manual.

The part I needed to replace was deep in the bowels of the printer, and I’d never had one like this apart before. What a nightmare!

The job did eventually get done. After I had spent an hour in there sweating, cursing, and generally learning the shortcomings of my education, my supervisor waltzed in and showed me how to do the work.

I learned something important that day: all the abstract knowledge in the world isn’t really a huge help when you need to actually take that screwdriver and open up the darned printer. And when it comes time to learn such things, monkey-see, monkey-do is worth a thousand books.

The Face of Western Culture

America is often seen from afar as the “land of plenty.”

Wall Street
America’s wealth is legendary.
Perhaps even mythical.

When I was doing research overseas back in 2001, I had more than one conversation with bright, educated young Indonesian men who saw America as the rich land — a land of wealth and opportunity. They might not have literally believed that the roads were paved with gold, but they might as well have. Their images of the “other” were just as warped by culture and narrative as my own. It certainly made for interesting conversations.

The imagined “America” they saw was far from the truth. It was an America imagined by Tony Montana, a place where ruthlessness and rough justice ruled. It was also a place of barely imaginable wealth. The questions came fast: “Is everyone rich?” “Does everyone carry guns?” “Can I work on my English with you?”

These young men didn’t know the realities of wealth disparity in America. Their ideas came from the images on television and in movies. Maybe not always consciously, but certainly unconsciously, these images have come to represent America. It is, in truth, a modern brand of Occidentalism.

Perhaps even more than politics, news, or history, our entertainment media builds the face that we show to the world. They are bards, not scholars, and tell stories of a country rife with greed, violence, and sex, where a manly man or a cunning woman (because our stories reflect our own cultural values) can take what they want through force or trickery. We tell these stories because they’re entertaining and escapist, but they are watched as a slice of American life.

The Media Bards

Hollywood Sign
Hollywood Hills, Land of the Imagined America

Our media, which over-represents not only our wealth, but also our violence and open sexuality, tells the stories heard overseas. No words of a lone anthropologist, even one who’s lived there, could sway these young men from those images.

And it’s not like those young Indonesian men were alone in this. For those of us who live in America, the same stories help shape our senses of self. We can imagine people around the world understanding America through the lens of such entertainment as Baywatch and The Replacement Killers, but how much more so do these images and narratives actually affect our own senses of self?

Edward W. Said’s Orientalism (1978) points out that Western images of the “Orient” have been historically shaped by colonialism and developed through literary narratives that were written by the West. These narratives explain exotic foreigners (in all their supposedly glorious and savage difference) to Westerners, and have often been mistaken for the truth.

But just as much as the power structures of the West have shaped our idea of the “East,” they have also worked in reverse, giving the world false images of Western Culture. The images that we consume, as well as export to the world, support these same narratives of hierarchy, unity, and wealth.

Occidentalism

Three musicians in period costume at a ren faire
The images we sell don’t always reflect the truth. The entertainment industry is interested in, you guessed it, being entertaining.

Edward Said shows that our culture carries false images of other countries, seeing them as some “other” — in a way that reflects and expands difference, ignoring commonalities. In the same vein, we carry these false images of ourselves, and share them with the world.

Our stories reflect the power of the Lone Ranger (in metaphor and sometimes reality) — the power of a single man to make a difference through violence and commitment to some version of “Truth, Justice, and the American Way.”

The “funny” thing is that we’re not trying to fool the rest of the world. We’re just trying to fool ourselves.

Culture and the Computer Metaphor

As humans, we live inside something we call “culture.” Culture tells us how we can interact with each other, giving the broadest outlines of interaction. Culture includes a lot of what we experience in the world. It includes religion(s), languages, careers, and all the tools we use.

The Matrix
The Matrix was cool because it blurred the line between the mind and computers.
And because it had really good special effects.

Sometimes, when we talk about how the human mind works, we use computers as a metaphor. Our sight and hearing are described as input devices. Our hands can do things: this is output. And we can interact with others — this is like networking, or even going on the Internet.

But we can take this metaphor of the mind as a computer even further. We can say that if the mind is a computer, then our first culture is its operating system.

What’s an Operating System?

Sand dunes
Silicon, used for computer chips, is most commonly found in nature as silica (SiO2), the main component of sand.

Computers are, when you boil it down, specially made piles of silicon, arranged in ways that process information stored in “bits.” An operating system is the basic program that allows all other applications (also programs) to run on the system.

When we think of operating systems, we think of Windows, or Linux, or the various Apple products. We think of the part that we deal with, the UI (user interface), as the operating system, but that’s not the critical part.

At a much deeper level, the important parts are the ones that allow us to send commands like “read the magnetic charge on a specific place in the hard drive” or “put a little dot of light on that computer monitor in place x.” Whether it’s a command line with a keyboard, or windows with a mouse, all the really important work is happening way below the surface.

In other words, a computer’s OS stands on the boundary of the computer’s virtual world and the physical world. Maybe our user interface looks this way or that, but the goal is to make the computer able to function.
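
To make that a little more concrete, here is a minimal sketch in Python; the language, file name, and comments are purely illustrative assumptions on my part, not a description of any particular operating system. The application only makes simple, abstract requests; the operating system does the low-level work of actually driving the hardware.

    # A tiny, hypothetical example: the application asks the OS for help
    # instead of touching the hardware itself.

    # "Write these words to a file" is a high-level request handed to the OS.
    with open("notes.txt", "w") as f:
        f.write("hello, world\n")

    # "Read them back" is, again, just a request.
    with open("notes.txt") as f:
        print(f.read())

    # Underneath, the operating system (not this program) finds free space
    # on the disk, drives the storage controller, caches the data, and
    # schedules the physical reads and writes: all the really important
    # work happening below the surface.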

Why Is an Operating System a Good Metaphor for Culture?

There are three traits of operating systems that, in metaphor, represent culture for humans.

  • Culture tells us how to interact with the environment.

Just like the OS tells the computer how to manage things in its environment, culture informs us how to interact with our own environments. Culture tells us how to make tools, and how to use them. Culture, through language and ritual actions, tells us how to interact with other people. Culture provides us with the basics of function and interaction.

  • Operating Systems are relatively opaque.
A motherboard chipset (the south bridge)
Without an operating system, a computer is just a chunk of…well, whatever it is.

When we learn how to use computers, we learn how to function within an operating system. We learn Windows, but we don’t write our own code — we simply install it. And in a similar way, when we learn culture, it’s installed more or less whole. It’s not something we build from scratch.

  • We only get one culture as our first culture.

Like a computer’s operating system, the first culture we learn is the one we use to manage our own minds. That’s why learning our first culture is called “enculturation,” while learning further cultures is called “acculturation.”

Acculturation is like installing a shell of another operating system. If we study really hard, maybe we can become functional in additional cultures, but (so it seems) that first one always informs our basic processes.

These three traits of culture are what make it especially hard to study. Like a computer’s OS, it takes specialized training to learn how to work with it directly.

Most people who use computers aren’t interested in the operating system. Instead, we install applications that let us get about our work.

In the same way, most of the time we’re not interested in working on our culture. Instead, we use our culture to function, and go on to learn things within it — additional skills like driving, or accounting, or public speaking. Skill sets are like applications, run within the OS so that we can perform necessary actions.

Words of Power

Aristotle at the Louvre
Aristotle (384 BC – 322 BC)

Back in the old, old days, when the Greeks were playing around with this new idea called democracy, one of the key areas of education was rhetoric. The most important written work on rhetoric was aptly named Rhetoric and written by (or written down by) Aristotle, arguably Plato’s most famous student.

Rhetoric is, simply, the art of making arguments to convince people of things. As such, it can be a tool for good or evil, reflecting the ends to which it is put. The most famous part of Rhetoric (and one I’ve always relished teaching and slipped into just about every class I’ve taught) comes early in Book One: the three appeals — the appeal to logic (logos), the appeal to the speaker’s character or authority (ethos), and the appeal to emotion (pathos).

While Plato himself was ambivalent about the study of rhetoric (arguing in different places for its danger and utility), Aristotle places it as one of the three key elements of philosophy: logic, dialectic, and rhetoric. What we can take away from their disagreement on the matter is that rhetoric, as an area of study, is terribly powerful.

Modern Parallels

If we were to draw modern parallels to these practices, these fields of study end up looking like science, academic publishing, and marketing.

Logic is the process through which philosophers (thinkers) make observations and draw conclusions —  it’s the process of rigorous thought. Dialectic is what happens when philosophers get together and discuss their ideas — it’s something like intellectual dueling. And rhetoric is the way that people take their ideas and sell them to others — it’s about having a convincing argument, rather than the right one.

The scientific method is a matter of applying rigorous thought to the collection and analysis of data. That doesn’t mean that all philosophy is simply science in disguise, but it means that every scientist owes a debt to all the philosophers (writ large) out there. In the time of Isaac Newton, in fact, scientists weren’t scientists; they were “Natural Philosophers.”

We live in a world of sloppy realities. We’re not just the inheritors of  the Western tradition of thought; we’re also slightly-hairy primates with a penchant for territoriality, violence, and tool use.

That being said, the idea of an academic discipline — as in a disciplined way of thought and study — is a philosopher’s dream. Through education and repetition, we learn to marshal our thoughts and force them through what amounts to a QA process. Have you ever wondered why education is hard? It’s like teaching monkeys to march.

Dialectic is how trained philosophers engage each other in debate. This isn’t discussion for the purpose of winning, but collaboration for the purpose of generating a more accurate “truth.” It’s not a bargument, it’s a way of testing the validity of an idea across multiple perspectives.

Thinkers need to work together because any one human mind is too darned small to process the data of the whole world. It doesn’t matter how smart the philosopher, it’s simply impossible for someone to take in enough data to make perfect models of the world all by themselves. Dialectic is the “rules of the game” for engaging in productive debate.

Unlike mass publishing, the process of academic publishing asks questions about the quality and importance of ideas. Articles and books undergo a process of “peer review” by which the ideas are vetted before they’re given to a more general audience. Does that mean that everything published in academia is “true”?

No, probably not. But it means that at least a couple of experts agree that the ideas are sufficiently plausible to merit discussion. Again, this is part of a QA process for ideas.

Ivory Towers on Chancery Lane
This is a picture of the “real” Ivory Towers, now part of King’s College.

And that leads us back to rhetoric. Rhetoric is the field of study that bridges the world of the philosopher and the real world. It is the gate from the Ivory Tower down onto the streets of the city. No wonder Plato went back and forth on the matter! A philosopher might imagine a better world where we’re all philosophers…but we’re all just primates, here. The best ideas in the world will profit us nothing if we face better rhetoricians.

If logic is about generating good ideas, and dialectic is about refining those ideas through collaborative work, then rhetoric is about selling those ideas to non-philosophers. Without the gate of rhetoric, the Ivory Tower is sealed away, and is no more than a useless decoration.

The Specter of Rationality

One of the challenges of living in modern Western culture is the specter of rationality.

“Specter”? Indeed.

Western culture has (or had, some might argue) an expectation, provided by 19th-century notions of progress, that everything in the world can be explained and defined. It was, in fact, the promise of Science writ large — that given enough time and knowledge, the world would make sense, and there would be a single, rational worldview that explained all things.

Thinking Straight

Human brain
There are biological limits to thought.

Rationality is the logical application of previous knowledge to new situations. It allows us to perform analysis, to block out irrelevant information, and to quickly make decisions that are, if not perfect, at least consistent with our experience of the world.

The trouble we face in approaching perfect rationality is that the human mind is only capable of knowing and processing so much.

Rationality depends on access to sufficient data, and no human mind is capable of learning, or even storing, every specific detail of the world. For that reason, we narrow down what information we gather. The fields of knowledge that we, either as individuals or as a culture, choose to study have a huge impact both on our kinds of analysis of data, and on the conclusions we reach.

I’ll try to paraphrase a quote I (probably mis-)remember reading in my teen years:

“Expertise does not cross fields. Unfortunately, this doesn’t stop experts from trying.”

It’s very common for specialists in any one area of knowledge to try to extend their expertise to other fields. For simple and obvious examples, we have Stephen Hawking making declarations on religion, or the Church’s condemnation of Galileo. However, this kind of messy thinking in the name of rationality happens every day.

Human culture has become so complex that it is now incredibly rare for any one person to master multiple areas of study. Worse, the amount of “new” information being generated every year is growing exponentially.

At the level of mastery, specialization has become more and more important. Why? Because the human mind has limits: there are basic biological bounds on how much information we can learn, store, and process.

In order to get around the limitations of the human mind, we have developed tools to help ourselves. Sure, we use computers to store and process information — I’d be lost without Microsoft Excel — but literacy was an earlier and arguably more important invention. Even so, the limits of the human mind remain.

To put this another way: “knowledge” is often not just a set of information, but specific ways of thinking. For example, accountants think differently from engineers or plumbers, marketers or scholars of religion. Each group is trained to collect and process information differently, and that’s a good thing!

Each way of thinking makes specific neural connections, and channels of thought are created through practice and repetition. While people aren’t limited to only one way of thinking, each skill set takes time and effort to acquire.

We deal with this problem through specialization. An accountant, for example, applies a rational model to the intake, processing, and distribution of funds. This kind of rationality, while extremely useful, is limited in its application. And I’m not picking on accountants. This problem repeats itself in any area of study.

Rationality always depends on assumptions of cause and effect. Those assumptions are influenced by experience, by culture, and by specialization. Perhaps just as importantly, it is always bounded by the limits of data collection, storage, and processing.

It’s impossible for a human to be “perfectly” rational. The good news is that recognizing this can help us make sense of the world, know who to listen to about what, and make the best decisions possible.

The bad news is that we’re never going to live in a perfect, rational utopia. We’re going to continue to make decisions in that messy, organic way best represented by “office politics.” But hey, we’re primates anyway. It’ll be fun.

What Is Cultural Anthropology?

Aden Pattern Pith Helmet
We don’t wear these anymore.

The stereotypical image of a cultural anthropologist is someone who studies strange people in far-off lands. Cultural anthropologists are known to speak a couple of languages, slip between cultures like Santa Claus down chimneys, and tell wild (if sometimes oddly low-key) stories of far off lands.

But in the new world of near-instant travel and even faster communication, that doesn’t even begin to describe what cultural anthropologists do. We’re all exposed to the Other now. The Other is not some strange and exotic thing; it’s just a part of life.

So what do anthropologists do all day? Mostly, they try to understand people by watching them live and work in groups. And those groups work together using a set of “tools” called culture. Except culture’s not just a set of tools. Culture is the meta-tool. It’s so ubiquitous, most times we don’t even know we’re in it.

If culture’s so big, how the heck do we even look at it?

Chopping Culture to Pieces

Anthropological theory usually starts by breaking down culture into mental categories. For example, we might make the divisions: Ethnicity, Language, Economy, Politics, Kinship, Gender, and Religion.

A quick look at that list will show that it doesn’t quite cover everything, and a lot of things fit into multiple categories. If a leader is always a man (or woman) in culture X, then does that fit into politics? Religion? Gender? It turns out, we can look at the same thing in several ways.

Worse, there comes a moment when we realize that these categories are based on our own culture — they’re tools we use to help ourselves understand, not actual, natural categories with an a priori existence.

It’s a humbling moment when we discover that these word-tools are actually a part of our own culture. Still, for understanding culture and its basic processes, they’re a good start. The starting place of the Western discipline of anthropology has to be Western culture. That starting point is necessary, but it would be a shame to forget why we built those categories in the first place.

Development of Social Theory

Each and every area of study in anthropology has its own extensive theories, and they don’t all agree with one another. That is often the case in the social sciences. How can this be?

Most people know more about psychology than they do about anthropology. For a moment, then, think about psychology and its various approaches to understanding people. While modern psychological practice and theory (take Cognitive Behavioral Therapy or Cognitive Psychology) do, in a sense, descend from the work of Sigmund Freud, it would be ridiculous to argue that we’ve just built on his ideas.

Instead, as Western culture’s knowledge has grown, psychologists have actually swept away some parts of Freud’s theories. In fact, we could probably break his theories into two categories: things that we think are true without questioning them, and things that we think are passé. We all “know” that the things that happen in childhood shape who we become (I hear that was his idea) but we think the id, ego, and superego might just be too simple a structure to describe us.

The same pattern holds true in anthropology, as it has developed over the past century and a half. We no longer believe that the first people developed culture as a way of figuring out which kids went with which father in a matriarchal society (Mother Right, J. J. Bachofen, 1861). Anthropologists no longer theorize that cultures develop through linear stages from band to nation-state. [We stole that construct from Neoplatonism.]

Yet cultural anthropologists know culture is something that is learned. Moreover, it’s learned in such a way that it forms the basis of our experience of the world. We know that our area of study is as fundamental to human experience as human biology — almost mystical in its implications.