Category Archives: Education

Race: Cultural Construct — And Very, Very Real

Anyone who has absorbed the content of an Anthropology 101 class, or even a general education class, knows intellectually that race is a fiction. What we don’t usually take away from these classes is the nature of “fiction.”

We think of “fiction” in the simplest, sixth-grade terms — as “made up.” But as adults, it is easier to recognize that a fiction is not the same as a lie — or, even more simplistically, “a random collection of disparate ideas.”

10,000 Monkeys

Better than 10,000 monkeys

But Shakespeare was not written by 10,000 monkeys on typewriters. Fiction is not random. In the same way, race does exist — as a part of culture.

The part of culture that we struggle with, especially our own culture, is that it is not coterminous with reality. But one thing that we know, as anthropologists, is that culture seems real. From the inside (and humans must by definition operate from the inside) we can’t tell the difference between reality and our perceptions of it.

The “crime” is not that we believe in race, but that we legislated it as if we bore the unitary truth of reality. The mistake, then, is not that our culture is “wrong” but that our expectations of culture are crazy.

A Teaching Moment

Saint Francis of Assisi by Jusepe de Ribera
Just another hot-blooded Italian? No, that’s Saint Francis of Assisi.

When I was teaching Anthropology 101, I had a student in one of my classes who felt, very strongly, that his identity as an Italian-American was all the explanation that was necessary for his hot temper. No amount of argument would suffice to show him that what he was doing was embodying and reinforcing his culture’s expectations.

The discussion in class became quite heated, as he would not budge from the belief that his experience was not only valid, but indeed intrinsic. That, of course, is the power of culture.

The power of anthropology is not that we can step outside of culture and look at it. That is, I believe, impossible. We can bend and expand our culture, stretching our mind to extrapolate the nature of culture. But we can never wholly step away from it.

Anthropology: Prescriptive or Descriptive?

Anthropology is an academic discipline, and philosophically a powerful one. It allows us, from the inside, to stretch and turn our culture so that it can look at itself.

But we also know that it is possible, once we have begun these philosophical limbering exercises, to reshape not only ourselves, but culture as a whole. Like all science, anthropology lets us reshape our minds in ways that begin to map the world as it is.

But this idea, that we can reshape our culture to map the world, is based on the underlying cultural belief that culture and reality are the same thing. Believing that everyone should have an understanding of the world that matches an anthropologist’s is just as self-centered as atheists (for an example in the news) who believe that everyone should see the world the way they do.

All at once, it brings anthropology to the center and cheapens it. If anthropology is itself a discipline, then it has something to offer the world as it is. We don’t need everyone to be an anthropologist any more than we need everyone to be a doctor.

Instead, anthropology might consider focusing on what the truths we learn mean when we can present them without requiring the listener to become an anthropologist themselves.

And if it is impossible for us to teach without leading the listener through every step of the whole process — if we can’t bring truth back for our culture — then that would explain the challenges we face as a discipline.

The difference lies in status. If anthropologists can access cultural truths and these truths are of some use, then we must teach from a rhetorical standpoint of ethos. If our only way of teaching is teaching others how to think, then we are in the business of changing culture, not serving it.

Race is part of Western Culture, and denying its cultural validity is an uphill battle. Instead, we must reach out to those who legislate and make decisions that cross cultural groups. These are the people who need to understand that everything they “know” about “the other” is from only one perspective.


…Now with the Power of Science!

I ran into a funny piece of advertising recently. It was for a facial cleanser or some similar product, and the packaging read, in big friendly letters (as part of the advertising on the front of the package):

Now with “science-y jargon blah blah”!

Okay, so I’m just paraphrasing the semantic meaning here. But, I’m pretty sure that we weren’t supposed to understand what it meant, just be impressed with its efficacy.

Science As Authority

Luckies Doctor
If “Science” says it, it must be true!

Using the idea of “Science” (the cultural construct) as a way of establishing authority in ads isn’t anything new. Ever since science started being something that we listen to, instead of something we learn to do ourselves, marketers have been using it to affect our decisions.

Remember, in a perfect world, scientists have every obligation to share their data with other scientists. This is done for the sake of improving the world of science — to strengthen the academy. But for those of us on the outside — and with the incredible complexity of scientific research today, that’s almost everyone — we just have to take their word for it.

The New Alchemists

Gold Bars
Can science navigate the shoals of temptation?

The majority of scientists today work outside of the academy. They are in the employ of (not to put too fine a point on it) merchants. These merchants are, in turn, in the employ of shareholders. There are times, we can imagine, when this might present conflicts of interest.

It’s a trade-off. With much of research funding coming from private hands, and those hands being bound by (fickle) shareholder obligations, we’re making great strides in areas like consumer electronics and pharmaceuticals.

Patent wars are the result. And patent litigation runs counter to the scientific method at its core. The scientific method is, at root, a collaborative method; that is its power.

But “scientists” working for private ends have a business model of making their patron happy as they try to create wealth through their knowledge. We’ve seen this model before — it’s the way that alchemy functioned in the pre-scientific era.

Investing in Education

The promise of a college education has long been entrance to the middle class. What we’re learning however, in a painful and indelicate manner, is that not everyone can be middle class.

By making a college education more and more affordable, what we’ve done instead is make the competition to enter this hallowed group of middle managers fiercer, more vicious, and overall not everything we were promised.

We Pay for Infrastructure through Taxes

Atlas Shrugged
Can we refuse to carry our own weight in our culture? That’s the tragedy of the commons.

It’s always been true that we want to use infrastructure (sometimes inaccurately described as “services”), but we really don’t want to pay for it. In economics, this is known as the free rider problem. When everyone uses a common resource but no one wants to maintain it, that’s called the tragedy of the commons.

To a certain extent, this tragedy always happens. It is one of the inefficiencies of any semi-capitalist system.

The problem is exacerbated in education because it is only partially treated as infrastructure. Because it is of economic value to users, as individuals, everyone wants to take advantage of it, but no one wants to pay for it. Even more so, no one wants to pay for its general use — just their own.
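The free-rider dynamic described above can be sketched as a tiny public-goods game (a toy model only — the endowment, multiplier, and group size here are invented for illustration):

```python
def payoff(my_contribution, others_contributions, endowment=10, multiplier=1.6):
    """Each player keeps whatever they don't contribute; the common pot is
    multiplied (the benefit of shared infrastructure) and split evenly."""
    group = [my_contribution] + list(others_contributions)
    pot = sum(group) * multiplier
    return endowment - my_contribution + pot / len(group)

# Everyone contributes fully: each player ends up ahead.
print(payoff(10, [10, 10, 10]))  # 16.0

# One player free-rides while the rest pay: the free rider does even better.
print(payoff(0, [10, 10, 10]))   # 22.0

# Everyone free-rides: the commons collapses and everyone is worse off.
print(payoff(0, [0, 0, 0]))      # 10.0
```

The individually rational move (contribute nothing) beats cooperating, yet universal defection leaves everyone poorer than universal cooperation — which is the tragedy in miniature.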

Funding Education

The larger question, then, is “to what extent should we make education part of the infrastructure (and therefore fund it)?” Insofar as it is designed to benefit the individual, individuals can pay for it. But as far as it is a necessary part of our culture, we need to come together and pay for it.

We already do pay for education through taxes up to the high school level. And while we do not pay for education past that level, we do subsidize it. These subsidies take the form of student loans and other tax breaks, both for educational institutions and for those paying them.

So there is a recognition on some level that this is, effectively, a “capital” investment in our future. In other words, we don’t think that education is a commodity. It is a necessity. Educated people are good for our country, good for our culture.

But there has been a trend, probably at least since the 1980s and maybe much earlier, to move education away from the model of the not-for-profit, and toward a business model. That’s one reason that administrators are so very well compensated for their hard work.

Education As a Capital Investment

Somehow, an idea has grown up that we get educated for our own sakes. It’s some Ayn-Randian, individualistic idea that we invest in ourselves in order to make more money later. While there have always been schools like that, they’ve traditionally been trade schools.

The mission of not-for-profit education was never simply to turn out workers capable of functioning in a rapidly changing world. That is, certainly, one of its goals. It is its short-term “value added.”

But education, as the pursuit of knowledge, is more than just a way to train workers for this generation. Though it gets lost in the economic model of the corporation, we can best think of education as basic research and development on a cultural level. There is value in every liberal arts major, no matter how much the media snickers.

The Great Recession

One of the big effects of the recent economic downturn, the Great Recession, has been a trend away from education for its own sake. That is a movement away from the tradition of education for the middle class.

A core middle class value for a hundred years or longer has been education for its own sake. The middle class, as the middle managers and specialists of society, have held up the Western ideal that knowledge is power — that, in fact, all knowledge is power.

The price of college first went up in line with its market value. It has continued to rise as it moved from being a crowning achievement to a necessary prerequisite — the ante just to get in the game.

With the cost of a now-necessary college education spiraling upward, people are reasonably concerned with their individual, short-term return on investment. And few are looking at the long-term, cultural costs of this move.

The People’s History

When the social sciences were first created, it was at a time when science was believed to be the answer to all things. It was a time when the physical sciences were growing by leaps and bounds, and people — at least the educated ones — knew in their hearts that the scientific method would crack the code of all things.

The idea was to take the methods of science and apply them to the behaviors of people. Anthropology, economics, political science, sociology, and the like all were ways of stepping away from softer approaches to humans, like history, and giving the topic a more rigorous basis.

As it turns out, trying to use scientific rigor on complex systems is a lot harder than it looks. If we’ve learned anything, it’s that people are more complicated than we suspected.

The (Post?)Modern Human

Homo erectus endocast - Smithsonian Museum of Natural History - 2012-05-17
Are you thinking what I’m thinking?
Probably not.

People are complicated. So complicated, in fact, that we mostly understand them (ourselves!) through statistics and trends. Culture itself can only be understood in that way.

When we first started looking at culture, we really wanted to say “culture X believes Y” like it was some mathematical model. But now we have discovered that there is little in any culture that is truly uncontested.

Even what we once thought were simple cultures of simple people are incredibly complex, with “core” beliefs that are no more than trends. Even if we want to say something like “the Ancient Greeks were polytheistic,” we can say it with only some certainty.

Complexity Isn’t New

Plato in Thomas Stanley History of Philosophy
from Thomas Stanley’s The History of Philosophy (1655)

Some of the Ancient Greeks believed that all the “gods” were expressions of one higher source. So, does that make them polytheists? Did they believe the myths we ascribe to them? Well, sort of.

Think of it this way: “do Westerners today believe in evolution?” Or how about “do Christians believe in the creation of the world in seven days?” The answer is “some of them” or maybe even “yeah, they trend that way in some sub-groups.”

To make things seemingly less complicated, we look at what beliefs are associated with other things, like political power. We choose to narrow what we’re looking at in order to make some kind of sense.

But at the same time, we have to realize that when we narrow down what we look at and see only dominant trends, we do the same thing that we do by choosing to watch only certain news sources in our own culture. Since we only have limited attention, we chop off a chunk of what’s going on, placing abstract boundaries that reflect our own values.

Like the postmodernists argue, we can’t pay attention to everything. And choosing what we pay attention to is an act of politics and an act of power.

Rise of the Machines

Daniel Maclise - Caxton Showing the First Specimen of His Printing to King Edward IV at the Almonry, Westminster
The printing press revolutionized History by making its creation accessible to all.

But we can no longer pretend that we’re passive consumers of culture and history. The “common man” has access to more education and wealth than ever before in history.

With the rise of the printing press, mass literacy, and more recently the Internet, we’ve given voice to people and groups that were previously silenced. Where once putting pen to paper was the domain of the elites, now it’s open to nearly everyone.

The cost of publishing has gone down to near-free, and the challenge is getting people’s attention in the resulting cacophony. The question isn’t “can I get published?” anymore. While the gatekeepers of traditional publishing still hold the keys tightly, there are other kingdoms of communication, and some of them aren’t so hard to enter.

Now that the means of communication are open to all, it is harder to cling to the old adage that “history is written by the victors.” History, as always, is written by the literate. But more and more, that’s nearly everyone.

Not only can we see how complicated and varied people are in their decisions and lives, but we’ve become engaged in documenting it in ways that couldn’t even be dreamed of a generation ago. Though we have hardly begun to scratch the surface of it, social media has become “The People’s History.”

Applying Anthropology

There are a number of fields of study that we can think of as “standard” in Western culture, from English to physics to history. People simply have a general knowledge of what these graduates do — and what their transferable skills are.

Physics major? Make them an analyst!

When students take physics, even as an undergraduate, we know that they can probably do math, extrapolate effectively from data sets, and think about problems in a relatively objective way. We know that they can manipulate numbers, think certain types of problems through, and understand complicated technical information.

English major? Make them an editor!

Students who major in English, on the other hand, have good vocabularies, can write, communicate, and improve other people’s communications. They know how words fit together for the “average” person, and can read effectively (a skill that we don’t think too much about, but which is actually both uncommon and useful).

History major? Make them a researcher!

When students have completed a history undergraduate degree, they will be well versed in plumbing written sources, analyzing sourcing flaws, and digging out some kind of truth — with the added ability to put together some kind of understanding of the reliability of the answers. These are the same skills that put everyone else’s google-fu to shame.

Anthropology major? Make them a…

But the cultural anthropology major? Anthropology, even more than sociology, is an esoteric topic. Anthropology has traditionally focused on whole areas of knowledge found outside of Western culture.

Where more traditional areas of study have focused on the core strengths of the West, from literature to science, anthropology has gone out and looked at the whole human experience, not just the “cool” stuff.

Anthropology looks at areas of knowledge that are difficult to even talk about in Western culture. Anthropologists study the hard-to-study, from the religious beliefs of groups that don’t even have a word for religion to the nature of human interaction and the observer’s effect on it. These are areas that have often skated past the eyes of Western culture.

Transferable Skills

For the average anthropology major, even those who plan to go to graduate school and pursue a “life of the mind,” there will come a time when the rubber hits the road and it’s time to enter the job market. We know that the English majors can read (really read), the history majors can research, and the physics majors can perform (relatively) objective analysis.

How about the anthropologists? Well, as it turns out, there are a number of things that a cultural anthropology major can do pretty well.

Read Technical Literature

Like I mentioned before, English majors are trained to read and analyze texts. Not only can the anthropology major do the same, but the text doesn’t even need a plot!

Research and Analyze

The topics that anthropology addresses almost always cross disciplines. An anthropology graduate can not only find out what people say, but also rate the sources of information (people, texts, or data) in terms of reliability.

Anthropologists have been trained to work from “what people say” to build frameworks that reflect whole ways of thinking. Further, they’ve been weaned from the two great fallacies of this kind of work: the belief that all people think alike, and the related mistake that “all people think like me.”

Combine Objective and Subjective Data

While the archetypal physics student above was trained in turning the physical world into numbers, his anthropology counterpart was trained in combining objective and subjective data. This is why some anthropologists are working in such varied areas as marketing and UX (user experience).

Collect Data that Was Being Missed

We’d like to think that a usual pattern for solving a problem is:

Analyze problem → Develop solution → Develop process → Apply process → Move on

While that model looks sharp, neat, and clean, it may not be terribly efficient in the long run. Anthropologists (those voracious technical readers) can be helpful in developing more complex processes: ones that have feedback loops, are self-correcting, and take advantage of the expertise of those actually performing the work.

An additional problem that anthropologists are trained to tackle (with the delicacy of an anthropologist) is that these “processes” may or may not address underlying challenges, hidden resistances, and “social” issues that don’t come to light immediately.

Analyze a problem → Develop a solution → Develop a process → Apply the process →
Test the results over time to increase efficiencies and determine how users have had to modify the process for “real world” application
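The contrast between the two pipelines above can be sketched in code (a toy illustration — the function names, numbers, and the measure/adjust rules are all invented, not a real methodology):

```python
def linear_pipeline(first_draft, apply_process):
    """Analyze -> solve -> apply -> move on: the process is never revisited."""
    return apply_process(first_draft)

def feedback_pipeline(first_draft, apply_process, measure, adjust, rounds=3):
    """Same start, but results are tested over time and the users'
    real-world modifications are folded back into the next revision."""
    solution = first_draft
    for _ in range(rounds):
        result = apply_process(solution)
        gap = measure(result)        # how far practice diverges from the plan
        solution = adjust(solution, gap)
    return solution

# A toy "real world": the process needs to reach 100; the first draft hits 60.
target = 100
apply_process = lambda s: s                  # deploy the process as designed
measure = lambda result: target - result     # shortfall users discover in use
adjust = lambda s, gap: s + gap / 2          # fold half the feedback back in

print(linear_pipeline(60, apply_process))                     # 60
print(feedback_pipeline(60, apply_process, measure, adjust))  # 95.0
```

The linear version ships the first draft and stops; the feedback version converges toward what users actually need, one revision at a time.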

The Takeaway

Anthropologists aren’t people who wear baggy, inexpensive clothes with lots of pockets and jewelry from around the world. We’re more than that. Our value added:

Anthropologists are risk takers who are trained to deal with complex and messy systems, real-world, non-static processes, and those pesky problems that always happen whenever primate reality meets carefully crafted abstract systems.

Science: Revolutionary, Heir, or Rebellious Child? (Part 3)

Punk with a Strongbow
Science, like Punk, was once about challenging the status quo.

Science was originally just a method for seeking knowledge. However, it has taken on an important role of authority in Western culture. Science hasn’t only driven innovation and increased our knowledge of the universe — all while improving countless lives on the way; it’s also changed the way the West sees both itself and the world. Science has become a place where many of us look for Truth.

Science’s role has been shaped not only by its own assumptions and discoveries, but also by the West’s much older relationship with knowledge, education, and public service. But sometimes there is a gap between the scientific community’s self-perception and the reality of the part it plays in Western culture.

The last two posts have addressed both the revolutionary aspects of science as it has affected the worldview of the West and the role of knowledge in shaping scientists’ elevated position in Western culture. But the gap between how science presents itself and its actual cultural role can create a certain amount of cognitive dissonance.

First, we need to distinguish between “science” as a method of research and “Science” as a symbol of cultural authority. In the laboratory (or any other research venue), a person uses science as a method for understanding the world. But once the knowledge leaves the scientific community, it becomes Science (with a capital “S”) — a tool of authority and rhetoric.

In the beginning, science was a challenge to old ways of knowing. It was the Punk movement of its day — a conscious revolutionary challenge to old ways of thinking and knowing. The attitude of “out with the old and in with the new” that sometimes seems to come with science — a preference for new ways of doing things in the face of tradition — is downright Punk. It’s a challenge to authority.

The authority of Science can be a very, very good thing. For example, knowing that we should wash our hands before we eat keeps us all healthier, and we don’t need to understand the details. We don’t need to understand germ theory, we just need to know that unwashed hands are “dangerous.” But doing things because Science says so isn’t “science.”

Science, the Rebellious Child?

The scientific community is a subculture of ideas and ideals. The members argue that to build a greater truth, they (and we) must create theories, test them, and share the results with one another. Knowledge that can be neither tested nor replicated is tossed in the wastebin of history.

However, as part of Western culture, the scientific community has gotten itself into an older game: the idea that there is one greater Truth — and that we can understand it. Science has long argued that in order to discover the Truth, we need to throw away the truths of the past. Under this schema, only objective, scientific truth becomes recognized as unchallengeable Truth.

Science’s response to other forms of cultural knowledge (based on non-scientific sources of authority) can sometimes come down to “I know what I’m doing” and “you can’t tell me what to do!” These aren’t scientific arguments, they’re rhetorical “appeal to authority” responses — albeit sometimes warranted ones.

How, then, does scientific truth interact with other sources of cultural truth? Oddly. But it does so through the same means as any other belief system: ideas are debated, or argued. Victory rests on the persuasiveness of an idea’s proponents.

The everyday understanding of the world that has developed from scientific research is encapsulated in every statement that starts “Science tells us…“. Under closer examination, such statements call on the cultural authority of “Science” rather than the weight of specific scientific research.

Science is great stuff. People, using the scientific method for research, have made fabulous gains in understanding the world and improving the lives of people. We might think of the iPod, but penicillin was much, much better.

Science seeks a more objective Truth. Its quest is stymied by two factors: science’s role as a cultural authority gets it involved in politics, and its dependence on funding controlled by vested interests makes people worry that its authority has been co-opted.

An examination of the way Science fulfills its role as an authority — as a purveyor of “the Truth” — is not a challenge to the scientific method. Science’s dependence on pleasing the people who hold the purse strings is. Anyone who tells you different is selling something.

Who Cares about Plagiarism?

With the rise of the Internet and the Information Age, plagiarism has become an arms race. Crafty students pay for papers, while the less crafty simply copy and paste from websites. In response, professors use new digital tools, paid for by the university, or simply plug key unlikely phrases into Google.

Thief cartoon
Plagiarism is not a joking matter.

College classes are about more than imparting information about specific fields. Students are often lured to school by the promise of high-paying jobs once they have earned some credential.

Instead, students find that they’re suddenly adrift in a world that only vaguely resembles the one they came from. They have crossed the threshold of the intellectual world.

And in the intellectual world, the rules of the game are different.

The Rules of the Game

It is traditional, in Western Culture, to divide experience into the worlds of the Mind, Body, and Spirit. You might have heard the phrase “life of the mind” in reference to working as a college professor — that’s the world they’re supposed to be introducing students to.

"Intellectual Development" (statue) by Hermon Atkins MacNeil
The world of the intellect follows its own rules.

In Western culture, priests and ministers give us guidance in the realm of the spirit, and professors are our guides to the “life of the mind.” (You can read more about the realities of the “life of the mind” on the Chronicle of Higher Education.)

Why is this important? Because plagiarism is a crime in the intellectual world. When a student, or a researcher for that matter, takes someone else’s ideas and uses them as their own, that’s theft. And in the intellectual world, it’s the theft of something valuable.

Together with the falsification of data and unethical research on human subjects, plagiarism is one of the three great crimes that can be committed in the intellectual world.

  • The development of human knowledge requires intellectual honesty.

Academia is based around the collaborative development of ideas. In dialectic, two people argue in order to create a greater truth than either could come up with on their own. But dialectic argument comes with a set of rules, and one of them is that the evidence provided needs to be both true and source-able.

Without the proper sourcing of information, there’s no way for anyone to judge the quality of an argument. In the West, academic discussions don’t exist ex nihilo; they rest on twenty-five hundred years of intellectual development.

  • When a student steals someone else’s ideas, they don’t learn what they were supposed to learn.

College is, at its best, about more than just learning to recite information back in order to prove that you’ve learned it. It’s a process of training the mind for rigorous thought, honest analysis, and a comprehension of the rules of intellect in the West.

In other words, higher education is not about saying the “right” things, but about being able to generate certain types of thought and use specific intellectual tools to analyze data and situations. Academic papers, conference presentations, and all of that are simply the proof of thought and a way of sharing it.

Plagiarism is parroting someone else’s words — but the words weren’t the point. The point was to think, and think in specific ways.

  • Breaking the rules of the intellectual world (plagiarizing) in order to gain recognition (passing a class, getting a degree, etc.) is like stealing a car to get to police academy graduation.

Some students will internalize these rules, and come to understand why they’re important. Some will simply follow a “when in Rome” policy and get through. But in the end, some students still can’t (or won’t!) understand why plagiarism’s a big deal.

What’s to Be Done?

Higher education, especially at the bachelor’s level, is (among other things) a process of socialization into intellectual honesty. Sure, some plagiarism in an undergraduate paper is the metaphorical equivalent of a kid stealing a candy bar. But, like stealing candy as a kid, it should carry serious consequences.

Plagiarism should not be tolerated because it undermines the purpose of education and academia. Students need to learn that there are consequences. That doesn’t mean kicking them out of college for the slightest infraction. But it does mean that students need to apologize, feel intellectual shame, and correct the problem.

Being given a “zero” doesn’t teach a student anything. Kicking them out of school doesn’t teach them either — though it may be necessary for the good of the overall institution.

Like any other form of socialization, inappropriate behavior requires the deft hand of a teacher. There’s no one-size-fits-all solution for every student. But any student who thinks it’s “not a big deal” is missing the point of education.