Who Cares about Plagiarism?

With the rise of the Internet and the Information Age, plagiarism has become an arms race. Crafty students pay for papers, while the less crafty simply copy and paste from websites. In response, professors use new digital detection tools, paid for by the university, or simply plug unlikely key phrases into Google.

Thief cartoon
Plagiarism is not a joking matter.

College classes are about more than imparting information about specific fields. Students are often lured to school by the promise of high-paying jobs once they have earned some credential.

Instead, students find that they’re suddenly adrift in a world that only vaguely resembles the one they came from. They have crossed the threshold of the intellectual world.

And in the intellectual world, the rules of the game are different.

The Rules of the Game

It is traditional, in Western culture, to divide experience into the worlds of the Mind, Body, and Spirit. You might have heard the phrase “life of the mind” in reference to working as a college professor — that’s the world professors are supposed to be introducing students to.

"Intellectual Development" (statue) by Hermon Atkins MacNeil
The world of the intellect follows its own rules.

In Western culture, priests and ministers give us guidance in the realm of the spirit, and professors are our guides to the “life of the mind.” (You can read more about the realities of the “life of the mind” on the Chronicle of Higher Education.)

Why is this important? Because plagiarism is a crime in the intellectual world. When a student, or a researcher for that matter, takes someone else’s ideas and uses them as their own, that’s theft. And in the intellectual world, it’s the theft of something valuable.

Together with the falsification of data and unethical research on human subjects, plagiarism is one of the three great crimes that can be committed in the intellectual world.

  • The development of human knowledge requires intellectual honesty.

Academia is based around the collaborative development of ideas. In dialectic, two people argue in order to create a greater truth than either could come up with on their own. But dialectic argument comes with a set of rules, and one of them is that the evidence provided needs to be both true and source-able.

Without the proper sourcing of information, there’s no way for anyone to judge the quality of an argument. In the West, academic discussions don’t exist ex nihilo; they rest on twenty-five hundred years of intellectual development.

  • When a student steals someone else’s ideas, they don’t learn what they were supposed to learn.

College is, at its best, about more than just learning to recite information back in order to prove that you’ve learned it. It’s a process of training the mind for rigorous thought, honest analysis, and a comprehension of the rules of intellect in the West.

In other words, higher education is not about saying the “right” things, but about being able to generate certain types of thought and use specific intellectual tools to analyze data and situations. Academic papers, conference presentations, and all of that are simply the proof of thought and a way of sharing it.

Plagiarism is parroting someone else’s words — but the words weren’t the point. The point was to think, and think in specific ways.

  • Breaking the rules of the intellectual world (plagiarizing) in order to gain recognition (passing a class, getting a degree, etc.) is like stealing a car to get to police academy graduation.

Some students will internalize these rules, and come to understand why they’re important. Some will simply follow a “when in Rome” policy and get through. But in the end, some students still can’t (or won’t!) understand why plagiarism’s a big deal.

What’s to Be Done?

Higher education, especially at the bachelor’s level, is (among other things) a process of socialization into intellectual honesty. Sure, some plagiarism in an undergraduate paper is the metaphorical equivalent of a kid stealing a candy bar. But, like stealing candy as a kid, there should be serious consequences.

Plagiarism should not be tolerated because it undermines the purpose of education and academia. Students need to learn that there are consequences. That doesn’t mean kicking them out of college for the slightest infraction. But it does mean that students need to apologize, feel intellectual shame, and correct the problem.

Being given a “zero” doesn’t teach a student anything. Kicking them out of school doesn’t teach them either — though it may be necessary for the good of the overall institution.

Like any other form of socialization, inappropriate behavior requires the deft hand of a teacher. There’s no one-size-fits-all solution for every student. But any student who thinks it’s “not a big deal” is missing the point of education.


Maslow and Materialism

As the third part of an examination of materialism in Western culture, we move to the realm of psychology. If Marx’s influence is still felt in the world of rhetoric, then we need only look at Abraham Maslow for materialist influence in psychology and on identity. Although a psychologist himself, Maslow had as one of his mentors the anthropologist Ruth Benedict — famous for her psychological anthropology research as part of the World War II war effort.

Maslow’s Contributions

Abraham Maslow
Abraham Maslow (1908 – 1970)

An American psychologist, Abraham Maslow proposed a now-famous analytical tool that is commonly referred to as “Maslow’s Hierarchy of Needs” (1943, “A Theory of Human Motivation”). That wasn’t his only contribution — he is attached to other concepts that worked their way into the popular consciousness, such as “self-actualization” (originated by Kurt Goldstein, but developed by Maslow) and “peak experiences.”

Peak experiences are moments when people, as individuals, act harmoniously with both themselves and their environment. Self-actualized people are described as those who have more frequent peak experiences than others.

Maslow’s work is a conscious response to Sigmund Freud and Freudian theory. He felt that Freud looked at the etiology of “sick” people, and so Maslow determined to look at the features of those who are “healthy.”

Another part of Maslow’s approach was that there was a place for spirituality in the healthy person. If that’s the case, then why even look at his work when discussing materialism? Because by bringing spirituality into the mix, Maslow managed to flip the Great Chain of Being on its head!

Maslow's Hierarchy of Needs
Maslow argues that material factors are a necessary condition for self-development and personal growth.

“Maslow’s Hierarchy of Needs” as an Explanatory Model

You might be familiar with Maslow’s Hierarchy of Needs. This model of human needs is certainly culturally bound, but at the same time it can help us make sense of our own lives. The Hierarchy of Needs is a useful tool, especially when it comes to imposing order on the mental landscape.

Maslow might not have offered the “be all and end all” of human understanding, but his approach does offer some helpful insight. One aspect of his hierarchy is that it actually tells us what needs to be accomplished before we move on to the next stage — if we accept his underlying assumptions about the nature of people.

Maslow’s hierarchy says that we must handle physiological concerns before we handle issues of safety, and safety before we handle love and belonging, and so on up the hierarchy. But the system has an underlying materialist assumption to it.

By placing human needs in a hierarchy of discrete categories, the model sidesteps the ways that these needs interrelate with one another. It says, “Material needs are more important to humans than emotional ones, and emotional needs must be met before one can move on to such aspects as morality.” It implies that a hungry person is an immoral one.
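That strictly sequential claim can be sketched as a toy model (my own illustration, not anything Maslow formalized — the level names are the standard textbook labels):

```python
# A toy model of the hierarchy's sequential assumption: each level must be
# satisfied before the next one becomes a person's active concern.
LEVELS = ["physiological", "safety", "love/belonging", "esteem", "self-actualization"]

def active_need(satisfied):
    """Return the lowest unmet level -- on the model's assumptions, the only
    level that motivates behavior right now."""
    for level in LEVELS:
        if level not in satisfied:
            return level
    return "self-actualization"  # all needs met

# A hungry person is, on this model, motivated only by physiology:
print(active_need(set()))                        # physiological
print(active_need({"physiological", "safety"}))  # love/belonging
```

The model has no way to represent, say, a hungry person acting morally — which is exactly the oversimplification at issue.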


The Hierarchy of Needs, by its very assumptions, says that the material world is more fundamental than the emotional or spiritual. That is arguably the case in some situations, but at the same time, it is probably an oversimplification of humans as a whole.

Following on the theme of Does It All Really Come Down to Economics? and Materialism and Science, I think it is worth pointing out that the underlying metaphor of Maslow’s hierarchy is materialist itself. Where it escapes from Marxist materialism is that it actually looks at the development of the self beyond just the material.

The Hierarchy of Needs admits that humans “do not live on bread and water alone.” Despite this admission, the hierarchy posits that such material needs are

  • a) more fundamental to human existence than others, and that they are
  • b) separable from these other needs.

By its very presentation, Maslow’s hierarchy argues that basic needs are not tied to work, belonging, self-awareness, and awareness of a greater good.

On the one hand, Maslow’s materialism does provide a more comprehensive model of human experience than the Marxist analysis. On the other hand, it uses materialism as an analytical tool to simplify humans.

If I stretch the assumptions of his stages to representative categories of being, his model says that humans are first biological organisms (needing physical sustenance), then animals (with emotional needs), and then social beings who need companionship (like primates), and finally true humans (who try to accomplish great things).

Maslow and the Great Chain of Being

Maslow’s Hierarchy of Needs, in metaphor, turns the assumptions of the Great Chain of Being on end but keeps them all in place. It says that the most fundamental aspects of the person are the biological, and works its way up from there to spirituality at the top. His model does not, however, disrupt the Great Chain otherwise.

Where the original Great Chain of Being (with its Neoplatonist origins) assumed that the spiritual/mental was the most fundamental part of human nature, and that the physical world was the least important, the hierarchy simply changes the assumptions to materialist ones and says that the physical is more fundamental.

If Marxist analysis argued for positivist materialism, then Maslow moved away from that perspective to one where materialism was not the only worldview of worth — just the most important one. Material needs (such as food and shelter) are given a position of importance fully in line with a materialist interpretation of Western culture.

Materialism and Science

My last post, Does It All Really Come Down to Economics?, discussed materialism. Specifically, it looked at the way that Marxist social theory assumes — for analysis purposes — that the only valid worldview is one that describes only the material world. By “material world,” we’re not just describing rocks and sticks and mountains, or the forces of gravity.

9-pounder gun
Pre-democratic voting machine

We’re looking at such ideas as Mao’s “all power comes from the barrel of a gun,” which implies the invalidity, for instance, of the foundational idea of American democracy (quoted from the Declaration of Independence):

“…Governments are instituted among Men, deriving their just powers from the consent of the governed…”

Mao’s suggestion is that guns are more powerful than votes — directly opposing the American value that a gun’s true purpose in the hands of citizens is to maintain the validity of the government. [We can argue how that works in practice, but that’s the idea.]

Materialism, writ large, implies that “might makes right.” It implies that things are only what they seem, and that there is no greater meaning. On the other hand, I would argue that one of the purposes of culture — any culture — is to allow people (as social animals) to work together so that the world is more than Tennyson’s “Nature, red in tooth and claw.”

Yet pure materialism, as a way of viewing the world, isn’t only used in social theory. Similar assumptions are made in any “hard science.” However, the materialist worldview that we associate with science wasn’t a philosophical choice; it was a practical one.

Scientific materialism is a necessary assumption as scientists attempt to reduce experimental data to a value-free “material world.” Assuming materialism reduces the number of variables — a necessary part of experimental design in any scientific research.

What Is Science?

Science is a tool for research.

Science isn’t about developing a new iPhone or computer chip. Admittedly, science has made these amazing things possible through the field of engineering. But that’s not science’s purpose.

Western culture’s “science” (from the Latin scientia: knowledge — but going all the way back to Proto-Indo-European’s skey- “cut, separate, discern”) is a broad term for the way that we expand humanity’s knowledge of the world around us. Science is not necessarily “high tech,” although the boundary might seem blurry — in the popular mind, the two are intertwined.

The purpose of science is growing human understanding of the world. In our everyday cultural perception, science is wedded to technology. In truth, science is a technology — a technology of the mind.

At its root, however, science is a method of inquiry. In doing scientific research, a person performs experiments and then shares the information through peer review. Science is a method, a tool made of pure thought. It is defined by its purpose, not by the physical setting or the laboratory tools used.

The scientific method is a way for people to seek the truth. It is in the same category of mental tools as the dialectic, in that it is a way for people to collaborate in order to increase knowledge for society as a whole.

Living in the Material World

Though we can chase a wholly material worldview, it’s not likely that we’ll succeed in creating one. It has been shown by anthropologists (partly in response to Marxist, materialist analysis) that truly value-free analysis of the world is impossible. It is impossible for us to make sense of the world except through culture.

Let me put it another way: culture’s a feature, not a bug that gets in the way of “proper” scientific analysis. Science can allow us to see past our own biases in limited circumstances (more or less laboratory conditions). But to imagine that science is separate from culture, and somehow more pure, is to fail to recognize that science — like every other tool from the digging stick to the iPod — is a product of culture, not a priori of it.

Does It All Really Come Down to Economics?

In Western intellectual circles, it is sometimes easiest to argue convincingly if we base our underlying assumptions on the material world. By using the material world as a base — often materialist philosophy, economics, and positivist science — we speak a “lower” truth. It’s a trend worth noting, especially when we talk about such topics as modern rhetorical techniques.

One common assumption, often used as a tool of analysis in the social sciences, is the idea that cultural decisions are made for economic reasons. This relates to the concept that ideas can only be judged based on their results, and that the most basic level of analysis is the material world.

Materialism is an intellectual version of “might makes right.” It throws aside Western culture and replaces it with a devotion to a lowest common denominator that must be true in all times and places.

The Influence of Karl Marx

Karl Marx
Karl Marx, 1861 (1818-1883)

If you’ve suffered through 6th grade history (I had an awesome 6th grade history teacher — Thank you, Mrs. S.), then you probably know Karl Marx as one of the two writers who came up with The Communist Manifesto. And if you’ve spent much time in the social sciences, then you’ll be aware of his contribution there: the idea that pretty much everything is determined by what are effectively materialist and economic factors.

“Marxist analysis” is a reaction to Hegel’s dialectic. Hegel’s dialectic is the dialectic of ideas: two ideas clashing, and then eventually unifying into one.

Marx argued that the process of clashing and then unification would take place not in the world of ideas, but in the material world. This was a major philosophical change, but what we want to take away is this: Marxist analysis is materialist.

The worldview of Marxist analysis rejects both religion and intellectualism as valid worldviews. The traditional mind/body/spirit division of Western thought is thrown out, and everything is subordinated to the material.

Rhetoric and Analysis

While Marx and Engels’s conclusions are, from a modern perspective, ridiculously Utopian, it’s their reliance on the material as a source of authority that is their lasting contribution to Western culture. It’s a complete rejection of the philosophical history of the West — similar to the one that was going on in science (writ large) at the same time.

Any time we see an argument that compares two ideas’ values based on their material consequences, that’s at least partly Marx’s influence. In fact, any time we make an argument about how something will play out in the “real world” (material world), we’re leaning on Marx’s philosophy for support — and we probably don’t even know it.

Marxist analysis is a useful tool in some ways: by moving our analysis to the “material,” we theoretically make it possible to add less cultural bias to our analysis. But there’s a flaw that arises as soon as we start to actually believe in the materialist model.

All analysis happens in the world of ideas. If we argue “it’s true that the world of ideas is secondary to the physical world” (as Marxist analysis does), then we’ve just shot ourselves in the collective foot. If the underlying assumptions of our analysis determine that analysis is unimportant, then where do we go from there?

Coming Full Circle

I believe the answer is that we need to challenge the materialist assumptions that we make. We need to remember that Marxist analysis is a tool (a scientific tool at best and a rhetorical tool at worst), and not the one and only effective model of the world. While such analysis can be useful, we need to leave it subordinate, and not reify it as the one true model of the world.

Marxist analysis, like materialist science, cannot do more than tell us the way things are (and, I would argue, lends only one perspective). Such perspectives can be incredibly useful, but don’t even begin to touch on what we should do with our knowledge.

Even if we argue Marxist analysis reveals truth, knowing “the Truth” doesn’t tell us what’s “Right.” For that, we (as Westerners) will continue to rely on Western culture.

[I’m starting a new reading project: Julius Caesar’s Gallic Wars (in translation). You can find my ongoing comments on my What I’m Reading page.]

Random Isn’t (always) Random

In everyday conversation, we use the word “random” to explain a whole host of different situations. We say random when we mean “no discernible pattern.” But on closer inspection, “random” is just some fancy version of “there are too many variables” or even “I dunno.”

A man in bad weather
“It’s wind, man. Blows all over the place.”

Think about it this way. What could be more random than rolling some dice? But in a perfect, virtual world, we could calculate all the physics and variables, and know what the dice would bring.

We know that’s not realistic. In the real world, it’s just a “toss of the dice.”
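A seeded pseudorandom generator makes the same point in miniature: once the “initial conditions” (the seed) are known, the “random” rolls are fully determined. A quick sketch using Python’s standard `random` module:

```python
import random

def roll_dice(seed, n=5):
    """Roll n six-sided dice from a fixed seed -- same seed, same 'random' rolls."""
    rng = random.Random(seed)
    return [rng.randint(1, 6) for _ in range(n)]

# With the seed (the 'physics') known, the outcome is perfectly predictable:
assert roll_dice(42) == roll_dice(42)
# Hide the seed, and the rolls look like "too many variables" again.
```

The rolls only feel random because we don’t usually get to see the seed — just as real dice only feel random because we can’t track the physics.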

We can imagine the same thing for predicting weather. We all know the weather’s hard to predict because it’s a whole bunch of variables and half-understood systems all mashing together to produce results. Or, as the guy said about the weather in The Weather Man, “I don’t know. It’s a guess. It’s wind, man. Blows all over the place.”

The Magic Power(s) of Statistics

So, if we live in this complicated world, how do we make sense of it at all? The long answer is careful analysis and the scientific method, of course. The short answer is often previous experience, “gut” instinct, and a fair amount of bravado.

Is a roll of the dice truly random?

But there’s a stage that falls somewhere in between the two. A basic understanding of the field of statistics helps with the analysis of scientific data (and any other data), but it has two other important advantages.

The first is that it allows us to make clear and rational decisions even with limited data. That’s not because stats makes you a supergenius, but because studying stats gives you new schemas for understanding not just data in general, but different types of data.

We make decisions without enough data every day. We “guess.” In many cases, it’s not even that we have to make the best decision, but that making any decision will allow us to progress. After all, in the real world, we’re not wedded to every decision. We can always change our minds later.

The second is that a clear understanding of statistics prevents us from being hoodwinked by massaged data. That’s right! Not only can statistics help us make good decisions, it can also stop us from being fooled into making bad ones.

Once we have some experience with research design, such classics as “seven out of 10 doctors surveyed love this product” mean a whole lot less. We begin asking questions like “How was the sample group selected?” and, of course, “Which 10 doctors were surveyed?”
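A little stdlib arithmetic shows why those questions matter (the doctor survey here is hypothetical, and I’m assuming doctors who are actually split 50/50 on the product): with samples this small, lopsided results happen by chance surprisingly often.

```python
from math import comb

def prob_at_least(k, n, p=0.5):
    """Probability of k or more 'successes' in n independent trials (binomial)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Chance that at least 7 of 10 coin-flip doctors "love this product":
print(round(prob_at_least(7, 10), 3))  # about 0.172
```

Roughly one survey in six clears the “seven out of 10” bar purely by luck — and that’s before anyone gets to choose which survey to publish.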

In the past, the study of Latin, Greek, and French was necessary for a person to be considered educated. In the information age, a working knowledge of statistics, at least to the point that we know a p-value from a piece of pie, is invaluable.

History Vs. Prehistory

Caught up in our everyday lives, we often forget what a broad, unexplored, and often unexplorable country the past is. Here are a few thoughts and ideas to help us make sense of how much has already happened.

Father of History

Herodotus - Metropolitan Museum of Art
Herodotus – The Father of History (circa 484–425 BC)

Herodotus (c. 484–425 BCE) is often considered to be the progenitor of history, or at least Western history. Herodotus’s Histories date from the 450s to the 420s BCE. We can figure it’s been about 2,450 years since first publication. In the Western Tradition, this is the beginning of “history.”

The Histories (the title in Greek means “inquiry” and is where we get the word “History” — from the Greek ἱστορία) are a far-ranging investigation into the causes of the Greco-Persian wars. They cover the events, the geography, and ethnographic information relevant to the inquiry.

Herodotus’s epithet was first conferred by Cicero (106–43 BCE). There have, however, been some advances in the field of history since then. While we call Herodotus the “Father of History,” the general definition of history as a discipline requires that there be written records to study.

As more discoveries are made, the date for the possible origins of “history” is pushed back further and further. The earliest written record might be the Dispilio Tablet, dated to 5,350 BCE (or about 7,300 BP). That means that we can push the envelope for “history” back thousands of years — even though we can’t actually read the tablet.

Putting History in Perspective

Dispilio tablet text
The text of the Dispilio Tablet, dated to approximately 7,300 BP

Often, we use the field of history to put modern life into perspective.

You’ve probably heard some variation of the quote, “Those who cannot remember the past are condemned to repeat it.” We often take this to mean that we must understand history — but we ignore, perhaps because of what we feel is insufficient data, the terrible fact that most of the human past comes before history was even a glimmer in a scribe’s eye.

Anatomically modern humans appear in the fossil record about 195,000 years ago in Africa. For argument’s sake, if we take Herodotus as the originator of the discipline of history, then all of history is only about 1.26% of the existence of Homo sapiens.

That’s right. History is less than 2% of human existence — WAY less than 2%. In terms of the existence of humans, history could be little more than a statistical error — though I like to think it’s not.

But there have been advances in history since Herodotus, right?

History is linked to writing, so let’s push the date back to the earliest writing we have — arguably the Dispilio Tablet. With a date of 7,300 BP, writing has existed for only about 3.74% of the time that people have.
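The arithmetic behind both percentages is simple enough to check (using the same round figures as above):

```python
HUMANS_YEARS = 195_000     # anatomically modern humans, approximate
HERODOTUS_YEARS = 2_450    # years since the Histories
DISPILIO_YEARS = 7_300     # earliest candidate writing, in years BP

print(round(HERODOTUS_YEARS / HUMANS_YEARS * 100, 2))  # 1.26% of human existence
print(round(DISPILIO_YEARS / HUMANS_YEARS * 100, 2))   # 3.74%
```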

What does all this mean? Trying to understand humans by looking at “history” is (metaphorically) like trying to judge the history of America by looking at only the last three years or so.

Even if we push back the beginning of “history” to the earliest example of writing, a serious stretch of the imagination, the metaphor tells us that we must judge all of America on only the last 8 years, 10 months, and about 10 days — not even as long as we’ve had Twitter.

To Be Fair

It might seem that I’m implying that history isn’t very old just because it only covers a small part of the human past. Just as an exercise, let’s think about how many generations ago the earliest date for history is.

Let’s say that a generation is 20–30 years long (they’re probably shorter on average), and then decide to make it a nice 25 years for easy math. That means that Herodotus wrote the Histories a whopping 98 generations ago.

Yes, there are nearly 100 generations of history. From that perspective, history seems as old as it should — and regains its relevance. Of course, that also implies, for round numbers, that anatomically modern humans have existed for a staggering 7,800 generations.
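For round numbers, the generation math above works out as:

```python
GENERATION_YEARS = 25      # the round figure assumed above
HERODOTUS_YEARS = 2_450    # years since the Histories
HUMANS_YEARS = 195_000     # anatomically modern humans

print(HERODOTUS_YEARS // GENERATION_YEARS)  # 98 generations of history
print(HUMANS_YEARS // GENERATION_YEARS)     # 7,800 generations of humans
```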

Teaching: Building Schemas and Personal Style

The first semester I ever taught intro to cultural anthropology, a student came up to me at the very end, turned in his final, and explained how he found the subject matter boring. (He also thought I should be a stand-up comedian — so he let me off the hook on that one.) I had a few different reactions to that, one after the other.

First Thought: “He’s Right.”

From the outside, people think of anthropology as exciting (if not terribly lucrative). Anthropologists travel to far off places, wrestle with the “big” questions, and seek answers in places that angels sometimes fear to tread. Sounds pretty exciting, huh?

A basic cultural anthropology class substitutes reading for travel, challenges assumptions but rarely provides firm answers, and spends its time trying to feed students enough information to challenge their underlying assumptions about the nature of the world in which they live.

Anthropology 101 is necessarily thick with jargon, difficult and challenging concepts, and — above all — plenty of reading. While it’s only an introductory course, it’s introducing a whole new way of seeing the world. Doing all of that in just 14 weeks makes it challenging for the professor as well.

So, yeah, the student was right. The first uncertain steps into a larger world are, indeed, boring.

Second Thought: “That’s Okay.”

When we can make education entertaining, it’s definitely a plus. Especially when challenging students’ assumptions about the world, a sense of humor and some compassion go a long way.

On the other hand, college teachers aren’t professional entertainers. The material should be as engaging as we can make it, and yes, that means keeping up with technology and current trends in culture.

At the same time, teaching anthropology is about introducing students to information that is intellectually interesting. But that kind of interesting isn’t always as captivating as Here Comes Honey Boo Boo or even Downton Abbey. At our best, we’re generating a passion for learning.

Third Thought: “Teach the Way You Teach”

As a new teacher, it wasn’t easy to hear that the field I had spent years studying was boring. But on the flip side, the beginning studies of every field are “boring.” They have to be. It’s all jargon and new ways of thought.

So how do we help students past this hump? For new teachers, reading books on how to teach can help. Working with mentors regarding developing your own teaching style also helps. Even, and perhaps especially, socializing with other teachers can give a new teacher a boost.

The “trick” (or at least my “trick”) to teaching, I believe with all my heart, is developing a personal style that fits not just your target student population, but who you are.

Teaching is about connecting with students. It’s about leading them on the path to knowledge, not standing on top of a metaphorical hill and shouting for them to “come this way!”

Leadership means developing some level of trust, and that means “creating” a stable intellectual environment for them to explore. It also means making it enticing enough for them to engage.

I believe that teaching is always personal. Teaching isn’t just about the information, but about the connection. We talk about “mentoring relationships” with students, but in a sense every one of those students is being mentored. If they’re not, they might as well just go read a book.

I found that, at least for my teaching style, I had three main methods for keeping the students engaged:

1. Be passionate about the material.

I know it’s a cliche, but if the teacher’s not excited by the material, the students aren’t going to be, either. Maybe it’s something you’re born with, maybe it’s just having been bitten by the “teacher bug.” Who knows? But if you’re having a good time with the students, it all works out so much better.

2. Remember (and remind the students) that you’re not just imparting information.

Teaching is a form of leadership. For a couple of hours a week, we lead students into new areas of thought and introduce them to knowledge that will change them. With that leadership comes responsibility, and acceptance of that responsibility means developing trust. This is especially important in cultural anthropology, where the topics can sometimes veer into the incredibly personal — race, child marriage, female circumcision, religion, and war can all be pretty challenging topics even before our students’ personal experiences get involved.

You’re helping them address the big questions of life. Remind them that what they’re learning is a necessary step…but that it leads them where they want to go.

3. Accept feedback for what it is.

If you’re a teacher, you know how invaluable the right feedback can be, and how painful the wrong feedback is. Teachers sometimes have to brush off what can seem like harsh criticism, knowing that their only shield is that their boss has also been a teacher. Every teacher I’ve ever known has received bad feedback from one student or another. And I’ve never known one who wasn’t upset by it.

This, of course, is the danger of making one’s teaching style personal. But trying to teach by rote, rather than delivering students what they need, seems like a terrible waste of a vocation.