Category Archives: Rationality

Science: Revolutionary, Heir, or Rebellious Child? (Part 2)

Dergoumidas before the Grand Vizier - by Giovanni Antonio Guardi
“Dergoumidas before the Grand Vizier” — Knowledge is one of the currencies of power.

It’s very easy for us to think of Science as a discrete part of Western culture, yet somehow transcendent at the same time. Perhaps that’s true as far as it goes, but understanding Science means looking at more than just how it differs from what came before in Western history.

Seeing science clearly requires understanding that it is not only a new way of learning about the world, but also the heir to a tradition that goes back at least to the Ancient Greek philosophers, and more likely to Ancient Egypt.

Science is more than a method of collecting and analyzing data; it’s also a part of Western culture. Science is not just a way of thinking that replaced alchemy; it is also alchemy’s cultural heir. More, in some ways science is heir to the whole tradition of the Western educational system.

Science, the Heir

We’re very used to thinking of Science with its triumphal plot arc: led by men who broke from the past and created a new world of rationality and order. But to understand the cultural role of science, rather than just the method, we need to comprehend the part that knowledge has played in the West going all the way back to the beginning of Western civilization.

As described in my last post, Science (writ large) rose to cultural power in the times following the Black Death in Europe. It was a time when the Church’s cultural and political power was, after a thousand years, facing serious challenges to its credibility. But Science didn’t spring out of nowhere. More accurately, it was an outgrowth of the work of the scholars of the West.

The growth of Western knowledge of the world had long been limited in certain ways by the political and religious realities of the power of the Church. The rise of science came at a time when the hierarchy of power was being shaken.

Yet the institutions of Science did not develop in a vacuum. Instead, Science built on preexisting infrastructure, such as the institutions of learning that had been associated with the Church. That inheritance made scientists bedfellows with politicians and kings.

Like other teachers of knowledge (Latin: doctor, “teacher”), scientists carried a responsibility to advise political leaders when called upon. We sometimes focus on research and the scientific method when we try to understand science, but it is just as important to understand its relationship to the rest of society.

Science, the Grand Vizier

Fiction is the new mythology, and it gives us a window into otherwise hidden aspects of culture. Have you ever noticed the way science is treated in Western fiction? For the most part, it’s not very science-y. Most treatments of science have little to do with the research role of the scientist, and everything to do with their social role.

Though books are more likely to take the time to explain it (and there are whole sub-genres of science fiction that treat scientific research and discoveries with some consideration), in most movies and television, Science is treated as a discipline of almost magical wonder.

In fiction, Science is the source of MacGuffins of wondrous power, the way characters turn “can’t” into “can,” and an almost miraculous heal-all. Even where fictional science is used to solve problems, it is presented in mythological terms. The way science is shown inverts the actual pattern of scientific research.

In the real world, science is used to explore the unknown. In fiction, by contrast, science serves to drive the plot forward. The “Scientist” archetype has less to do with Einstein, and more to do with Merlin. The fictional Scientist takes the role of the wise advisor who understands the nature of the world.

Science not only devotes itself to furthering human knowledge, but also serves society. It serves by advising leaders, using its superior knowledge of the world. This pattern places it firmly in the same tradition as a thousand stories of King’s Counselors and Grand Viziers. In the US, the National Academies fill this role.

By acting as learned advisors of kings (and presidents), the scientists have taken over the role that alchemists, astrologers, and other old scholars once held. This isn’t the scientist in the laboratory, but the scientist who stands behind the throne.

Yet the learned have often counseled the leaders of the world. In some ways, science ushered in a new era of knowledge; in other ways, scientists have put a new face on an old role: the wise advisors of beneficent leaders, helping them to make decisions that benefit all people.


Random Isn’t (always) Random

In everyday conversation, we use the word “random” to explain a whole host of different situations. We say “random” when we mean “no discernible pattern.” But on closer inspection, “random” is often just a fancy version of “there are too many variables” or even “I dunno.”

A man in bad weather
“It’s wind, man. Blows all over the place.”

Think about it this way: what could be more random than rolling dice? But in a perfect, virtual world, we could calculate all the physics and variables, and know exactly how the dice would land.

We know that’s not realistic. In the real world, it’s just a “toss of the dice.”
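The thought experiment is easy to act out on a computer, where the “hidden variables” really are knowable. Here’s a minimal Python sketch (the function name and seed are my own invention for illustration):

```python
import random

# A computer's "random" dice roll is actually deterministic: it unfolds
# from a hidden starting state (the seed). Fix the seed, and the same
# "random" rolls come out every time.
def roll_dice(seed, n=5):
    rng = random.Random(seed)  # pin down the hidden variables
    return [rng.randint(1, 6) for _ in range(n)]

print(roll_dice(seed=42))
print(roll_dice(seed=42))  # identical rolls: "random" meant "unknown," not "uncaused"
```

With the seed hidden, the output looks like chance; with the seed known, it’s just arithmetic: the software version of calculating all the physics of a real throw.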

We can imagine the same thing for predicting weather. We all know the weather’s hard to predict because it’s a whole bunch of variables and half-understood systems all mashing together to produce results. Or, as the guy said about the weather in The Weather Man, “I don’t know. It’s a guess. It’s wind, man. Blows all over the place.”

The Magic Power(s) of Statistics

So, if we live in this complicated world, how do we make sense of it at all? The long answer is careful analysis and the scientific method, of course. The short answer is often previous experience, “gut” instinct, and a fair amount of bravado.

Is a roll of the dice truly random?

But there’s a stage that falls somewhere in between the two. A basic understanding of the field of statistics helps with the scientific analysis of data, and it has two other important advantages besides.

The first is that it allows us to make clear and rational decisions even with limited data. That’s not because stats makes you a supergenius, but because studying stats gives you new schemas for understanding data: not just the numbers themselves, but the different kinds of data and what each can tell you.

We make decisions without enough data every day. We “guess.” In many cases, it’s not even that we have to make the best decision, but that making any decision will allow us to progress. After all, in the real world, we’re not wedded to every decision. We can always change our minds later.

The second is that a clear understanding of statistics prevents us from being hoodwinked by massaged data. That’s right! Not only can statistics help us make good decisions, it can also stop us from being fooled into making bad ones.

Once we have some experience with research design, such classics as “seven out of 10 doctors surveyed love this product” mean a whole lot less. We begin asking questions like “How was the sample group selected?” and, of course, “Which 10 doctors were surveyed?”
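We can even watch a biased sample mislead us in simulation. A toy sketch in Python, where the population size, the percentages, and the “sponsored dinner” are all invented purely for illustration:

```python
import random

rng = random.Random(0)

# Hypothetical population: 1,000 doctors, of whom about 30% genuinely
# love the product. (All numbers here are made up for illustration.)
doctors = [{"loves_product": rng.random() < 0.30} for _ in range(1000)]

# Suppose the survey is run at the sponsor's dinner, which fans are far
# more likely to attend than non-fans.
for d in doctors:
    p_attend = 0.50 if d["loves_product"] else 0.05
    d["attended_dinner"] = rng.random() < p_attend

attendees = [d for d in doctors if d["attended_dinner"]]

def fraction_loving(group):
    return sum(d["loves_product"] for d in group) / len(group)

print(f"whole population: {fraction_loving(doctors):.0%} love it")
print(f"dinner attendees: {fraction_loving(attendees):.0%} love it")
# Surveying 10 dinner attendees easily yields "7 (or more) out of 10 doctors."
```

Nothing about the arithmetic is wrong; the deception lives entirely in how the sample group was selected.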

In the past, the study of Latin, Greek, and French was necessary for a person to be considered educated. In the information age, a working knowledge of statistics, at least to the point that we know a p-value from a piece of pie, is invaluable.

The Specter of Rationality

One of the challenges of living in modern Western culture is the specter of rationality.

“Specter”? Indeed.

Western culture has (or had, some might argue) an expectation, provided by 19th-century notions of progress, of everything in the world being explainable and definable. It was, in fact, the promise of Science writ large — that given enough time and knowledge, the world would make sense and there would be a single, rational worldview that explained all things.

Thinking Straight

Human brain
There are biological limits to thought.

Rationality is the logical application of previous knowledge to new situations. It allows us to perform analysis, to block out irrelevant information, and to quickly make decisions that are, if not perfect, at least consistent with our experience of the world.

The trouble we face in approaching perfect rationality is that the human mind is only capable of knowing and processing so much.

Rationality depends on access to sufficient data, and no human mind is capable of learning, or even storing, every specific detail of the world. For that reason, we narrow down what information we gather. The fields of knowledge that we, either as individuals or as a culture, choose to study have a huge impact both on our kinds of analysis of data, and on the conclusions we reach.

I’ll try to paraphrase a quote I (probably mis-)remember reading in my teen years:

“Expertise does not cross fields. Unfortunately, this doesn’t stop experts from trying.”

It’s very common for specialists in any one area of knowledge to try to extend their expertise to other fields. For simple and obvious examples, we have Stephen Hawking making declarations on religion, or Pope Urban VIII’s condemnation of Galileo. However, this kind of messy thinking in the name of rationality happens every day.

Human culture has become incredibly complex, and so for any one person to master multiple areas of study is incredibly rare. Worse, the amount of “new” information being generated every year is growing exponentially.

At the level of mastery, specialization has become more and more important. Why? Because the human mind has limits: there are basic biological constraints on how much information we can learn, store, and process.

In order to get around the limitations of the human mind, we have developed tools to help ourselves. Sure, we use computers to store and process information — I’d be lost without Microsoft Excel — but literacy was an earlier and arguably more important invention. But the limits of the human mind still exist.

To put this another way: “knowledge” is often not just a set of information, but specific ways of thinking. For example, accountants think differently from engineers or plumbers, marketers or scholars of religion. Each group is trained to collect and process information differently, and that’s a good thing!

Each way of thinking makes specific neural connections, and channels of thought are created through practice and repetition. While people aren’t limited to only one way of thinking, each skill set takes time and effort to acquire.

We deal with this problem through specialization. An accountant, for example, applies a rational model to the intake, processing, and distribution of funds. This kind of rationality, while extremely useful, is limited in its application. And I’m not picking on accountants. This problem repeats itself in any area of study.

Rationality always depends on assumptions of cause and effect. Those assumptions are influenced by experience, by culture, and by specialization. Perhaps just as importantly, it is always bounded by the limits of data collection, storage, and processing.

It’s impossible for a human to be “perfectly” rational. The good news is that this can help us make sense of the world, know who to listen to about what, and make the best decisions possible.

The bad news is that we’re never going to live in a perfect, rational utopia. We’re going to continue to make decisions in that messy, organic way best represented by “office politics.” But hey, we’re primates anyway. It’ll be fun.

The Collapse of Western Civilization

Primate with globe
The world is a bigger and more diverse place than Western Civilization ever imagined.

We can joke that this event or that is a herald of the collapse of Western Civilization, but from an outside perspective, it collapsed some time ago. The “end” of Western Civ happened more or less with the end of World War II and the subsequent dismantling of the overt colonialism that had in many ways supported Europe and America since the 1490s.

It was not a loss of power that made Western Culture collapse. It was actually the rise of near-instant communication. It was that we could no longer live in an insulated society, knowing with devout certainty that we had the capital-“T” Truth.

One of the hallmarks of Western Civilization has been its belief that there is “one true way” and that it is, and has always been, best reflected by our own culture. In other words, such recent ideas as multiculturalism, ecumenical religion, and racial harmony fly directly in the face of the traditional Western lessons passed on from generation to generation.

Science and Authority

We discovered, in science, a relatively unbiased path to truth. Now, we could argue that science isn’t perfect in this, but compared to more authoritarian structures of knowing (including anything heavily influenced philosophically by Neoplatonism or monotheism), it’s downright awesome.

Unlike such single-authority philosophies, science can and does tell us things we don’t want to hear. Science, at its best, teaches us how to be authorities, not just how to submit to them. Of course, that means we can get on the wrong track sometimes. Mistakes happen. But when we put science to questionable ends (eugenics, for example), the scientific method eventually gets us back on track.

But Science Isn’t Culture

What ended Western Civilization was a combination of several factors:

  • the belief that there is one truth for all people, times, and places,
  • the belief that we have an obligation to seek out that truth, and
  • (most key!) the belief that our culture should be coterminous with that truth.

When all these things ran up against the stupendous variety of the world, it shook Western Civilization to its foundations. Western culture still exists and even flourishes, but it would be more accurate to think of it as Western “cultures.” We’ve replaced our assumption of a hierarchical system with multiple systems that cooperate and compete.

We have moved our culture closer to science’s view of the world, and decided that if the world is complex and open-ended, then our culture must be, too. This is an experiment probably just as grand as the one called “democracy.” I look forward to watching it as it unfolds.

Rationality Isn’t Rational

Chimpanzee - Leipzig zoo
“Hey, who’re you calling irrational?”

As I’ve discussed before in this blog, our Western image of humans as rational is very much culturally bound. It’s strange, really, that calling someone “irrational” is such a slur. Logic itself, held so widely as a source of strength, shudders under the weight of our everyday lives. Despite our assumptions, values, and best intentions, we act irrationally (without reason) much of the time in daily life.

Why do we say, “Bless you” when someone sneezes? Is the devil really going to fly up their nose? Why do we shake hands? Are we really making certain they’re unarmed?

Are these really “rational” behaviors? No, they’re social rituals, cultural artifacts that don’t make “rational” sense, but have meaning nonetheless. Linguistically, they have pragmatic meaning. Culturally, they have ritual meaning.

When we’re doing what we think are rational things, it’s not usually rationality or logic that is informing our actions. It’s culture. Culture is filled with little rituals that make all the difference in terms of social interaction.

Don’t get me wrong; just because something’s “ritual” that doesn’t make it “untrue.” Rituals — in an anthropological sense — are social conventions, and they allow us to interact by giving us patterns of action that are not only predictable, but allow us to routinize behaviors. Imagine driving on the road if we didn’t all agree that we should drive on the right (or left!). It would work, but it would be inefficient and dangerous.

When we say that someone’s acting “irrationally,” it doesn’t usually mean “illogically” so much as “not in a way that we expect.” That doesn’t make it any less strange, but we might as well call it what it is. “Irrationality” isn’t a logical judgement — it’s a social judgement. We’re placing someone’s behavior against a matrix of cultural expectations and finding it wanting.

In other words, we’re not using “logic” as a process of thinking. We’re using it as a social value. Can logic and rationality really be a dispassionate process and a cultural value at the same time? That’s a tough question, and the answer is probably “no.”

Logic — What It Is, and What It Isn’t

Vulcan CosPlayer
Logic does not transcend cultures.
(CosPlay is another matter.)

If logic were truly a purely rational way of thinking, then it would be spectacularly powerful, and people would study philosophy as avidly and pragmatically as they study computer science or accounting.

Don’t laugh! There was a day, not too long ago, when this was true. When we lived our lives (mostly) in one culture, the ability to think “logically” was much more useful. When our everyday world was less “messy” and overarching values were hierarchically determined, “logic” had more meaning.

The problem with logic is the same problem that exists for all modes of thought. Logic is subject to the GIGO (garbage in, garbage out) principle. Even if we think logically about things, many of our basic perceptions of the world are based on irrational assumptions and basic drives.

  • I might be rational about buying a car, but that doesn’t mean I’ve rationally examined my need for one.
  • And even if I do, that doesn’t mean I’ve rationally examined all aspects of its impact.
  • And if I decide I don’t need one, then I’ve decided that the environment is more important than any benefit I might derive.

Logic, by itself, doesn’t get us much of anywhere. What’s worse, anthropology — which allows us to look at the world cross-culturally — tells us that pretty much everything we think about is culturally bound.

Contrary to popular belief, logic and rationality don’t transcend culture. Because our assumptions about the world are culturally bound, anything we apply logic to is necessarily bound by these same constraints.

That doesn’t make us helpless, and it doesn’t make logic useless. We can work against our constraints and try to get beyond them. Logic is incredibly useful for making certain that our thoughts all agree with one another. Logic and rationality are powerful tools.

Rationality Isn’t Rational (But It’s Still Pragmatic)

If our cultural beliefs about logic and rationality aren’t everything they’re cracked up to be, then what’s the point of studying at all? Just about every discipline is about learning to think clearly about certain matters.

Mathematicians learn to think clearly about numbers and data. Chemists learn to think clearly about how atoms, molecules, and compounds act and interact. Lawyers learn to think clearly about law and precedent. Anthropologists learn to think clearly about culture.

Die Chemiker
Understanding the whole of the world is too much for any one person.

Every field of study trains the mind to think in new ways. Being a chemist isn’t a matter of memorizing the periodic table and then going on to learn more and more details of the chemical world through rote memorization. It’s about learning whole new ways of thinking — of forcing neurons to line up and march together to get certain results.

These areas of study are called “disciplines” for a reason. That’s what they are: disciplined ways of thinking about certain aspects of the world.

We might think marketing is crazy, but marketers think far more rationally about the buying and selling habits of people than mathematicians ever will — or they’d be very bad at their jobs. The behaviors that marketers study and try to modify aren’t rational, but that’s a different matter.

People in general aren’t rational about buying and selling. The people who try to be are called “investors.” Investing is a discipline with its own rules and assumptions about the world, with its own rationality. Those who study what investors do are called “economists.”

What we can learn from this is that there is no one “rationality” — rationality isn’t just one thing. The world is far too large and diverse for any one of us to know things with perfect rationality, but that doesn’t mean we shouldn’t study it.

It does, however, mean that we should be wary of making purely “rational” decisions. And doubly wary when we’re convinced that we’ve managed to nail down even a piece of the truth.

The Useful Myth of Progress

During an intro to cultural anthropology class, I mentioned (I thought offhandedly) that progress itself is a cultural myth. One of the class’s best students looked startled and concerned, and raised his hand. “No, it’s not,” he said firmly, and with complete conviction.

It is an incredibly difficult thing for people of any culture to have their most cherished values challenged. And progress is one of the key values of Western culture.

Things Just Keep Getting Better

For a moment, let’s think of Western culture as a book. That book, like books of any genre, has an underlying theme that tells us a lot about who the winners will be, who the losers will be, and what it all means.

For modern Western Culture, the theme could well be “progress.”

The Unknown by John Charles Dollman
“Progress” is the value that allows us to accept that the future is unknown and that new ways of living must be found.

“Progress” is the idea that things just keep getting better and better. It has several corollaries, such as the idea that all “improvements” are somehow inherently good, and that greater efficiency is somehow a good in and of itself.

The schema of “progress” mixes together several ideas in what amounts to a value judgement. It takes the idea of complexity, and attaches it to “the good.” We believe that the complex is somehow better than the simple.

Why Progress?

There’s a good reason that we value progress. The last 10,000 years or more have seen a steady increase in the human population and in its density. Throughout that time, change has been a constant.

From the time of the first cities (around 7,500 BCE), after the advent of farming and the Neolithic revolution, humans have lived in higher population densities than ever before.

Despite being social animals, we primates aren’t given to getting along in large numbers. We have had to develop cultural systems to manage the problems of putting so many territorial primates together. The developments in culture that came with urbanization — stratification, rule of law, etc. — required, and allowed for, ever greater cultural complexity.

As the human population keeps growing, increased innovation, in technology and in other aspects of culture, becomes necessary to maintain relative peace.

Mythologizing Progress

Painting of Eridu, one of the first cities
Progress supports increased population, which in turn requires new methods of organization and management.

As population density increases, populations have to spend more and more effort not only on resource management, but also on some kind of cultural “research and development.” Otherwise, conflict breaks out as competing groups lash out under the increased pressure. Ever-increasing population density means ever-increasing complexity in human methods of management.

Even such recent inventions as the Internet, seen through this lens, are tools for allowing increasing numbers of primates to somehow manage in a changing and increasingly stressful (in a primate sense) world.

“Progress” is required to maintain some kind of counterbalance to increasing biological pressures. More than that, our culture values “progress” as an idea. We invest in it. That investment is what allows us to maintain and develop the complex cultural systems that keep us relatively fed, healthy, and safe.

The idea that “progress is always good” is tied to the underlying assumption that human population density will always increase. As long as the population keeps going up, progress is a necessity.

Progress — increased complexity and efficiency — is necessary, but that doesn’t make it perfect. The ways that we understand progress are cultural. “Progress is good” is a statement of values, values shaped by specific realities and tied to the underlying assumptions of Western culture. Our ideas of progress determine winners and losers just as much as any war.

“Progress” is one of our bedrock assumptions about the world, and it is a needed one in these times. Yet despite its utility, it is cultural, an idea. Increased complexity is necessary, but that doesn’t mean everything labeled “progress” is inherently good.

Truth Vs. Trend

Spy Vs. Spy
There is often a difference between what “everyone” believes and the truth. Some social-science models of behavior include assumptions of feedback loops that allow us to quickly modify our behavior to be in tune with the reality of situations.

Feedback loops take time, aren’t 100% accurate, and are subject to manipulation. Even pursuing the truth is, for many people, only useful insofar as it serves other goals. And sometimes pursuing a commonly held untruth is more personally profitable.

In other words, the truth isn’t sexy.

Feedback Loops Take Time

When we believe things that aren’t true, and we either share or act on that information, eventually the world will let us know. That’s a feedback loop. But it’s important to recognize that feedback loops aren’t instantaneous.

To start with a simple, even silly example:

A cup of coffee, black
I take my menko black.

Let’s say I wake up one morning believing that coffee’s not called coffee anymore; it’s called menko. I get up and make myself some menko: no feedback since I don’t even read the label.

Next, I’m talking to some people and I mention that my menko is great, and that I really need my menko in the morning. Sure, these people think I’m weird as heck, but they might not say anything. They might just think, “Smile and nod at the strange man waving a coffee cup.”

Worse, the people I’m talking to might figure it out from context and play along, giving me false confirmation: “Yeah, yeah, I need my menko in the morning, too.” Maybe they even think it’s a brand name, or something.

Finally, I head to a coffee chain and ask for a small decaf menko with soy milk. Maybe, just maybe, this kind stranger will let me know that I’m using the wrong word. It’s their business, and it’s in their interest.

Eventually, someone will give me feedback: the word is coffee, not menko. But that takes time, and it takes someone willing to make the effort to correct my mistake.
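That delay can be modeled directly. In this toy simulation (the 10% correction rate is an arbitrary assumption, not data), each listener bothers to correct me only one conversation in ten, so the expected wait before feedback is about ten conversations:

```python
import random

rng = random.Random(1)

# Toy model of a slow feedback loop: in each conversation, the listener
# notices my "menko" mistake but only corrects me with probability 0.1
# (most people just smile and nod).
def conversations_until_correction(p_correct=0.1):
    count = 0
    while True:
        count += 1
        if rng.random() < p_correct:
            return count  # someone finally closes the loop

trials = [conversations_until_correction() for _ in range(10_000)]
average = sum(trials) / len(trials)
print(f"average conversations before feedback: {average:.1f}")  # about 1 / 0.1 = 10
```

The feedback does arrive eventually, but the rarer the correction, the longer the wrong belief survives: halve the correction probability and the average wait doubles.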

Some Feedback Loops Are Bigger Than Others

On a larger scale, the social feedback loop is pretty important for checking information. Most people, normal people, don’t do independent research on everything. Instead, they look for social feedback on “facts.” As long as our beliefs are common enough to “work” for our purposes, or at least don’t get in the way of our goals, we’re not going to get feedback to correct them.

Bransfield Iceberg
How about some iced menko?

The menko example is pretty silly, and it’s a situation where negative feedback’s going to come pretty quickly. But what about larger questions, like “Is there such a thing as human-caused climate change?”

Climate change isn’t a topic that fits well with social feedback; all that will tell us is whether our friends believe the world is getting warmer. If, instead of relying on specialists (you know, the science-y folks with their labs, samples, and data), we subject the information only to social feedback loops, we’re going to have to wait until things have gotten so bad that nearly everyone has a terrible story of climate change to share.

Most people don’t use scientific data in their everyday feedback loops, relying instead on existing connections (social, political, religious, and so forth) to inform their opinions. In most situations, this doesn’t matter all that much. The data we receive socially is good enough, especially since we primarily act socially anyway.


Sometimes, the acceptable “facts” are at odds with the actual facts. This happens all the time. Let’s say that we work for a company, PerkyCorp, which prides itself on having an awesome corporate culture. It’s always a party and everyone gets along. But somewhere, hidden in the recesses, something is wrong.

“I’m Cassandra, the Archetype of the Bearer of Bad News, and I approve this message.”

While everyone at PerkyCorp is busy getting along awesomely, and getting promoted for being good team players, decisions might be getting made that are taking the company off the rails of competitive success. What? How?

If everyone’s busy getting along, and getting rewarded for it, there’s no advantage for them to close the feedback loop. Now, there are two sides to this:

  1. Being part of any group, from a business to a political party, involves a certain amount of “drinking the Kool-Aid.” As long as things don’t get too bad, the social-feedback loop is going to be the primary method of decision making. If the Kool-Aid tastes awful, those who can smile while they drink it will rise to the top.
  2. People who choose to address burgeoning problems do so at their own peril. Making changes in the group, assigning responsibility, and generally being hierarchical is seriously un-fun.

Solving problems that a group doesn’t know it has doesn’t make someone a devoted employee with the interests of the company at heart (even if they are); it makes them a Cassandra. Being a Cassandra with data and proof isn’t good enough, either. If PerkyCorp can’t see that the feedback loop is closing around their collective necks, telling them isn’t going to help.

In short, if PerkyCorp’s groupthink isn’t telling them that there’s a problem, trying to solve the problem isn’t saving the day; it’s harshing their buzz.