One of the challenges of living in modern Western culture is the specter of rationality.
Western culture has (or had, some might argue) an expectation, rooted in 19th-century notions of progress, that everything in the world is explainable and definable. It was, in fact, the promise of Science writ large — that given enough time and knowledge, the world would make sense and there would be a single, rational worldview that explained all things.
Rationality is the logical application of previous knowledge to new situations. It allows us to perform analysis, to block out irrelevant information, and to quickly make decisions that are, if not perfect, at least consistent with our experience of the world.
The trouble we face in approaching perfect rationality is that the human mind is only capable of knowing and processing so much.
Rationality depends on access to sufficient data, and no human mind is capable of learning, or even storing, every specific detail of the world. For that reason, we narrow down what information we gather. The fields of knowledge that we, either as individuals or as a culture, choose to study have a huge impact both on our kinds of analysis of data, and on the conclusions we reach.
I’ll try to paraphrase a quote I (probably mis-)remember reading in my teen years:
“Expertise does not cross fields. Unfortunately, this doesn’t stop experts from trying.”
It’s very common for specialists in any one area of knowledge to try to extend their expertise to other fields. For simple and obvious examples, we have Stephen Hawking making declarations on religion, or the Church’s condemnation of Galileo. However, this kind of messy thinking in the name of rationality happens every day.
Human culture has become incredibly complex, and so it is rare for any one person to master multiple areas of study. Worse, the amount of “new” information being generated every year is growing exponentially.
At the level of mastery, specialization has become ever more important. Why? Because the human mind has limits: there are basic biological constraints on how much information we can learn, store, and process.
In order to get around the limitations of the human mind, we have developed tools to help ourselves. Sure, we use computers to store and process information — I’d be lost without Microsoft Excel — but literacy was an earlier and arguably more important invention. But the limits of the human mind still exist.
To put this another way: “knowledge” is often not just a set of information, but specific ways of thinking. For example, accountants think differently from engineers or plumbers, marketers or scholars of religion. Each group is trained to collect and process information differently, and that’s a good thing!
Each way of thinking makes specific neural connections, and channels of thought are created through practice and repetition. While people aren’t limited to only one way of thinking, each skill set takes time and effort to acquire.
We deal with this problem through specialization. An accountant, for example, applies a rational model to the intake, processing, and distribution of funds. This kind of rationality, while extremely useful, is limited in its application. And I’m not picking on accountants. This problem repeats itself in any area of study.
Rationality always depends on assumptions of cause and effect. Those assumptions are influenced by experience, by culture, and by specialization. Perhaps just as importantly, it is always bounded by the limits of data collection, storage, and processing.
It’s impossible for a human to be “perfectly” rational. The good news is that recognizing these limits can help us make sense of the world, know who to listen to about what, and make the best decisions possible.
The bad news is that we’re never going to live in a perfect, rational utopia. We’re going to continue to make decisions in that messy, organic way best represented by “office politics.” But hey, we’re primates anyway. It’ll be fun.