In case you're new: Thankful Thursdays is a series in which, every Thursday, I write about one thing I'm thankful for. Whether it's something grand, like the time we live in, or something small and specific, nothing is off limits. Check out my intro post for more on why I'm doing this, and how it might help you too.
In elementary school, there was a moment that broke my heart. It was on par with finding out Santa isn’t real, or that propeller hats don’t actually make you fly. We had a lesson called “Library”, which we always looked forward to, because it mostly involved sitting in bean bags as one of the librarians read us books. As we entered fourth grade, however, it became a bit more practical.
From then on, it became more focused on how to actually use the library for research purposes. Of course, books were a part of that, but this was circa 2006, so we were taught some of the basics of using the internet for research. As much as we hear about fake news today, sifting through bullshit was always an issue, and so one of the greatest focus areas for us was how to tell misinformation from trustworthy sources.
I don’t remember much of what they taught us, but I do remember the heartbreak when, as part of a practice assignment, we were told that an island for dogs to go on vacation was not, in fact, real. I’d recognise the Web 1.0 look and the pictures if I saw it, but I couldn’t tell you for the life of me what that fake site was called.
As much as it hurt, I couldn’t be more thankful I went through that because, even if I may not remember the details, it taught me that you can’t trust everything you read (or hear, or see), no matter how awesome it sounds or how much it agrees with you. I most certainly did not take that advice to heart straight away. I probably ignored it for years, but the foundation was there, ready to be reinforced by two further parts of my education: Theory of Knowledge and my Psychology Bachelor’s.
Theory of Knowledge (or ToK) was a class that became mandatory for our final two years of school, and will be familiar to anyone else who took the IB (International Baccalaureate), an educational system common at international schools. The core focus was to uncover how we know things, and touched on elements of philosophy and science. Ultimately, the point was to uncover the assumptions we make and to be skeptical.
One of the first examples they used was your car. It’s easy to think “It started every day for the last eight years, which means it will start today”. Of course, we know that cars degrade over time and that a myriad of things can happen to it which will prevent it from working at some stage. It may be likely to start, but it’s far from a definitive bet. Either way, the fact that it started in the past is not a reason for it to start again. Same goes for the sun – it’s risen and set for millions of years, so it’ll obviously keep doing that, right? Well, not forever.
We learned about concepts like inductive and deductive reasoning, but this post isn’t about the details (though I do intend to have a full-blown guide to this stuff at some stage). My point is just that I was taught to be skeptical, and that has tremendous benefits for my quality of life, which will become clear as I touch on the final part of my education that reinforced and enhanced this for me:
The very first lecture in my Psychology degree was perhaps the most important. Before we even dove into a single bit of psychology, we were told about the history of the scientific mindset, with much of modern science stemming from Karl Popper’s idea of falsifiability.
In essence, the idea of any scientific experiment is not to try and prove a hypothesis right, but to prove it wrong. It’s easy to find some evidence supporting an idea and call it a day, but if several well-designed attempts to prove an idea wrong fail, it’s that much likelier to be true. This changed the way I look at “new evidence” reported for things, as well as my mindset towards answering my own questions. Often those two go hand in hand:
Previously, if I wanted to eat healthier, I’d do a quick Google search and believe the first news article linking, say, red meat consumption to cancer. That was a big news story when it came out, and, finding it as part of my research, I took it as established fact. Since learning about falsifiability, however, I try to do two things: one, I specifically seek out information that contradicts the claim, and two, I try to find the original source.
Contradicting information helps you gain perspective on any topic. Perhaps you consider points you hadn’t previously, and the truth is on the opposite end of where you thought it was (or, as is often the case, somewhere in the middle). Other times, by seeing both arguments, you can quickly tell which side is completely made up, but you’d never know that from just reading one side of any debate.
As for the original source, that’s the crux of what I learned over the years. It’s one thing to find the OG source (which Library class taught me), but actually reading the damn thing is a whole other story. Throughout my degree, we learned about good study design, which prepared us to look for signs of good and bad design in the research papers we read: signs of the researchers interpreting beyond what the data alone says, and questions left unanswered. What other factors could explain the results that the researchers didn’t account for?
If I rarely write (or speak) in definite terms, this is where I picked that up. As much as media headlines tend to sensationalise and skew evidence, when you look closely, you’ll notice a lot of “maybe”s and “could”s and “is linked to”s, and not “causes” or “proof” or other such definites. So it’s equally on us not to read what isn’t even written there. This perfectly demonstrates yet another thing my education made me more conscious of: our own psychological biases.
Whether we like it or not, we all have mental habits that make mental life less of an effort but lead to all sorts of misconstructions of reality. One well-known example is confirmation bias: seeking out information that agrees with us, whether by cherry-picking sources or even subconsciously blocking out the parts of a paper that argue against what we believe. While we may never completely get rid of these biases, I’ve found that learning about them helps me, on occasion, catch myself. Without that self-awareness, I couldn’t then correct myself and do better.
In addition to written text and our own minds, there’s yet a third source of information we often turn to, but which frequently ends up horribly wrong: other people. In my early dieting days, in search of a quick answer, I’d ask mum or dad what I should eat, or whether two bananas would make for a nutritious brekkie. Quick answers aren’t necessarily good ones, however: like a lot of people, they were right some of the time, but not all the time. And there’s no way of knowing which times they’re right without either doing my own research or following their diet advice for decades to see whether it benefits me or kills me. If you’re reading this, mum and dad, I didn’t mean to single you out! It’s just an easy example a lot of us can relate to. Ask experts in their fields; otherwise, don’t take anyone’s word as gospel.
In short, science by its very nature is open-minded and flexible. There’s no such thing as “proof” because that would be final; there’s always the possibility that future experiments reveal the truth to be quite different. I learned early on to remove “proof” from my vernacular, and instead use “evidence”. Evidence is more solid than an opinion – it’s based on empirical data. But it can also one day be shown to be flawed, with better evidence superseding it. It may sound like an effort to be skeptical at first, but once it’s become a mindset, a mental habit, it clears the head of so much stubbornness and misinformation.
Why all this is useful and makes me happy
You might be thinking “big whoop, so you can be right more often than before, but why is this so special?”
A very fair question! I believe this ability has an indirect, but crystal clear relation to my happiness in that the information we find and believe in dictates our behaviour. If we believe, as many do, that fats are bad, we might buy more low-fat or fat-free products, which often replace the fat with sugar. Our opinions of what’s healthy won’t prevent us from getting diabetes, however. In fact, in this case, our opinion is exactly what led us there.
My point being that parsing genuine insights from misinformation or misinterpretation when it comes to how to live a happy life is invaluable. The extra effort is worth it a thousand times over if I end up eating food that gives me energy, sleeping well, and exercising in ways that help rather than harm my body over the long term. Evidence suggests that money, beyond a certain point, doesn’t make you any happier, so with that knowledge I can already stop chasing it. Put simply, putting this information to good use makes me feel better in my day-to-day life. If that’s not worth it, I don’t know what is.
And if that’s not enough for you, consider that the information we trust affects not only ourselves, but other people. The pandemic provides a prime example: if we don’t believe the Coronavirus exists, or that it’s entirely harmless, we’ll keep living life as we do, endangering other people in the process by being likelier to catch and then spread the damn thing. Even when that’s over, there’s this whole global warming thing. If you don’t believe it’s caused by human activity, you’ll keep supporting industries that cause the Earth to warm up.
Put simply, failing to separate what’s true from what isn’t is not only bad for you, it’s selfish.
Of course, you still need to take action – knowledge alone won’t fix things. Plenty of people believe in global warming and complain about it while buying food imported from across the globe and driving to a store within perfectly walkable distance. But knowledge is necessary, and without it, doing the right thing would be that much harder.
Too much skepticism?
Another thing I hear many saying (maybe I’m projecting, but I’ve had such thoughts myself) is “but then you wouldn’t believe anything! Wouldn’t it be frustrating and scary to not fully trust anything and just be skeptical of everything?”
Well, it’s not quite that extreme. It’s not so much that I don’t believe anything; it’s more that a) I need a bit more evidence (and of a certain quality) to hold onto a belief, and b) I try not to hold onto that belief too tightly, so that I can one day change it in light of new and better evidence.
It’s a mental habit, so like with many new habits, the effort is mostly in the transition. Once it’s up and running, however, the amount of potential it opens up is endless. For one, by having a better idea of what to trust, you can save yourself the exhaustion and frustration of being faced with conflicting information (like what foods are good or bad for you), because you don’t take both at face value, saving yourself the cognitive dissonance. You do take both options seriously, however, until you verify one side (or a third side in the middle).
Above all, however, you can live life and rest assured that, as long as your decision-making is based on the truths you uncover, you’ll be on the right track.