How Do You Know What You Know?

Richard Feynman was a Nobel Prize-winning physicist. I have enjoyed his thoughts and thinking methods for decades. It is interesting how his ideas about science apply in every field of study, from psychology to botany to international trade.

One of them concerns the futility of holding on to beliefs for too long.

“It is impossible to find an answer which someday will not be found to be wrong.”

Using answers effectively

Many of us feel overwhelmed by how much information there is to know, even though we have advanced greatly in our ability to access it. I vaguely recall Einstein offering the advice that you should not remember anything you can look up. That may be a little over the line, but given the internet, not as far over as you might first think.

The advantage of not remembering things is that you won’t become confused when they change from right answers to incomplete or wrong ones.

Unlearning is as important as learning, and strangely, far more difficult.

The necessary step

When you realize that knowing and remembering a lot of things is not as useful today as it might have been in 1900, you become a little more humble. Using information well matters more than what the information is. Once you let go of the idea that knowing a lot of things is the point, you will be freer of ego, and possibly useful.

How to think

Thinking is far harder than people believe. In its place, some people have adopted the “received wisdom” model, where they ingest the outcome of someone else’s thinking process. If they do not notice they are doing that, or do not challenge the received wisdom by comparing it to the alternatives they could have chosen instead, they end up worse off than before.

Thinking is not optional.

The first step in thinking is understanding evidence.

Received wisdom seldom comes with evidence you can challenge. You may have noticed that evidence is never presented in court without the other side challenging how it was collected, its chain of custody, its implications, and its application to the case at bar. Evidence also comes in types: is it eyewitness, physical, or circumstantial?

  • Very little eyewitness evidence is reliable. I once had a friend, a former police officer, who claimed to mistrust all of history: “I have seen two eyewitnesses to the same car crash whose accounts shared almost no detail. Not even the colour of the cars.” It comes down to observation and memory. We do not observe everything, and as we rethink an event we amend our memory slightly to “perfect” it. It stops being objective.
  • Physical evidence is more reliable. Blood, fingerprints, bullet analysis, DNA, and a hundred more tests generate just one result. The challenge will be about whether the methods used to collect and assess it were likely to lead to conclusive results.
  • Circumstantial evidence is the fascinating one. The rule is that the evidence must point to the conclusion and must not reasonably point to any other. Sherlock Holmes opined on this: “Circumstantial evidence is occasionally very convincing, as when you find a trout in the milk, to quote Thoreau’s example.”

The second step is analyzing meaning

Drawing meaning from a mass of evidence is difficult, and people seldom understand the results presented. Life is complicated, and there are few, if any, large studies that give unambiguous results. Smaller studies may show interesting correlations. People find correlation compelling. Scientists treat a correlation as a suggestion of where to study more deeply in search of causation.
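To illustrate why scientists treat correlation as a lead rather than a conclusion, here is a rough Python sketch with invented numbers: two variables track each other closely only because a third factor drives both.

import random, statistics
random.seed(0)

# Invented example: hot days drive both ice-cream sales and sunburn counts.
temps     = [random.gauss(25, 5) for _ in range(500)]
ice_cream = [3 * t + random.gauss(0, 5) for t in temps]   # driven by temperature
sunburns  = [2 * t + random.gauss(0, 5) for t in temps]   # also driven by temperature

r = statistics.correlation(ice_cream, sunburns)   # requires Python 3.10+
print(f"correlation between ice cream sales and sunburns: {r:.2f}")
# A strong correlation appears, yet banning ice cream would prevent no sunburns.

The correlation is real, but the causal story lives in the third variable, which is exactly why a correlation is a place to dig rather than an answer.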

Statistical analysis is difficult, and a large data set can often support more than one interpretation. When I was at university, a person in my residence was a civil engineer doing a master’s degree specializing in traffic management. He had data that fit a lovely bell curve, but for a few outliers. On further examination, the outliers were the real data and the rest were a collection of errors generated by the faulty equipment used in the study. Oops.

If people have preconceived ideas about what will be found, interpretation errors occur.

Most of us cannot reliably analyze data even when it is available. We can, however, learn enough to assess the conclusions presented.

Assessing conclusions instead of data

This is not so hard.

Look for the p-value. That is the number that hints at whether the conclusion might be reliable. The p-value is the probability that chance alone would produce a result at least as striking as the one reported; a p-value of .01 means that if there were really nothing going on, a result this strong would appear only about one time in a hundred. You’ll see a related idea, the confidence level, in many studies. Political polls, for example. A poll reported with 95% confidence says Candidate A is favoured by 45% and Candidate B by 47%, and it will specify a margin of error, often plus or minus 3%. The meaning is that Candidate A is likely favoured by 42 to 48 percent of the whole population the sample represents, and B by 44 to 50 percent. The 95% confidence means there is about one chance in twenty that the true number falls outside that range. A 95% confidence level (roughly a p-value of .05) is considered high in social research, but it still means one chance in twenty of being wrong.
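To see where a margin of error like plus or minus 3% comes from, here is a short Python sketch using the poll shares from the paragraph above and an assumed sample of 1,000 respondents (real polls disclose their actual sample size); it uses the standard normal approximation for a proportion.

import math

def margin_of_error(p_hat, n, z=1.96):
    # z = 1.96 corresponds to a 95% confidence level.
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

n = 1000  # assumed sample size, for illustration only
for name, share in [("Candidate A", 0.45), ("Candidate B", 0.47)]:
    moe = margin_of_error(share, n)
    print(f"{name}: {share:.0%} plus or minus {moe:.1%} "
          f"(range {share - moe:.0%} to {share + moe:.0%})")

With about 1,000 respondents the margin works out to roughly plus or minus 3 percent, which is why that figure turns up so often in published polls.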

Never trust a survey without knowing who was sampled and how

It is not difficult to design a questionnaire to bias results.

Understand the idea of random sampling. Statistical analysis works when the data points are independent of each other. Without care, one person’s answer may be influenced by another’s. Surveying both spouses in a household may produce a different result than surveying two people in different states, as the sketch below suggests.
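Here is a rough simulation of the spouses example, with entirely made-up numbers: each household holds one shared opinion, so interviewing both spouses doubles the pile of questionnaires without doubling the information.

import random, statistics
random.seed(1)

def run_poll(n_households, survey_both_spouses):
    # Each household holds a single shared opinion; return the observed support.
    answers = []
    for _ in range(n_households):
        opinion = random.random() < 0.5
        answers.append(opinion)
        if survey_both_spouses:
            answers.append(opinion)   # the second spouse echoes the first
    return sum(answers) / len(answers)

# Repeat each poll many times to see how much the estimate actually wobbles.
independent = [run_poll(500, False) for _ in range(2000)]
clustered = [run_poll(500, True) for _ in range(2000)]

print("spread, 500 independent people  :", round(statistics.stdev(independent), 3))
print("spread, 500 couples (1000 forms):", round(statistics.stdev(clustered), 3))
# Twice the questionnaires, but essentially the same uncertainty.

Both runs wobble by about the same amount even though the second collects twice as many forms; a naive margin-of-error formula applied to 1,000 forms would overstate the precision.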

Sometimes the method biases the result. If I survey only people with a landline, I am unlikely to have a random sample of all people, no matter how carefully I arrange the sample.
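A minimal sketch of the landline problem, again with invented numbers: when the people you can reach differ systematically from the people you cannot, careful randomizing within that group does not rescue the estimate.

import random
random.seed(2)

# Invented population: 30% of people have a landline, and landline owners
# happen to favour the proposal more often than everyone else does.
population = []
for _ in range(100_000):
    has_landline = random.random() < 0.30
    supports = random.random() < (0.60 if has_landline else 0.40)
    population.append((has_landline, supports))

true_support = sum(s for _, s in population) / len(population)

# A perfectly random sample of 1,000 -- but drawn only from landline owners.
landline_only = [s for hl, s in population if hl]
sample = random.sample(landline_only, 1000)

print(f"true support in the population: {true_support:.1%}")
print(f"landline-only poll estimate   : {sum(sample) / len(sample):.1%}")

The poll comes back around 60% while the real number is closer to 46%, and no amount of shuffling inside the landline group would have fixed it.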

In longitudinal studies, the conditions under which the data were collected may not have stayed the same throughout. For example, if I take temperature data from a weather station that has been in the same place for 150 years, it will likely show gradual warming. In 1870 it sat in a farmer’s field a mile from town; now it sits alongside a shopping centre with a 20-acre asphalt parking lot.

Sometimes the data is adjusted. You would want to know how. Some adjustments make sense, like adjusting for the parking lot’s heat gain.

Is all the data included? You would not have to throw out very many observations to completely change the outcome of a study. Most studies exclude some of the chosen participants. You should know why, and whether the process for excluding them was a reasonable one. Sound studies disclose how they selected and what they dropped.
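A tiny Python example, using invented measurements, of how little trimming it takes to move a result:

import random, statistics
random.seed(3)

# Invented data: 100 measurements centred near 50.
data = [random.gauss(50, 10) for _ in range(100)]

trimmed = sorted(data)[5:]   # quietly discard the five lowest readings

print(f"mean with all 100 points : {statistics.mean(data):.1f}")
print(f"mean after dropping five : {statistics.mean(trimmed):.1f}")
# Five 'inconvenient' observations out of a hundred are enough to nudge the result,
# and in a close question that nudge can be the whole story.

Whether an exclusion like that is good practice or mischief depends entirely on the reason, which is why the reason has to be disclosed.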

It comes down to just one thought.

If the presenters cannot or will not disclose their data and their reasoning, it is usually because they don’t want to. There is more than one way to interpret the data, and they like just one of those ways.

Your approach should be simple.

  1. Dismiss everything if they don’t show their work. You don’t necessarily need to know what to do with the data; the fact that they will not disclose it is what matters.
  2. Mistrust experts speaking outside their field. The ultracrepidarian opinion is valueless. (Nice word.) As Feynman said, “A scientist thinking about non-scientific things is just as dumb as the next guy.”
  3. Be very cautious with journalists, even objective ones. Scientists explain science so the listener can understand it; parts deemed too complex or relatively unimportant go unmentioned. Journalists then write the story based on their idea of what they heard. The emphasis, often the details, and always the nuance are missed. An alternative viewpoint is seldom presented, because stories get complicated if contrary opinion is permitted.
  4. Watch for what scientists don’t know yet, or don’t yet understand fully. Mistrust certainty. Science is never certain of anything.
  5. Mistrust politicians on matters of science. Politicians know little about science. They rely on a scientist’s interpretation of the meaning, presented to a journalist or advisor, who then presents it to the politician. That leaves plenty of room for misunderstanding. Worse, the politician takes what they want from the presentation, and it can bear very little resemblance to the original meaning.

The takeaway

Received wisdom is dangerous.

Be a skeptic. Use your common sense to help you validate what you are shown.

Science is never settled. As Feynman put it, “Today we say that the law of relativity is supposed to be true at all energies, but someday somebody may come along and say how stupid we were.”

There is always something else to know. It might be the important thing.

Curiosity is your friend.


I help people have more retirement income and larger, more liquid estates.

Call in Canada 705-927-4770, or email don@moneyfyi.com
