The Economist is asking why people believe in nonsense. Well, I will tell you:

Carl Sagan, Richard Feynman, and others have known it for years: we need to teach critical thought to young people, but the established powers that be are scared of that. Arguments from authority are how you control people, and if you give people the tools to see through them, you destabilize the status quo.

We want to raise our children well, and teach them how to be powerfully intelligent, functioning, well-adjusted people in our society, but we spend their youth lying to them about fat men in chimneys, mischievous rabbits hiding eggs, and secret fairies hoarding teeth. How are they supposed to separate “playful” fiction from staggering lies?

It’s no wonder people believe in this nonsense – we’ve spent the majority of their formative years reinforcing anti-logical habits of thought, which undermine any ability to debunk nonsense like this.

Hence, relatively intelligent people can’t pick apart conspiracy theories, ranging from the idiotic (Bigfoot) to the even more idiotic and insulting (truthers).

We need lessons in how to develop our critical faculties. Period. That is all.

A good start is Carl Sagan’s Baloney Detection Kit:
http://users.tpg.com.au/users/tps-seti/baloney.html

The following are suggested as tools for testing arguments and detecting fallacious or fraudulent arguments:

  • Wherever possible, there must be independent confirmation of the facts.
  • Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
  • Arguments from authority carry little weight (in science there are no “authorities”).
  • Spin more than one hypothesis – don’t simply run with the first idea that caught your fancy.
  • Try not to get overly attached to a hypothesis just because it’s yours.
  • Quantify, wherever possible.
  • If there is a chain of argument, every link in the chain must work.
  • “Occam’s razor” – if there are two hypotheses that explain the data equally well, choose the simpler.
  • Ask whether the hypothesis can, at least in principle, be falsified (shown to be false by some unambiguous test). In other words, is it testable? Can others duplicate the experiment and get the same result?

Additional issues are:

  • Conduct control experiments – especially “double blind” experiments where the person taking measurements is not aware of the test and control subjects.
  • Check for confounding factors – separate the variables.

Common fallacies of logic and rhetoric

  • Ad hominem – attacking the arguer and not the argument.
  • Argument from “authority”.
  • Argument from adverse consequences (putting pressure on the decision maker by pointing out dire consequences of an “unfavourable” decision).
  • Appeal to ignorance (absence of evidence is not evidence of absence).
  • Special pleading (typically referring to god’s will).
  • Begging the question (assuming an answer in the way the question is phrased).
  • Observational selection (counting the hits and forgetting the misses).
  • Statistics of small numbers (such as drawing conclusions from inadequate sample sizes).
  • Misunderstanding the nature of statistics (President Eisenhower expressing astonishment and alarm on discovering that fully half of all Americans have below average intelligence!).
  • Inconsistency (e.g. military expenditures based on worst case scenarios but scientific projections on environmental dangers thriftily ignored because they are not “proved”).
  • Non sequitur – “it does not follow” – the logic falls down.
  • Post hoc, ergo propter hoc – “it happened after so it was caused by” – confusion of cause and effect.
  • Meaningless question (“what happens when an irresistible force meets an immovable object?”).
  • Excluded middle – considering only the two extremes in a range of possibilities (making the “other side” look worse than it really is).
  • Short-term v. long-term – a subset of excluded middle (“why pursue fundamental science when we have so huge a budget deficit?”).
  • Slippery slope – a subset of excluded middle – unwarranted extrapolation of the effects (give an inch and they will take a mile).
  • Confusion of correlation and causation.
  • Straw man – caricaturing (or stereotyping) a position to make it easier to attack.
  • Suppressed evidence or half-truths.
  • Weasel words – for example, use of euphemisms for war such as “police action” to get around limitations on Presidential powers. “An important art of politicians is to find new names for institutions which under old names have become odious to the public.”

Above all - read the book!

---

Right-click and “save as” to download the whole book as a PDF. ENJOY.

You can also click and read it in a tab as a PDF.

The Demon-Haunted World by Carl Sagan

About Uncle Fishbits

I'm.. just this guy, you know?
