
ISBN: 0029117062

ISBN13: 9780029117064

How We Know What Isn't So


Format: Paperback

Condition: Very Good

$4.89
Save $15.11!
List Price $20.00

Book Overview

Thomas Gilovich offers a wise and readable guide to the fallacy of the obvious in everyday life. When can we trust what we believe--that "teams and players have winning streaks," that "flattery works," or that "the more people who agree, the more likely they are to be right"--and when are such beliefs suspect? Illustrating his points with examples, and supporting them with...

Customer Reviews

5 ratings

Beliefs and Bias

Mr. Gilovich says ". . . there are inherent biases in the data upon which we base our beliefs, biases that must be recognized and overcome if we are to arrive at sound judgments and valid beliefs." The cost of these biases is real and severe. This book explains why people are prone to wrong thinking and how they can counteract it. Here are points that Mr. Gilovich made:

1. Seeing Order in Randomness - We all have a natural tendency to see order in data, even when the data is totally random and irregular. We do this even when we have no personal reason to see order, and especially when we remember facts from the past: our memory plays tricks on us by emphasizing any possible patterns and forgetting irregularities that might refute them. For instance, basketball players often think that if they make one successful basket, they are more likely to make the next one, because they remember times when this has happened to them. "When you're hot, you're hot." However, objective statistical studies of when successful baskets are made show that, if anything, the opposite is true. This natural tendency to misconstrue random events is called the "clustering illusion." Chance events often seem to have some order to them, but when the law of averages is applied objectively, the order disappears. The error is compounded when our active imagination tries to create theories for why there should be order. Because of this, we need to be careful when we draw conclusions based on a sequence we think we see in some data. (A short simulation illustrating this point follows this review.)

2. Looking for Confirmation - We all have a natural tendency to look for "yes" instead of "no." If we have an idea, we tend to look for evidence that will confirm it, not evidence that will disprove it. This is true even when we have no personal attachment to the idea. Some researchers believe this tendency results from our need to take an extra neurological step when we try to understand negative or disconfirming evidence, in contrast to positive or confirming evidence: to understand a negative proposition, we may need to translate it into a positive one. Therefore, we subconsciously look for easy positives instead of more difficult negatives. This does not promote objectivity and good science. If we want to do good science, we need to force ourselves to look for negative evidence that contradicts our ideas.

3. Hidden Data - When we search for evidence, there is often data that we unintentionally overlook. For instance, if we receive a bad impression of a person from the beginning, we may avoid them, and by avoiding them, they may never have a chance to show us the better side of their personality. But if we receive a good impression, we may get to know that person better, gather more positive data, and falsely confirm in our mind that first impressions are reliable. The way we collect data may filter out important categories of data, and this may cause us to confirm our
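To make the reviewer's "clustering illusion" point concrete, here is a minimal Python sketch (my illustration, not from the book; the 50% hit rate and sequence length are arbitrary assumptions). It simulates shots that are completely independent of one another and shows that long "hot" streaks appear anyway, while the chance of a hit after a hit is essentially the same as after a miss.

    # Clustering illusion sketch: independent shots with a fixed hit
    # probability still produce long streaks, and the hit rate after a hit
    # is about the same as after a miss.
    import random

    random.seed(1)
    P_HIT = 0.5          # assumed shooting percentage
    N_SHOTS = 100_000    # length of the simulated sequence

    shots = [random.random() < P_HIT for _ in range(N_SHOTS)]

    # Longest run of consecutive hits in the random sequence.
    longest = current = 0
    for hit in shots:
        current = current + 1 if hit else 0
        longest = max(longest, current)

    # Conditional hit rates: does a hit make the next shot more likely?
    after_hit = [b for a, b in zip(shots, shots[1:]) if a]
    after_miss = [b for a, b in zip(shots, shots[1:]) if not a]

    print(f"longest hit streak: {longest}")
    print(f"P(hit | previous hit)  = {sum(after_hit) / len(after_hit):.3f}")
    print(f"P(hit | previous miss) = {sum(after_miss) / len(after_miss):.3f}")

Both conditional rates come out near the underlying 50%, yet streaks of a dozen or more hits still occur, and it is exactly those streaks that memory latches onto.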

Sloppy thinking has its price

A well-written book that focuses on the common errors human beings make when trying to comprehend the world around them and form opinions. Central points: that we try to make order out of chaos, even when there is no order; that we filter what we hear according to our own biases; and that wishful thinking can distort reality. He sets up the cases in a very readable way, and then gives examples of a few erroneous beliefs and their consequences. This is where you may find some disagreeing with him. The case studies include ESP and homeopathy. If you subscribe to those fallacies, you will probably be challenged during that section of the book. Since there are NO reputable studies that support them, that is to the good. Finally, he gives us a clue about how we can better evaluate the information we are presented with. While not a scholarly work on "Critical Thinking" (such as "Asking the Right Questions: A Guide to Critical Thinking"), it would be a wonderful companion book to "Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time" by Shermer and Gould. You owe it to yourself to read this and consider it fairly.

Should be required reading in every college

This book teaches one of the most important lessons anyone can learn: that we all make mistakes. Most of us overestimate the frequency of events which receive wide media coverage, like plane crashes. We strain to find significance in random data, and believe things we want to believe even if there's no evidence to support them. "How We Know What Isn't So" explains why, and shows us how to overcome the factors which produce such systematic error. Until very recently, many of the most egregiously false claims never reached a broad audience. Now that the Web allows anyone with rudimentary skills to create an impressive-looking, authoritative-sounding Web site, the lessons of this book are more important than ever.

A-ha!

The Sports Illustrated curse is NOT real. Our gut feelings about winning streaks and losing streaks are way off. And there's a kind of illusion that makes punishment look more effective than it probably is and reward look less effective than it probably is (reward has a tougher row to hoe, in fact). These are among Gilovich's more memorable points. Each is backed up with plain reasoning AND hard data. It's just the kind of book that'll make you THINK about what you're thinking. An excellent start down that path, one we all need to take. I enjoyed it and got a lot out of it, and I have re-read parts of it a few times in the years since I first bought it. Written by a social psychologist for a lay audience, it's well organized and easily digestible, as long as you are willing to stop and think every so often as you read. I'd like to see this book handed out to every new college student, or, maybe better, required reading for every high school student.
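The punishment-versus-reward illusion mentioned above is usually attributed to regression to the mean, and a small simulation makes it visible. This is a minimal Python sketch (my illustration, not taken from the book; the skill and noise parameters are arbitrary assumptions): performance is modeled as constant skill plus random luck, so unusually bad outings tend to be followed by improvement and unusually good outings by decline, whether or not anyone punishes or praises.

    # Regression-to-the-mean sketch: performance = fixed skill + random noise.
    # Trials that follow an unusually bad outcome improve on average, and
    # trials that follow an unusually good outcome decline on average, even
    # though nothing about the performer changes between trials.
    import random

    random.seed(1)
    SKILL = 0.0        # constant underlying ability
    NOISE = 1.0        # standard deviation of trial-to-trial luck
    N_TRIALS = 100_000

    scores = [SKILL + random.gauss(0, NOISE) for _ in range(N_TRIALS)]

    # Change on the next trial after a bad outcome (the kind that draws
    # punishment) versus after a good outcome (the kind that draws reward).
    after_bad = [b - a for a, b in zip(scores, scores[1:]) if a < SKILL - NOISE]
    after_good = [b - a for a, b in zip(scores, scores[1:]) if a > SKILL + NOISE]

    print(f"average change after a bad outcome:  {sum(after_bad) / len(after_bad):+.2f}")
    print(f"average change after a good outcome: {sum(after_good) / len(after_good):+.2f}")

The first number comes out positive and the second negative by statistics alone, so an observer who punishes after bad outcomes and rewards after good ones will conclude that punishment works and reward backfires.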

An easy-to-read primer on faulty reasoning

This book's strength lies in Gilovich's ability to make science, statistics, and the tools of critical thinking accessible to anyone. Armed with the examples and reasoning of this excellent work, anyone would be able to stand up to the most fervent proponents of bogus "phenomena" like certain alternative therapies and the easy lure of seeing "extra-sensory" connections where none exist. Most importantly, Gilovich is able to explain in simple language why the average reader should be wary of anecdotal evidence and should always look at such "evidence" in its overall context. In other words, the book brings home the importance of the scientific method and tenaciously holds to that standard. Interestingly, in the case of the smallest bit of empirical evidence for ESP (the "Ganzfeld" experiments), the book recites the data without bias against such phenomena. Instead, as is his way, Gilovich simply follows the data where it leads. The author should rank in the same league as Stephen Jay Gould and Carl Sagan in terms of bringing science to the lay reader.