
Friday, February 8, 2013

February Book Review 2013

I am a little late in getting this out. Between travel and sickness, I had no energy. But here it is:

The Believing Brain: From Ghosts and Gods to Politics and Conspiracies—How We Construct Beliefs and Reinforce Them as Truths, by Michael Shermer, 2011


1. Opening Thoughts
2. Table of Contents
3. Further Reviews and Summaries
4. Quotes from the Book

Opening Thoughts

Shermer’s book is good, but somewhat disturbing. He relates how, as a young man, he decided to follow Jesus. He became part of an evangelical church and set about witnessing to those around him. But after some experiences he did not understand, and as he delved deeper into the academic world, he gave up on the church, Jesus, and God. He came to believe that Christianity was just another religion, no more or less valuable than any other. He became an ardent skeptic, willingly debating Christians and others about the existence of God, and went so far as to found the Skeptics Society. In this book he sets out to provide an understanding of how we come to believe in God and other things. The focus is on beliefs: how our brains come to believe and then keep those beliefs.

If you are interested in how the brain works, and I think you should be, then this book is helpful. As we examine our lives we realize that beliefs form the primary operating system for what we say, do, and think. Understanding how beliefs are formed and reinforced helps us become more aware of the things that are driving us. Beliefs are not, and cannot be, simply a list. Beliefs are woven together into stories that become part of our story. These stories become the foundation for how we react to the world around us. The book does become a bit wordy as descriptions and anecdotal information start to outweigh the actual truths being presented.


Table of Contents
Prologue: I Want to Believe p. 1

Part I Journeys of Belief
1 Mr. D'Arpino's Dilemma p. 11
2 Dr. Collins's Conversion p. 26
3 A Skeptic's Journey p. 37

Part II The Biology of Belief
4 Patternicity p. 59
5 Agenticity p. 87
6 The Believing Neuron p. 111

Part III Belief in Things Unseen
7 Belief in the Afterlife p. 141
8 Belief in God p. 164
9 Belief in Aliens p. 188
10 Belief in Conspiracies p. 207

Part IV Belief in Things Seen
11 Politics of Belief p. 231
12 Confirmations of Belief p. 256
13 Geographies of Belief p. 280
14 Cosmologies of Belief p. 304


Further Reviews and Summaries

Read a fuller book report: Read Now


Michael Shermer's TED Talk: Click to View


Quotes from the Book

The following are some excerpts from the book:

In the cortex of our brains there is a neural network that neuroscientists call the left-hemisphere interpreter. It is, in a manner of speaking, the brain’s storytelling apparatus that reconstructs events into a logical sequence and weaves them together into a meaningful story that makes sense. The process is especially potent when it comes to biography and autobiography: once you know how a life turns out it is easy to go back and reconstruct how one arrived at that particular destination and not some other, and how this journey becomes almost inevitable once the initial conditions and final outcomes are established.

Our brains are belief engines, evolved pattern-recognition machines that connect the dots and create meaning out of the patterns that we think we see in nature.

This process of explaining the mind through the neural activity of the brain makes me a monist. Monists believe that there is just one substance in our head—brain. Dualists, by contrast, believe that there are two substances—brain and mind. This is a very old problem in philosophy dating back to the seventeenth century when the French philosopher René Descartes put it on the intellectual landscape, with soul the preferred term of the time (as in “body and soul” instead of “brain and mind”). Broadly speaking, monists assert that body and soul are the same, and that the death of the body—particularly the disintegration of DNA and neurons that store the informational patterns of our bodies, our memories, and our personalities—spells the end of the soul. Dualists contend that body and soul are separate entities, and that the soul continues beyond the existence of the body. Monism is counterintuitive. Dualism is intuitive. It just seems like there is something else inside of us, and our thoughts really do feel like they are floating around up there in our skulls separate from whatever it is our brains are doing.

Liberal democracy is not just the least bad political system compared to all others (pace Winston Churchill); it is the best system yet devised for giving people a chance to be heard, an opportunity to participate, and a voice to speak truth to power. Market capitalism is the greatest generator of wealth in the history of the world and it has worked everywhere that it has been tried. Combine the two and Idealpolitik may become Realpolitik.

*   *   *

A final note on belief and truth: To many of my liberal and atheist friends and colleagues, an explanation for religious beliefs such as what I have presented in this book is tantamount to discounting both its internal validity and its external reality. Many of my conservative and theist friends and colleagues take it this way as well and therefore bristle at the thought that explaining a belief explains it away. This is not necessarily so. Explaining why someone believes in democracy does not explain away democracy; explaining why someone holds liberal or conservative values within a democracy does not explain away those values. In principle, the formation and reinforcement of political, economic, or social beliefs is no different from religious beliefs.

The Confirmation Bias: The Mother of All Cognitive Biases
Throughout this book I have referenced the confirmation bias in various contexts. Here I would like to examine it in detail, as it is the mother of all the cognitive biases, giving birth in one form or another to most of the other heuristics. Example: as a fiscal conservative and social liberal I can find common ground whether I am talking to a Republican or a Democrat.

Hindsight Bias
In a type of time-reversal confirmation bias, the hindsight bias is the tendency to reconstruct the past to fit with present knowledge. Once an event has occurred, we look back and reconstruct how it happened, why it had to happen that way and not some other way, and why we should have seen it coming all along.

Self-Justification Bias
This heuristic is related to the hindsight bias. The self-justification bias is the tendency to rationalize decisions after the fact to convince ourselves that what we did was the best thing we could have done. Once we make a decision about something in our lives we carefully screen subsequent data and filter out all contradictory information related to that decision, leaving only evidence in support of the choice we made.

Attribution Bias
Our beliefs are very much grounded in how we attribute the causal explanations for them, and this leads to a fundamental attribution bias, or the tendency to attribute different causes for our own beliefs and actions than that of others. There are several types of attribution bias.

Sunk-Cost Bias
Leo Tolstoy, one of the deepest thinkers on the human condition in the history of literature, made this observation on the power of deeply held and complexly entwined beliefs: “I know that most men, including those at ease with problems of the greatest complexity, can seldom accept even the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have delighted in explaining to colleagues, which they have proudly taught to others, and which they have woven, thread by thread, into the fabric of their lives.” Upton Sinclair said it more succinctly: “It is difficult to get a man to understand something when his job depends on not understanding it.” These observations are examples of the sunk-cost bias, or the tendency to believe in something because of the cost sunk into it.

Status Quo Bias
We tend to prefer existing social, economic, and political arrangements over proposed alternatives, even sometimes at the expense of individual and collective self-interest. Other examples abound.

Endowment Effect
The psychology underlying the status quo bias is what economist Richard Thaler calls the endowment effect, or the tendency to value what we own more than what we do not own.

Framing Effects
How beliefs are framed often determines how they are assessed, and this is called the framing effect, or the tendency to draw different conclusions based on how data are presented. Framing effects are especially noticeable in financial decisions and economic beliefs.

Anchoring Bias
Lacking some objective standard to evaluate beliefs and decisions—which is usually not available—we grasp for any standard on hand, no matter how seemingly subjective. Such standards are called anchors, and this creates the anchoring effect, or the tendency to rely too heavily on a past reference or on one piece of information when making decisions.

Availability Heuristic
Have you ever noticed how many red lights you encounter while driving when you are late for an appointment? Me, too. How does the universe know that I left late? It doesn’t, of course, but the fact that most of us notice more red lights when we are running late is an example of the availability heuristic, or the tendency to assign probabilities of potential outcomes based on examples that are immediately available to us.

Representative Bias
Related to the availability bias is the representative bias, which, as described by its discoverers, psychologists Amos Tversky and Daniel Kahneman, means: “an event is judged probable to the extent that it represents the essential features of its parent population or generating process.” And, more generally, “when faced with the difficult task of judging probability or frequency, people employ a limited number of heuristics which reduce these judgments to simpler ones.”

Inattentional Blindness Bias
Arguably one of the most powerful of the cognitive biases that shape our beliefs is captured in the biblical proverb “There are none so blind as those who will not see.” Psychologists call this inattentional blindness, or the tendency to miss something obvious and general while attending to something special and specific. The now-classic experiment in this bias has subjects watching a one-minute video of two teams of three players each, one team donning white shirts and the other black shirts, as they move about one another in a small room tossing two basketballs back and forth.

Our beliefs are buffeted by a host of these and additional cognitive biases that I will briefly mention here (in alphabetical order):
Authority bias: the tendency to value the opinions of an authority, especially in the evaluation of something we know little about.
Bandwagon effect: the tendency to hold beliefs that other people in your social group hold because of the social reinforcement provided.
Barnum effect: the tendency to treat vague and general descriptions of personality as highly accurate and specific.
Believability bias: the tendency to evaluate the strength of an argument based on the believability of its conclusion.
Clustering illusion: the tendency to see clusters of patterns that, in fact, can be the result of randomness; a form of patternicity.
Confabulation bias: the tendency to conflate memories with imagination and other people’s accounts as one’s own.
Consistency bias: the tendency to recall one’s past beliefs, attitudes, and behaviors as resembling present beliefs, attitudes, and behaviors more than they actually do.
Expectation bias / experimenter bias: the tendency for observers and especially for scientific experimenters to notice, select, and publish data that agree with their expectations for the outcome of an experiment, and to not notice, discard, or disbelieve data that appear to conflict with those experimental expectations.
False-consensus effect: the tendency for people to overestimate the degree to which others agree with their beliefs or will go along with them in a behavior.
Halo effect: the tendency for people to generalize one positive trait of a person to all the other traits of that person.
Herd bias: the tendency to adopt the beliefs and follow the behaviors of the majority of members in a group in order to avoid conflict.
Illusion of control: the tendency for people to believe that they can control or at least influence outcomes that most people cannot control or influence.
Illusory correlation: the tendency to assume that a causal connection (correlation) exists between two variables; another form of patternicity.
In-group bias: the tendency for people to value the beliefs and attitudes of those whom they perceive to be fellow members of their group, and to discount the beliefs and attitudes of those whom they perceive to be members of a different group.
Just-world bias: the tendency for people to search for things that the victim of an unfortunate event might have done to deserve it.
Negativity bias: the tendency to pay closer attention and give more weight to negative events, beliefs, and information than to positive.
Normalcy bias: the tendency to discount the possibility of a disaster that has never happened before.
Not-invented-here bias: the tendency to discount the value of a belief or source of information that does not come from within.
Primacy effect: the tendency to notice, remember, and assess as more valuable initial events more than subsequent events.

Projection bias: the tendency to assume that others share the same or similar beliefs, attitudes, and values, and to overestimate the probability of others’ behaviors based on our own behaviors.
Recency effect: the tendency to notice, remember, and assess as more valuable recent events more than earlier events.
Rosy retrospection bias: the tendency to remember past events as being more positive than they actually were.
Self-fulfilling prophecy: the tendency to believe in ideas and to behave in ways that conform to expectations for beliefs and actions.
Stereotyping or generalization bias: the tendency to assume that a member of a group will have certain characteristics believed to represent the group without having actual information about that particular member.
Trait-ascription bias: the tendency for people to assess their own personality, behavior, and beliefs as more variable and less dogmatic than those of others.