Good afternoon, ladies and gentlemen, welcome to this week's SDA distinguished lecture, documentary & interview symposium. This week, for your delectation, here is Michael Shermer, director of the Skeptics Society, presenting his TED Talk on Why People Believe Strange Things (14:11), in which he argues that in the absence of sound science, incomplete information can combine with the power of suggestion to produce false results.
In fact, Michael says, we humans tend to convince ourselves to believe: we overvalue the "hits" that support our beliefs, and discount the more numerous "misses". And please note, before anyone flies off the handle, Michael does not believe that science can invalidate religion. He says so. Rather, his central issue in today's lecture, and the central topic in our symposium here today, is: confirmation bias. Lastly, before anyone thinks that I am claiming that I don't suffer from confirmation bias, to be clear: it's one of the things I'm most worried about doing (to the degree that I worry about anything at all).
"Michael does not believe that science can invalidate religion"
That is good, because one cannot invalidate the other. Science belongs under the logic and critical-thinking columns.
Whereas religion belongs under the faith column.
So to apply one to the other would not make any sense.
Posted by: allan at October 3, 2009 5:38 PM
Nice flight, Allan ;-) I agree, though. In fact, you will find, if you bother to watch Michael's talk and then join us in today's symposium, that Michael does actually address that. Indeed, as Pope Benedict XVI noted in his Regensburg Lecture:
"Modern scientific reason quite simply has to accept the rational structure of matter and the correspondence between our spirit and the prevailing rational structures of nature as a given, on which its methodology has to be based. Yet the question why this has to be so is a real question, and one which has to be remanded by the natural sciences to other modes and planes of thought: to philosophy and theology."
Which means that now that we've gotten that out of the way, we can reasonably return to the central topic of today's symposium: human confirmation bias.
I will probably get to it later tonight; supper is being cooked and I have to put the speaker system back together.
Posted by: allan at October 3, 2009 6:32 PM
Cool, Allan. God willing I'll be back tomorrow too. And I would suggest reading the Wikipedia page on human confirmation bias. It's an interesting problem that affects the degree to which each and every one of us is right or wrong. Also, double-check the phase polarity on your speakers; tonight's Late Nite Radio show needs the bass ;-)
Posted by: Vitruvius at October 3, 2009 6:37 PM
It was difficult for me to think of "confirmation bias" without thinking of global warming, such is our bombardment by this topic. It has become almost impossible to enter a store and not see a mention of how green we're becoming (so buy our products...) or how we need to do more for our environment (so buy our products...). I haven't looked at their website yet, but something tells me I'm going to have my suspicions confirmed...
An enjoyable lecture, well worth the 14 minutes. I have a lot of respect for science, even more so when it is mentioned, or at least alluded to, that faith is separate from reason... that never should the two be confused, blended or concocted for someone's means to an end. (and my faith, is on solid ground... heh).
Posted by: marc in calgary at October 3, 2009 6:40 PM
But even confirmation bias on global warming goes both ways, Marc. Hands up: how many people here felt good when McIntyre schmacked Briffa on his results? I know I did. And that's an example of confirmation bias. It's how our brains work. And for good evolutionary reasons, too.
Yet the question remains: what do we do about those of its effects that are negative? How do we keep our natural sense of righteousness from turning into wrongousness ~ from overwhelming our sense of judgement until we end up actually wrong? I would argue that there are tools we can use, such as skepticism and empiricism, that can help, although even those can be abused to produce cynical pessimistic sourpusses.
Posted by: Vitruvius at October 3, 2009 6:51 PM
Vitruvius, sounds like a subtle and winking "QED"
Posted by: Erik Larsen at October 3, 2009 7:05 PM
Um, what "sounds like a subtle and winking 'QED'", Erik? That AGW cognitive bias goes both ways? Well, if that's the case, so it is with Obama, Ignatieff, the CBC, indeed, most topics that are highly discussed in the blogosphere. And that's at least in part because the blogosphere is a nearly perfect cognitive-bias positive-feedback amplifier, at least for those participants who don't actively seek out and consider opinions that run against their own biases.
Posted by: Vitruvius at October 3, 2009 7:16 PM
He does mention in his video "The Baloney Detection Kit", at about the 8 or 9 minute mark, that the science is starting to lean in the direction of the existence of global warming.
But leaning is a far cry from the political notion that we throw all our babies out with the bath water, the process I seem to be up against in every retail shop in the land with their new light bulbs and green this and that. It's madness. Am I to think that everything I know is wrong, and for a political agenda?
So for the time being, I remain terribly skeptical on that topic. Leaning isn't enough for me to buy those goods.
When will I believe it? When Al Gore sends his son in a rocket to another planet in order to save humanity from a blazing inferno.
Thanks Vitruvius
Posted by: marc in calgary at October 3, 2009 7:20 PM
Science doesn't lean, Marc, and AGW doesn't enter into it. Science is an epistemological methodology founded in empiricism. If AGW research is doing that, it's science; otherwise it is not, no matter what it predicts.
Human bias, on the other hand, leans. Indeed, "how one leans" is probably a good definition of human bias. Nevertheless, I think it's probably also a matter of neurological hardware bias (especially after last week's brain-plasticity symposium), in addition to purely ad hoc software decision bias.
One might put it this way: we tend to see the things we expect to see, to the degree that we can, at least in theory, sometimes be biased even to the point of hallucination (when our expectations are far enough out of whack with reality). Seeing what's really there, especially when we have an overriding preconception that is false (such as when we are hallucinating), is a difficult problem. Or in the vernacular: how can one prove to oneself that one is not hallucinating?
I think that the way to mitigate the human bias problem (to the degree that it is a problem, I mean; de gustibus non est disputandum is not necessarily a problem) lies in the direction of the answer to the last question of my previous paragraph.
Posted by: Vitruvius at October 3, 2009 7:55 PM
Sorry Vitruvius, I thought you were calling yourself a cynical pessimistic sourpuss. :-)
I agree with the echo chamber-ness of the internet blogosphere. I've discussed "hot topics" such as the ACORN issue with people, and all I get is blank stares.
Posted by: Erik Larsen at October 3, 2009 8:15 PM
It was Michael Shermer who used the words "it appears" that global warming is real; I heard "leaning"... well, because I'm human, with bias.
Posted by: marc in calgary at October 3, 2009 8:18 PM
God (yes, God) you two; get a life!
Posted by: Larry Bennett at October 3, 2009 8:25 PM
Well, Erik, I admit that I'm not a very good writer, so sometimes it may not sound like it here, but in real life I'm the antithesis of a cynical pessimistic sourpuss. Meanwhile, enough about me: why, Erik, and with whom and under what conditions, did you discuss hot topics that got you only blank stares? What techniques, such as (if you agree with me) skepticism and empiricism, did you use to convince yourself that the blank stares you were receiving were a result of their cognitive bias, and not a result of your own?
Posted by: Vitruvius at October 3, 2009 8:25 PM
Hahaha, Larry Bennett. It's all good to get your science de-skewed and to take a second look at your belief system now and again.
*Marc leans into it to adjust his stereophonic de-skewing device for what's promised to be a bass test at SDA's Late Nite Radio...*
Posted by: marc in calgary at October 3, 2009 8:40 PM
Get a life? My dear Larry ~ some of us, with Kate's permission, are having a symposium here, deliberately and on purpose, and that symposium has a range of topics that are delineated at the beginning of the symposium by the symposiarch (who happens to be me, for the DLDI symposia). If you don't like that, then you are welcome to participate in other entries at this site; however, you are not welcome to introduce other topics into this symposium. Do I make myself clear?
Posted by: Vitruvius at October 3, 2009 8:43 PM
Point of order, Marc: I didn't say bass test, I said needs ;-)
Posted by: Vitruvius at October 3, 2009 8:53 PM
That's just it - you don't. That may be my failing or yours; either way, enjoy!
Posted by: Larry Bennett at October 3, 2009 8:55 PM
Vitruvius, I discuss these topics with my co-workers.
I don't have enough data to assess my cognitive bias, as I regularly talk politics with only half a dozen people, none of whom share my personal politics.
But I would have liked to expand my survey to 200 randomly selected people, asking them what the US organization ACORN is, what it does, and whether it has been in the news lately. It would also be interesting to have someone administer a set of questions to me, so I could assess my cognitive bias against an external set of data.
My problem is that I have no idea where I am on the left/right spectrum compared to "the average Canadian". I just don't have the data.
Posted by: Erik Larsen at October 3, 2009 8:55 PM
Sorry I wasn't more clear, Erik: my question wasn't whether or not they were interested in ACORN, my question was why you (and I) should assume that the fact that they are not is indicative of their bias, rather than of ours. On what grounds do we get to claim that what we are interested in or believe in, be it you Erik, or Larry, or me, is more important or correct than what they believe in? Other than, well, let's see, so far in this discussion we have three candidates: (1) science, (2) bias, and (3) authority.
Posted by: Vitruvius at October 3, 2009 9:05 PM
When I talk about myself and how I fit into the world, I know the numerator (one, of course). What I don't know is the denominator for different data subsets.
Back to the above, I was surprised, as I thought around 1 in 5 of the people I talk with would know about ACORN.
Authority - that gets to sample size, and to how one weights cultural background into each survey of one's views to assess bias, etc., etc.
I'm more than three standard deviations off the mean for some things, and right in the middle for others. (based on my own assessment of the statistics, though).
Posted by: Erik Larsen at October 3, 2009 9:14 PM
Sure, me too on the sigmas, Erik, although on the authority side I meant authority by fiat, such as being the symposiarch, or the DJ of a radio show (in which case I get to pick the tunes unless and until I'm fired). But why were you looking for, expecting, that rate of return on others' knowledge of, attention to, and concern for ACORN, if it wasn't to confirm your bias as to whether or not ACORN is or should be that important to them? And if it's not about ACORN, then what about Obama, or global warming, or the Chinese? You see, it's not any specific example that is relevant to the discussion of the phenomenon of confirmation bias.
Posted by: Vitruvius at October 3, 2009 9:25 PM
Vitruvius - you refer to three candidates for what can be termed 'Fixation of Belief' (Peirce's phrase): science, bias, authority.
I'll refer to Peirce's 1877 article of that title; he outlines four methods or 'guiding principles' for coming to a held conclusion.
There's 'tenacity', which is an emotional adherence to X, simply because, tautologically, you have adhered to X and that adherence is its own reason for adhering....and so on. It works best if you are a hermit or a fanatic. In both cases, you have lost any individual capacity to observe and think.
There's, of course, 'authority', which has always been with us and will remain with us, given that our particular species must learn its knowledge..from some Elder or Wise Person(s), and must accept this knowledge as having some validity and reliability. This dominance of authority over individual reasoning and experience, while inevitable, can and does enable totalitarianism and hierarchical enslavement.
Then, there's what Peirce defines as the 'a priori' method, which is the act of coming to a belief by whim, by caprice, by 'taste or choice'. The focus is not on observed facts, much less analysis, but because the beliefs seem 'agreeable for me to believe in'. I'd say that this mode of coming to an opinion is one of the most common. How many times have we heard that So and So is...whatever..because 'I feel it in my bones'.
And we can see it in the phrase 'it makes sense to me'. An example would be Maureen Dowd's switching of Wilson's call-out to President Obama from 'You lie'...to 'You lie, boy'. The addition of the derogatory 'boy'..made sense to her, even if it wasn't said. It's an emotionally satisfying method of coming to a belief. Bias could fit in here, as it could fit with tenacity.
And finally, there is the scientific method, which relies on both observation and reasoning, because it acknowledges that 'there are Real things, whose characters are entirely independent of our opinions about them'..and we can understand this reality by the use of both objective observation and reason.
As Peirce said, 'It is impossible not to envy the man who can dismiss reason, although we know how it must turn out at last'. Yes, the other methods of settling opinion, over scientific investigation, are indeed slick, easy, and can be quite satisfying. Until their fictions meet reality. That's the problem...fact versus fiction.
Posted by: ET at October 3, 2009 10:04 PM
Vitruvius, I was neither looking for, nor expecting a rate of return on others' knowledge, nor was I acting to confirm my bias.
I was just surprised that I had miscalculated how my friends thought.
But I guess I did expect them to conform to how I had preconceived their thinking to be.
Posted by: Erik Larsen at October 3, 2009 10:38 PM
Peirce's 1877 characterizations in Fixation of Belief are excellent in their own right, ET. Nevertheless, I do find the sorts of categories in the Wikipedia article on confirmation bias interesting too; things like biased hypothesis-testing, assimilation bias, attitude polarization, the primacy effect, selective memory, and belief perseverance.
What I find most interesting though is that we humans exhibit this here what them there scientists call confirmation bias phenomenon even with purely abstract randomized studies using things like red and black checkers randomly selected from a bowl. Yes I am serious; some of the tests are described in the Wikipedia article, and I don't think they will surprise anyone when seen in that abstract way.
It's when we concretize the concepts that we are likely to fail. For example, some people argue that Oh, no, I don't have to worry about confirmation bias, because my argument is correct ~ I have accumulated a set of stories that prove it. Perhaps they are correct. I like to think that I am correct. But that is not relevant to the phenomenon of confirmation bias as it is understood in the modern scientific sense.
Posted by: Vitruvius at October 3, 2009 10:44 PM
Thanks Vitruvius, that was time well spent.
Posted by: wallyj at October 3, 2009 11:12 PM
Ural was unable to post a comment here last night, so he posted over in Reader Tips instead. Now it's not for me to speak for Ural, and he can correct me if I'm wrong, but my understanding is that his arguments are that I was exhibiting confirmation bias in the DLDI symposium we previously held on the subject of dowsing, and that I do not allow people to disagree in these symposia.
Well, I have stated above that I do not think that I am immune to confirmation bias. However, in the case of the previous dowsing show, people certainly disagreed with me; rather, it was my point that there was a scientific test in scope in that symposium, so unless they could provide some evidence, some link or study, that provided opposing evidence, they couldn't go on and on claiming that their personal experience invalidated the study being discussed, because the topic of the symposium was that study.
And so it is in today's symposium. Because Dr. Shermer mentions in his talk some examples that some people might disagree with, we know from experience that some people will try to hijack today's symposium to discuss what they want to, rather than discussing Dr. Shermer's talk and the subject thereof, confirmation bias, and the evidence he presents thereto. Past some point, that's not allowed according to the "Before submitting, review the post to ensure your comment is on topic..." rule stated in cyan below.
I would love to hear from people on the topic(s) of today's symposium. After all, that's why I go to all the trouble to listen to hours and hours of talks and lectures to find the few that both I find interesting and that I think and hope that SDA listeners will find interesting as the topics for symposia. And indeed some do, as they have stated here and elsewhere, so perhaps all is not lost and it is not a bad idea to keep on doing these symposia every week or three.
Meanwhile, does anyone think that confirmation bias is not a problem, or that it does not apply to them? Does anyone have any suggestions on how to avoid confirmation bias and to mitigate the problem, in addition to the ones I have suggested above? Does anyone have any other scientific evidence that compares with or contrasts the evidence that Michael exhibited, in his talk, purporting to show confirmation bias?
Posted by: Vitruvius at October 4, 2009 11:21 AM
"Meanwhile, does anyone think that confirmation bias is not a problem" - Vitruvius
First I must admit I am unable to pull the content in real time through a dialup and obtain a reasonable understanding of it. I will be back on a cable modem at the end of the month.
The Confirmation bias link..
No, I do not think Confirmation Bias is a problem unless it becomes a tool used for Social Engineering Manipulation.
It is my opinion that real (Good) Science must follow a path that is open to Confirmation Bias, along with the risk to the Ego that your science may be false...
Posted by: Phillip G. Shaw at October 4, 2009 12:31 PM
Fair enough, Phillip, in which case the question becomes: is confirmation bias a tool that is used for manipulation (social engineering or otherwise), and if so, by whom? Is it only by them, and never by us? Do we ever do it to ourselves? And if so, should we try to avoid behaviour like that?
Posted by: Vitruvius at October 4, 2009 12:40 PM
I think that 'confirmation bias' is a basic characteristic of our species. This is because our knowledge is not, primarily, genetic but learned.
And, as learned, this knowledge is stored, gathered, and communicated in 'symbols'. So, we learn the 'symbol' of 'cat' for that feline animal. But, that symbol doesn't stand in a singular relationship with the feline but increases its information content over time and place. So, 'you're a cat' when applied to a human means something different from telling your pet animal..what it is.
So, if we accept as fundamental that our information operates within man-made symbolic 'clothing', then this sets up the scenario for multiple interpretations.
The example in Wikipedia of the Set of numbers (2,4,6) reveals this multiplicity of possible interpretations. Since the Rule of their relation was not stated but was open to interpretation, then, obviously, multiple interpretants could be derived from this simple Set.
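That (2,4,6) example is the classic Wason rule-discovery task, and it is concrete enough to sketch in code. In the Python sketch below, the hidden rule, the subject's hypothesis, and the particular probe triples are all illustrative assumptions (Wason's actual experiment used human subjects, not scripts); the point is only that probes chosen to confirm a hypothesis can never reveal that the true rule is broader.

```python
# Sketch of the Wason 2-4-6 task: a subject sees (2, 4, 6) and must
# discover the experimenter's hidden rule by proposing more triples.

def hidden_rule(t):
    """The experimenter's actual rule: any strictly ascending triple."""
    return t[0] < t[1] < t[2]

def hypothesis(t):
    """The subject's narrower guess: even numbers increasing by 2."""
    return t[0] % 2 == 0 and t[1] == t[0] + 2 and t[2] == t[1] + 2

# A confirming tester proposes only triples that fit the hypothesis...
confirming_probes = [(2, 4, 6), (8, 10, 12), (20, 22, 24)]
# ...while a disconfirming tester also probes triples the hypothesis rejects.
disconfirming_probes = [(2, 4, 6), (1, 2, 3), (3, 2, 1)]

def hypothesis_survives(probes):
    """True if every probe gets the same verdict from rule and hypothesis."""
    return all(hidden_rule(t) == hypothesis(t) for t in probes)

print(hypothesis_survives(confirming_probes))     # True: wrong guess "confirmed"
print(hypothesis_survives(disconfirming_probes))  # False: (1, 2, 3) exposes it
```

Only the probe (1, 2, 3), which the subject's own hypothesis predicts should fail, reveals the mismatch; that is the same abstract shape as the red-and-black-checkers studies Vitruvius mentioned earlier.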
A further result of our unique method of gathering, storing, and communicating information (via symbols)..is that it is done by individuals. We don't act 'as a group' in this behaviour; we aren't ants or bees. So, each individual carries out these various tasks with this various information..and this leads to an exponential variety of possible interpretations. This certainly is an adaptive feature; there is no other species so capable of changing its means of interaction with the environment..as our species. Can't fly? Don't wait for a pair of wings to grow. Invent the airplane. But, these innovations are due to that unique nature of information processing..which is in the individual. This enables deviation and dissent-from-the-norm. That's a great strength..and problem.
And, it also illustrates that it may be very difficult to move from one Rule to another Rule, even if the former is incorrect. That's because a Rule of Interpretation, which acts as a General Mediation for interpreting incoming particular data..is..well, it's general. It serves to cover a multitude of different particulars. So, Vitruvius can provide us with different particular examples of cheese..but..at the mediate level..it's 'all cheese'. ( I don't think I should say that...).
So, if our mediate rule of The Cause Of Disease is 'displeasing the gods'...well, it can be very difficult to move from this quite satisfying Rule(because it IS an explanation if not a solution)..to one that sets up Germs as the mediate rule. It takes a very loud individual to convince others that well, maybe, the gods have less involvement in diseases than germs, and maybe, we might just be able to control or at least, rein in, those germs.
Bias in survey questions is well-known and quite frankly, plays a huge role in the unreliability of many surveys. Words mean different things to different people, and different things to the same people, if those words are used in different areas of the sentence...etc.
It's almost impossible, indeed, I'd remove the 'almost' and declare that it's impossible to remove this 'confirmation bias' from our mode of thinking. It's a concomitant effect of that very mode of thinking..which is in symbols..which are open to interpretation.
How do we deal with our strength and weakness, which are found hand in hand in the same capacity to think? With vigilance. That's about all. Vigilance. And humility. We might, just might, not be right.
Posted by: ET at October 4, 2009 1:04 PM
I generally agree, ET. In particular, as I mentioned earlier in this discussion, bias of the form "I like cheese" plus "this is cheese" producing "I like this" is not a problem for those simple matters of taste. Now consider an arguably more significant problem, for example, the case of the primacy effect documented in the Wikipedia article.
Say I'm going to buy a product in some particular class, and because I've studied confirmation bias I know that I will tend to prefer the first product I try, even if one or more tried later would have been preferable were it not for the primacy effect. Does my knowledge of innate human biases of these forms not then allow me, if I apply it consciously, to do a better job of selecting the product that actually is the best purchase for me?
And if so, what happens when it comes to "buying" an argument, or "buying into" a political party? And, again if so, what does that say about the confirmation-bias amplification effect exhibited by self-selected blogosphere echo chambers: to what degree does that amplification contribute to blogosphere pathology, and to what degree does it, or could it, pose problems for how society functions, compared to how it would function otherwise?
Are we being vigilant in our civic duty if we only visit web sites where everyone agrees with us or at least our general side (thus satisfying our craving for confirmation bias)? Does interpreting every example of every questionable move by everyone on the other side as evidence of their wrongness, again confirming our bias, increase the quality of our society, or decrease it?
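The primacy-effect mitigation described above, consciously discounting the advantage of whatever was evaluated first, can be sketched as a toy simulation. Everything below is invented for illustration (the product names, the "true" quality scores, and the size of the primacy bonus are assumptions, not data from the Wikipedia article): the idea is simply that if first-seen items get an undeserved bonus, re-evaluating in many shuffled orders averages that bonus away.

```python
# Toy model: the first product evaluated receives an undeserved
# "primacy bonus"; shuffling the evaluation order and averaging
# lets the true qualities dominate. All numbers are invented.

import random

TRUE_QUALITY = {"A": 7.0, "B": 9.0, "C": 8.0}  # hypothetical products

def biased_scores(order, bonus=1.5):
    """Score each product, inflating whichever one is evaluated first."""
    return {name: TRUE_QUALITY[name] + (bonus if i == 0 else 0.0)
            for i, name in enumerate(order)}

def debiased_choice(trials=1000, seed=0):
    """Average the biased scores over many shuffled evaluation orders."""
    rng = random.Random(seed)
    totals = {name: 0.0 for name in TRUE_QUALITY}
    for _ in range(trials):
        order = list(TRUE_QUALITY)
        rng.shuffle(order)
        for name, score in biased_scores(order).items():
            totals[name] += score
    return max(totals, key=totals.get)

single = biased_scores(["C", "B", "A"])
print(max(single, key=single.get))  # "C" wins on first impression alone...
print(debiased_choice())            # ...but "B" wins once order is randomized
```

The analogue for "buying" an argument would be deliberately revisiting the candidates in a different order before committing, much as suggested above.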
Posted by: Vitruvius at October 4, 2009 2:58 PM
Well, Vitruvius, in matters of personal choice, I suppose one has to acknowledge frailty, and admit, possibly, that "I'm someone who makes an immediate first choice..and even though I dither, I always come back to...". Or, even, "I'm someone who always rejects the first choice"..or..
In matters of broader choice, such as political issues, one has to, I hope, rely on reason and objective facts. That is, I hope that I select my choice of a conservative style of governance (small govt, private enterprise, decentralization, etc) on the basis of thought as applied to evidence, both current and historical. And after making that choice, it does indeed become a mediating habit and I go to, primarily, only those sites that support that outlook.
And is my interpretation of many of the moves of The Other Side...based only on my 'confirmation bias', or, can it be due to Reason? For example, Ignatieff's open declaration that He and the Liberals will no longer support any Conservative motion is most certainly 'confirmation bias'. As the Loyal Opposition, their judgment to support or reject ought to be made on observational evidence and reason..i.e..they ought to read the D** bills first and then decide to support or not support.
With regard to truth, which is to say, objective truth about reality, ..for that, I think we have to rely on what Peirce termed 'The Community of Scholars'. No single individual can arrive at the truth by himself. It requires a community who observe, think and analyze.
For example, I know, and yes, some others know, that the earth is not really a giant ball in the sky. It's a flat disc, twirling on its axis..and that rapid twirling is what gives the illusion of it being a ball-shape. Heh. We Disc-Believers know better, but, the Community of Scholars says that the earth is indeed that balloon ball in the sky..and my community of Disc-Believers..is rather small, and so...
Posted by: ET at October 4, 2009 3:53 PM
Independent of whether a choice is personal or broader, ET, I think it is also possible that one could learn to be a better reasoner: after reviewing a set of potential selections and making a provisional choice, one should attempt to review the known forms of confirmation bias, in order to minimize the sorts of negative effects they are known to cause, before one makes a final decision (assuming, of course, that circumstances allow one the time to conduct such a review).
That's something that I think I'm going to try to do a better job of in the future.
Posted by: Vitruvius at October 4, 2009 4:14 PM
It looks like the party is about over, folks, so at this point, and as the sun sets slowly in the west, I'd like to thank everyone who has participated in this week's symposium. Hopefully, some readers have found some value in considering at least some of the topics that were at hand here (I know I did). I look forward to hosting another symposium here some day; until then, my best wishes to all. And as always, but usually it goes without saying: thank you Kate, for this opportunity.
This symposium is now closed.
Posted by: Vitruvius at October 4, 2009 8:52 PM