Full Version: Confirmation bias
> Wikimedia Discussion > General Discussion
Cock-up-over-conspiracy
Nothing comes up in the search, so here it is:

Confirmation bias is the tendency of people to seek evidence confirming an already held opinion and to avoid looking for anything that might upset their carefully constructed mental models. Even in academic science, the area in which one might expect confirmation bias to be least effective, scientists have generally proven remarkably reluctant to give up discredited theories.

More often than not, it occurs subconsciously.

The standard explanation is that people are motivated to find facts to fit their own theories, to maintain consistency of their internal concepts. An alternative view is that people make these mistakes because their cognitive capabilities are limited, able to think only about one thing at a time. Some people just don’t have the ability to handle more than one alternative. Having their theory challenged is having their person challenged, so deep is the connection.

Furthermore, post-decisional dissonance, the tension between the decision made and the possibility of being wrong, motivates individuals to find even more evidence to support their position, thereby entrenching it.

Evaluating evidence in order to come to an unbiased conclusion is the theoretical practice in knowledge building. Sadly, on the Pee-dia, entirely conscious selective gathering of material, giving undue weight to one's own position, is the name of the game.

Extensive empirical evidence supports the idea, and also suggests that once a position has been taken, an individual's primary purpose becomes defense or justification, meaning that they become even more highly biased.

People have a tendency to overestimate the probable accuracy of their judgments and hold onto their positions even when they are failing. Studies suggest people are twice as likely to seek out information that confirms their position as information that challenges it.

It can be used to one's advantage, though ...

Having persuaded a person of something, make them feel better by letting them find examples that confirm it.

However, the flipside to this is ... beware of people doing it to you by feeding you confirming evidence.

Someone who knows more than me ... ConfirmationBias.pdf
Milton Roe
QUOTE(Cock-up-over-conspiracy @ Sun 25th April 2010, 4:57am) *

Nothing comes up in the search, so here it is:

Confirmation bias is the tendency of people to seek evidence confirming an already held opinion and to avoid looking for anything that might upset their carefully constructed mental models. Even in academic science, the area in which one might expect confirmation bias to be least effective, scientists have generally proven remarkably reluctant to give up discredited theories.

More often than not, it occurs subconsciously.

The standard explanation is that people are motivated to find facts to fit their own theories, to maintain consistency of their internal concepts. An alternative view is that people make these mistakes because their cognitive capabilities are limited, able to think only about one thing at a time. Some people just don’t have the ability to handle more than one alternative. Having their theory challenged is having their person challenged, so deep is the connection.


ANY information-processing system has to filter and toss information in a cascade of steps through many hard-wired AND learned pattern recognition systems that each amount to a confirmation "bias."

It should be noted that a slightly expanded form of this is totally unavoidable in science, as the world is far too complicated a place to simply collect all data at random, as Charles Darwin was one of the first to point out. You don't just wander out into the world and start measuring and describing the first thing you see, then move on to the next. You'd never make it past your front yard in your whole life, and your work would be useless.

Scientific data is ALWAYS collected through the filter of trying to disprove or "strengthen" a given questioned or topical theory/hypothesis, or at most a very, very limited sheaf of them. This is nobody's "fault." It's forced on us not only by the way the brain is constructed, but by the way the brain MUST be constructed if it isn't to lock up with information overload. Nothing to be ashamed of, here. A computer, no matter how complicated, has the same problem, and must employ the same hierarchical cascading filter->filter->filter-> solution. Animals also.

Some of this is why WP's policy of NPOV is funny. Even once you get past the idea that it doesn't require NPOV in the sources, the writer of the article still has to filter->filter->filter->filter to synthesize an article from whatever number of sources are available (which are often far too many to read in a lifetime -- good luck with the American Civil War, when at least one book has been printed per day, for every day SINCE the war happened...). So "confirmation bias" is unavoidable. A filter is what an encyclopedia IS. Each filter IS a bias. And it's a bias IN THE WRITER. There is and can be no escaping it.
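The cascading-filter idea can be sketched in a few lines of code. This is a hypothetical toy model, not anything from the thread: each stage is a predicate that discards whatever fails its own pattern test, so each stage is, in effect, a small "confirmation bias," and only items that every stage's bias lets through survive the cascade.

```python
def cascade(stages, items):
    """Pass items through each filter stage in turn; what comes out
    is whatever every stage's bias happened to let through."""
    for stage in stages:
        items = [x for x in items if stage(x)]
    return items

# Toy stages, crudely standing in for hard-wired and learned pattern
# recognizers (the cutoffs are arbitrary illustration values):
stages = [
    lambda x: x > 0,        # keep only "relevant" signals
    lambda x: x % 2 == 0,   # keep only what matches a learned pattern
    lambda x: x < 100,      # keep only what fits the working theory
]

print(cascade(stages, range(-5, 120)))  # the few items all three biases agree on
```

The point of the sketch is that the surviving set is a joint artifact of the stages, not a neutral sample of the input; change any one stage and you get a different "world."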
BelovedFox
QUOTE(Milton Roe @ Sun 25th April 2010, 6:52pm) *

QUOTE(Cock-up-over-conspiracy @ Sun 25th April 2010, 4:57am) *

Nothing comes up in the search, so here it is:

Confirmation bias is the tendency of people to seek evidence confirming an already held opinion and to avoid looking for anything that might upset their carefully constructed mental models. Even in academic science, the area in which one might expect confirmation bias to be least effective, scientists have generally proven remarkably reluctant to give up discredited theories.

More often than not, it occurs subconsciously.

The standard explanation is that people are motivated to find facts to fit their own theories, to maintain consistency of their internal concepts. An alternative view is that people make these mistakes because their cognitive capabilities are limited, able to think only about one thing at a time. Some people just don’t have the ability to handle more than one alternative. Having their theory challenged is having their person challenged, so deep is the connection.


ANY information-processing system has to filter and toss information in a cascade of steps through many hard-wired AND learned pattern recognition systems that each amount to a confirmation "bias."

It should be noted that a slightly expanded form of this is totally unavoidable in science, as the world is far too complicated a place to simply collect all data at random, as Charles Darwin was one of the first to point out. You don't just wander out into the world and start measuring and describing the first thing you see, then move on to the next. You'd never make it past your front yard in your whole life, and your work would be useless.

Scientific data is ALWAYS collected through the filter of trying to disprove or "strengthen" a given questioned or topical theory/hypothesis, or at most a very, very limited sheaf of them. This is nobody's "fault." It's forced on us not only by the way the brain is constructed, but by the way the brain MUST be constructed if it isn't to lock up with information overload. Nothing to be ashamed of, here. A computer, no matter how complicated, has the same problem, and must employ the same hierarchical cascading filter->filter->filter-> solution. Animals also.

Some of this is why WP's policy of NPOV is funny. Even once you get past the idea that it doesn't require NPOV in the sources, the writer of the article still has to filter->filter->filter->filter to synthesize an article from whatever number of sources are available (which are often far too many to read in a lifetime -- good luck with the American Civil War, when at least one book has been printed per day, for every day SINCE the war happened...). So "confirmation bias" is unavoidable. A filter is what an encyclopedia IS. Each filter IS a bias. And it's a bias IN THE WRITER. There is and can be no escaping it.


What always got me was hammering the "no synthesis" line at people. Those who don't know any better don't realize that there's the type of original research that is bad, and then the kind of synthesis you describe that is essential to actually writing a good article.

If people just accepted that we're not infallible, the world would be a better place. Most of the POV issues I find come from people who just didn't go far enough looking for sources. Then, of course, there are those with a deliberate aim.