User:JzG/Polarising topics

Most people who edit a Wikipedia article care to some extent about the subject. Proposed content is weighed through the filter of personal bias. If there are two axes, one going from positive to negative and defining how supportive of one's beliefs a statement is, and the other going from true to false and defining how accurate it is, most people will mix considerations of both, at least to some extent.

When something is uncontroversial, the lines of belief and accuracy are usually only slightly divergent. My opinions on Robert Hooke align closely with fact and when I read books on Hooke, or on Newton, regarded by historians as his nemesis (I oversimplify of course), I suffer little cognitive dissonance.

Everybody likes to think of themselves as rational. The ideological consonance vector is perceived by everybody as the neutral axis. Only an outsider, by definition, can place the fact/falsehood axis.

The scientific mindset seeks independent review with the aim of correcting any divergence between consonance and fact -- in other words, to revise opinion in line with objective truth. This is an ideal: few, if any, manage it perfectly in practice.

Religion unashamedly judges everything by ideological consonance. A creationist sincerely believes that evolution must be wrong, because it conflicts with belief. Natural News will publish anything that follows its agenda, even if the original source plainly identifies itself as a spoof.

In polarised topics, the ideological and factual vectors may become widely divergent. Factual accuracy is largely irrelevant to opinion.

In some cases, perhaps sports, where the magnitude of the fact/falsehood vector is small (team A may be better than team B, but it is marginal and often transient), this is unimportant. In some cases, such as the reality of global climate change, evolution, the efficacy of certain alternatives to medicine and so on, the magnitude of the truth/falsehood vector is large. There are big and objectively important facts, and big and objectively significant falsehoods.

In some cases, such as the colour of "that" dress, the ideology vector is small. Nobody has a particularly strong opinion either way and interest springs primarily from discussing the way people make up their minds. In other cases, and here we return to creationism, climate change and alternatives to medicine, the ideology vector is very strong.

The core problem with contentious topics is that people have a belief in a certain reality, which is highly driven by ideology, and their decisions are not based on empirically verified fact. Some of the evidence they claim may be true, but this is accidental, as false or distorted evidence is presented with equal fervour.

The green movement
The green movement accepts climate change but generally rejects genetically modified organisms. In both cases they cite science to support their opinion. In the case of climate change, their opinion does not diverge much from fact, but they discount cautious evidence, so they overstate the case. In the case of GMOs their opinion is largely contradicted by science: despite the impossibility of proving a negative, the scientific evidence for the safety of GMOs is solid, and in several decades no provable harm has arisen from their use. The result is that anti-GMO activists cite fraudulent work such as that of Séralini as if it contradicts the much larger body of evidence with its cautiously positive scientific consensus. The point is that the green movement does not actually care whether their views are objectively true, because they are subjectively - i.e. ideologically - consistent. The same is true of creationists and climate change deniers.

Green activists may use sciencey-sounding arguments against GMOs, pesticides and industrialised agriculture, but examples like golden rice - an "open source" GMO with no lock-in to agribusiness, clear health benefits and zero provable risk - allow this to be tested. And it turns out that Greenpeace, for example, are against golden rice simply because it carries the stigma of its GMO heritage.

It may be true that the scientific consensus for the safety of GMOs, say, is not unanimous. The fallacy of false balance would conclude from this that they are not proven safe. That is like throwing a die a hundred times and concluding that because it sometimes comes up one, the average roll is one. If you discount evidence that conflicts with your ideology, as homeopaths, creationists, climate change deniers and others do, then your view cannot change.

Markers for ideologues
The most reliable marker for this behaviour of motivated reasoning is the use of conspiracist language. If evidence conflicting with a world view is dismissed because of some vast purported conspiracy - "big pharma", say, or the claim that all positive research is funded by industry - then it is near certain that the person making the claim is an ideologue and should be ignored. They are a problem if left loose on Wikipedia because their mechanism for telling truth from falsehood is fundamentally defective, a form of incompetence.

Ideologues tend to cast everything as binary: a battle between "good" and "evil", with good being their ideology and evil being anything that conflicts. This leads them to conclude that the opposition is homogeneous, especially in motivation. It leads to the notorious "shill gambit" where the ideologue is incapable of accepting any motive for opposing their view other than being in the pay of the enemy. The shill gambit is the second most reliable indicator of an ideologue, after conspiracist language.

Science is messy
As the 2015 Sense About Science Lecture pointed out, science is messy. Striking and binary results are in short supply in biomedical subjects. Do you need five a day or four? Or eight? How bad for you is cholesterol? How good for you are statins? Is fluoridation of water supplies actually justified in the West? These are not things with unambiguous answers. In some cases there is historical research which has not been revisited when it should have been (most published research findings are wrong, after all).

Science, in its purest form, is inherently neutral. Scientists, not so much. They are human. Even Nobel laureates have been known to promote complete bollocks.

Wikipedia tends to aim for compromise between opposing points of view. In science, this is both dangerous and wrong.

It's wrong because it is the fallacy of false balance: like getting a token antivaxer to "balance" a story about vaccines, or, much worse (and of course, much more common), getting a token scientist to contradict an antivax trope presented as fact. It's dangerous because ideologues engage a ratchet effect, constantly demanding a new "compromise" between their extreme view and the current state of the article.

We have a humorous essay on this, WP:RANDY. Now imagine that Randy from Boise is arguing that vaccines are used as a genocide tool, or that GMOs are a plot to depopulate the third world.

So what?
So: this is Wikipedia, and people trust us. We must remember that behaviour is important, but content is more important. Much more.