Are enabling technologies “neutral”?

Posted on January 8, 2012


In a previous post I discussed Mr. CERF’s catchy phrase in his NYT[1] Op-Ed: “technology is an enabler of rights, not a right itself”. Mr. CERF seems to have a dualist worldview: there is “enabling technology”, which is “neutral”, and there are “rights”, which are political – and never the twain need meet. This is Platonic fantasy-land.

Take a look at the real world. Consciousness and brain cannot be separated neatly – there is no homunculus (the soul) running our brain (the body): everything is intertwined[2]. As any IT engineer knows, hardware and software are made for each other, intrinsically joined, and they co-evolve.

Let’s look now at the internet and consider the issue of “unbiased access to information” on the web.

Raw information is never innocent, we know that. There is a close fit with mainstream ideology. History is written by the literate and the victor – the elite – and alternative views simply go unrecorded or are suppressed[3]. Increasing the sources of information, in particular through social networks, may help, though at the expense of verification (mankind being exceedingly prone to illusion and gullibility, we are suckers for plausible lore, particularly when unverified).

So far information is manipulated “at the source” – which is bad enough. Given its sheer mass on the Web, however, information is no longer directly available; it is mediated through search engines like Google, Bing, or whatever[4]. Information is thus manipulated a second time – by the search engines.

The algorithms that underlie the working of these engines reflect all sorts of plausible rules as well as commercial concerns: those of the provider, who offers his search engine for free, and those of the advertiser, who aims for as many hits as possible. What is the outcome for the goal of “unbiased access to information” on the web? I don’t know, and I suspect the providers don’t know either.

Amazon provides me with an inkling of what might be going on. If I choose an author, Amazon will recommend other books by the same author. If I choose a topic, say a biography, it will suggest clones. Why Amazon thinks I’d go for duplicates is a mystery to me. Fortunately, Amazon allows private reviewers, and for me their input is critical. My own algorithm is simple: I tend to read the negative reviews. They are usually few; either they are decidedly bunk and easily spotted, or they have a point worth noting. The precautionary principle kicks in. Quite often these critical reviews suggest alternatives. Posted reading lists are also useful: together with the reviews they allow a rapid survey of the topic. In the end I get what (I think) I want – despite Amazon’s well-meaning but ineffectual recommendations.
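This review-triage heuristic can be sketched in a few lines of Python. The review records and field names below are invented for illustration – this is not Amazon’s data model, just the logic of “read the few negative reviews and harvest their suggested alternatives”:

```python
# Illustrative sketch of the review-triage heuristic described above.
# The review dicts and their keys are hypothetical, not a real API.

def triage_reviews(reviews, negative_threshold=2):
    """Keep only the negative reviews (few, and worth reading),
    and collect any alternative titles they suggest."""
    negatives = [r for r in reviews if r["stars"] <= negative_threshold]
    alternatives = []
    for r in negatives:
        alternatives.extend(r.get("suggested_alternatives", []))
    return negatives, alternatives

reviews = [
    {"stars": 5, "text": "Loved it."},
    {"stars": 1, "text": "Shallow; read X instead.",
     "suggested_alternatives": ["X"]},
    {"stars": 2, "text": "Misses the archival evidence."},
]
negatives, alternatives = triage_reviews(reviews)
print(len(negatives), alternatives)  # two negative reviews, one alternative
```

The point of the sketch is that the filter is the reader’s own, applied after the recommender has had its say.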

My hunch is that search engines implicitly yet effectively favor the “mainstream” over diversity, and tend to confirm prejudice rather than challenge it. They strengthen authority as against valid argument. To the extent that they are driven by ex-ante, self-fulfilling choice criteria like “most popular”, they have limited informational validity.

If the situation is as I suspect, search engines create positive feedbacks that validate the “most popular” news item over the true one. One of the most common human heuristics is: “do as others do – they may know”. It is rational: if someone yells “fire”, it is better to step out of the theatre than to verify first. When all use this same heuristic, however, we get a stampede. Mankind’s great strength – the wisdom of the crowd[5] – is nullified.
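The stampede dynamic is easy to reproduce in a toy simulation. The sketch below assumes a hypothetical ranker that always shows the currently most-clicked item first; no real search engine is this crude, but the rich-get-richer effect it produces is the point:

```python
import random

# Toy simulation of a "most popular first" feedback loop.
# This is a hypothetical ranker invented for illustration, not any
# real engine's algorithm. Each user clicks the currently most-clicked
# item with probability p_follow, otherwise picks at random.

def simulate(num_items=10, num_users=1000, p_follow=0.9, seed=42):
    random.seed(seed)
    clicks = [1] * num_items  # every item starts with one click
    for _ in range(num_users):
        if random.random() < p_follow:
            # herd heuristic: "do as others do"
            choice = max(range(num_items), key=lambda i: clicks[i])
        else:
            # independent judgment
            choice = random.randrange(num_items)
        clicks[choice] += 1
    return clicks

clicks = simulate()
leader_share = max(clicks) / sum(clicks)
print(f"leader captures {leader_share:.0%} of all clicks")
```

All ten items start equal; an arbitrary early leader nonetheless ends up with the overwhelming majority of clicks. Popularity here measures the feedback loop, not the quality of the item.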

Search algorithms may create informational stampedes, and in an internet age such stampedes soon become informational firestorms that engulf any discussion. Add to this our inherent tendency (and ability) to infer from imprecise information – hence to skip the reasoning in order to applaud the catchy phrase. Again, this ability – what we have here is a basic Bayesian inference module – may be critical for long-term survival. In the here and now it may lead us terribly astray: self-delusion may be reinforced because of our tendency to jump to conclusions[6].

Is this all bad? It depends.

In science, “truth” is not obtained by validation or plebiscite. All knowledge is what has not been falsified – yet. Science “advances” through falsification, and search engines that make it harder for falsifiers to emerge – because they are relegated to the bottom of the search heap or downright suppressed – hinder science.

In the political sphere, on the other hand, consensus on action (or inaction) in the face of risk and uncertainty is the goal. Instruments that facilitate consensus may be useful – or manipulative[7].

At the moment I have no input into the search algorithm: I am entirely in the hands of the service provider. Ideally, I should be able to adjust the algorithm in accordance with my needs. Alternatively, the service provider may need to make his criteria explicit, so that users are aware of the biases underlying the search and remedies can be devised. Or, finally, a political discussion over the criteria underlying the search engine may become necessary, in order for policy to influence the formulation of the algorithm.
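What a user-adjustable, explicit-criteria ranker might look like can be sketched as follows. The criteria, weights, and field names are hypothetical; the point is only that the biases become visible parameters rather than hidden choices:

```python
# Hypothetical sketch of a ranker whose criteria are declared and
# user-adjustable, as the paragraph above proposes. The criteria
# names and example scores are invented for illustration.

def rank(results, weights):
    """Score each result as a weighted sum of its declared criteria."""
    def score(r):
        return sum(weights.get(k, 0.0) * v for k, v in r["criteria"].items())
    return sorted(results, key=score, reverse=True)

results = [
    {"url": "a.example", "criteria": {"popularity": 0.9, "recency": 0.2}},
    {"url": "b.example", "criteria": {"popularity": 0.3, "recency": 0.9}},
]

# A user who distrusts popularity can simply zero it out:
ranked = rank(results, {"popularity": 0.0, "recency": 1.0})
print([r["url"] for r in ranked])  # b.example ranks first
```

The design choice is transparency: the same results, re-ranked under weights the user – or a public policy – has set, rather than under criteria known only to the provider.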

Whichever way forward is chosen – “search technology”, for one, is NOT neutral and separate.

[2]  For a recent summary of the state of the art see Michael S. GAZZANIGA (2011): Who’s in charge? Free will and the science of the brain. Ecco, HarperCollins, New York.

[3]  By definition, what has been erased no longer bears witness. From time to time ironic history allows us an inkling of what chance or malice has destroyed, and it is enough to cast doubt on the accuracy and trustworthiness of received history. See e.g. Adriano PROSPERI (2011): L’eresia del Libro Grande. Storia di Giorgio Siculo e della sua setta. Feltrinelli, Milano.

[4]  While alternative search engines exist, we tend to favor one. Convenience, but also our tolerance for approximation, account for such behavior. This can lead to grievous prejudices and errors of judgment.

[5]  James SUROWIECKI (2004): The wisdom of crowds. Why the many are smarter than the few. Little Brown, New York.

[6]  Daniel KAHNEMAN (2011): Thinking, fast and slow. Farrar, Straus and Giroux, New York.

[7]  “If you tell a lie big enough and keep repeating it, people will eventually come to believe it. The lie can be maintained only for such time as the State can shield the people from the political, economic and/or military consequences of the lie. It thus becomes vitally important for the State to use all of its powers to repress dissent, for the truth is the mortal enemy of the lie, and thus by extension, the truth is the greatest enemy of the State.” Attributed to Joseph GOEBBELS.