Tuesday, June 21, 2011

The Washington Post and Web journalism

When the press was allowed to look through the 25,000 e-mails Sarah Palin exchanged while she was governor of Alaska, newspapers faced a quandary: the trove was fascinating but hard to exploit with the reduced staffs of even the most prestigious dailies. The Washington Post, like some of its competitors, decided to ask for volunteers to go on its behalf to Anchorage and work through this enormous mass of documents. Several hundred people were eager to do it. However, as the Post's ombudsman had to admit, the result was uneven. Many volunteers had no experience of journalism and knew nothing of the intricacies of Alaska's government. To them, most of the e-mails were impossible to decipher, while they were perfectly clear to the few journalists specializing in Alaska and oil politics.

So the ombudsman had to concede, rather lamely, that sometimes only a good journalist can do a good job (see http://washingtonpost.com).

And it is obvious that Google is not the solution. On the website of the New York Review of Books (http://nybooks.com), Sue Halpern explains how the Web giant's new algorithms work. When you search for a term, the results you get take your previous searches into account: the answer is tailored to the mind of the individual user. It also means that two people searching for the same words, for instance "climate change", get different results, according to their idiosyncrasies. Yet they have no way to check this, and everybody remains convinced they get the same results as their neighbour. So Google works exactly against the basic laws of information while pretending to inform everybody.
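To make the mechanism concrete, here is a toy sketch in Python. It is emphatically not Google's actual algorithm (which is proprietary and far more complex); it only illustrates the principle Halpern describes: the same query over the same candidate pages can be ranked differently for each user, because a bonus is given to pages that resemble that user's past searches.

```python
# Toy illustration of search personalisation (hypothetical, not Google's code):
# results that share words with a user's search history get boosted.

def personalised_ranking(results, history):
    """Order the same candidate results differently per user history."""
    past_words = {w for query in history for w in query.lower().split()}

    def score(result):
        # The bonus depends on the user's past searches, so two users
        # with different histories rank the same results differently.
        return len(set(result.lower().split()) & past_words)

    return sorted(results, key=score, reverse=True)

results = [
    "climate change is a hoax say sceptics",
    "IPCC report on climate evolution and sea levels",
]

# Two users type the same query, but their histories differ:
sceptic_history = ["is global warming a hoax", "sceptics forum"]
scientist_history = ["IPCC sea levels report", "climate science data"]

print(personalised_ranking(results, sceptic_history)[0])
print(personalised_ranking(results, scientist_history)[0])
```

Run it and each "user" sees a different page at the top of the very same result set, which is precisely why neither can tell that the other is living in a different informational world.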

All of that means there is still a future for quality information.