29/11/2017

Has big data really changed journalism?

 

It is often suggested that the advent of big data is transforming journalism. Yet, while journalism is indeed changing, digital data and computer technology are not as central to this transformation as is often reported. The really significant shift is not in the technology, as important as that is, but in epistemology. It is our view of human knowledge about the world – including the knowledge offered by journalism – that has changed.

As Rob Kitchin, founder of the journal Big Data & Society, argues: ‘big data analytics enables an entirely new epistemological approach for making sense of the world; rather than testing a theory by analysing relevant data, new data analytics seek to gain insights “born from the data”’. Or, as Wired editor Chris Anderson famously claimed, big data means ‘the end of theory’. Instead of formulating and testing hypotheses about how the world works, now we can just ‘let the data speak’.

According to some, this is a major advance on previous theory-driven ideas of knowledge, which attempted to understand the underlying causes of phenomena. In an influential article for Foreign Affairs, for example, Kenneth Neil Cukier and Viktor Mayer-Schoenberger cautioned that claims to understand causation are often ‘nothing more than a self-congratulatory illusion’. It therefore seems like a positive development that in the era of big data we can ‘give up our quest to discover the cause of things, in return for accepting correlations’.

This poses a major challenge to journalism. If automated or ‘robot’ reporting continues to develop, it does not take a huge leap of the imagination to envisage a datafied world that could effectively report on itself, with less and less need for professional human journalism. And if data-capture systems embedded in our everyday interactions are increasingly able to determine our interests and preferences, and to link these to the delivery of customisable news content, we might expect the ideal of a common public sphere – a shared space of democratic deliberation informed by journalism – to be replaced by filter bubbles.

Big data will not cause these changes, however, mainly because they have already happened – and for completely separate reasons. It may now seem plausible to argue that algorithms are better at objectivity and that humans ought to stick to the personal and emotional, but that is not really a new development; the profession has been retreating from objectivity for years. All references to ‘objectivity’ were deleted from the Society of Professional Journalists’ code of ethics in the mid-1990s, for example, at around the same time that BBC war correspondent Martin Bell declared he was ‘no longer sure what “objective” means’.

Similarly, if it appears that big data could lead to the fragmentation of the public sphere, we should recall that in the 1990s Todd Gitlin was already discussing how journalism had come to address not a unitary public sphere but ‘separate public sphericules’. And, if open-source data seems to encourage the view that ‘anyone’ can do (data) journalism, we should remember that the BBC already declared that ‘news coverage is a partnership’ after the 7/7 London bombings in 2005, in response to the importance of eyewitness camera-phone pictures and in the context of professional journalism’s declining audience and authority.

Even the seemingly futuristic idea of a datafied world reporting on itself can be understood as a big data version of much older ideas about the capitalist market as a ‘site of veridiction’ – as in Friedrich von Hayek’s view of the market as ‘a kind of gigantic information processor superior to highly limited human knowledge or the meddling of political actors’. Rather than big data causing far-reaching changes in journalism, it is more that we interpret new developments as confirming or making inevitable what we already thought.

Modern journalism emerged around the same time as modern philosophical ideas of Subject and Object, as a product of the Age of Enlightenment and of mercantile capitalism. The modern, humanist view of the active subject, investigating and transforming the world as object, was always in tension with a view of the subject as determined and constrained by objective conditions. This tension has been reflected in journalism – in part concerned with the genuine extension of public knowledge and informed democratic debate, but also partly working to narrow debate within acceptable parameters and to bury active subjectivity beneath the dead weight of facts.

In the context of big data, human subjectivity tends to be downgraded in importance, even understood as just getting in the way. If the ambition to know the world, and to act upon and change it, now seems unrealistic, it is because we have lowered our expectations of humanity rather than because computers have made human knowledge obsolete. In journalism, as in politics, we no longer feel ourselves to be in the era of the active, history-making subject.

This article is based on Philip Hammond’s ‘From computer-assisted to data-driven: Journalism and Big Data’, published in Journalism, Vol. 18, Issue 4, 2017.

Image: Merrill College of Journalism.
