30/10/2016

“Just because we can doesn’t mean we should”: When data viz is and isn’t appropriate

 

We have previously discussed the contributions of data visualisation to three major research communication goals: selection, engagement and uptake. Data visualisation’s wide reach, accessibility and speed of comprehension have been shown to help readers sift through pools of competing information and to attract more ‘eyeballs’ to research findings.

However...

While in many instances data visualisations have significant potential to enhance the accessibility, utility and applications of data, visualisation is not always the most appropriate way in which to tell the story of a data set. Dr David Tarrant, senior trainer at the Open Data Institute, warns:

"Just because we can visualise data (and yes you can visualise the majority of it) doesn’t mean we should"

A similar warning is articulated by Dr Tom Smith, director of Oxford Consultants for Social Inclusion:

"Data viz are not always the most appropriate communication tool. Sometimes a good analogy may be more effective”

Data visualisation is one tool in the research communication toolkit, and investment in the production of data visualisation should be selective and carefully thought through. It should not be assumed that data visualisation is the most appropriate form of communication. For example, if the purpose is to prompt an emotional reaction from the audience, other forms of communication can be more effective. Dr Zipporah Ali, director of the Kenya Hospices and Palliative Care Association, explains that telling the story of an individual’s suffering using tools that lend themselves to emotive storytelling, such as video, can be more effective at gaining support for the association than presenting ‘impersonal’ data. Shirona Patel, head of the communications department in the School of International Development at the University of the Witwatersrand in South Africa, makes a similar comment:

“We are … working on a project where we need to explain the comparative benefits of a sugar tax on sugar sweetened beverages to policymakers and the public. We need to convey the tax implications and the benefits (in life years) of the proposal using data visualisations, which is proving to be much more difficult [than we thought]. It is probably easier to convey the benefits using a ‘softer more emotional’ form of communication – video, audio rather than data. We have not found the right mix yet”

To be effective, the method of communication must be tailored to the message being conveyed and the audience being targeted. The appropriateness of data visualisation therefore varies with the subject matter and the characteristics of the audience, and it will not be the right form of communication in every instance. Fatou Gueye, academic support officer for the Education and Research in Agriculture project in Senegal, for example, warns:

"In Senegal … the best [communication] strategies are on TV and mobile phones [and] poster campaigns”

With this in mind, there are three instances in which data visualisation can be appropriate for research communication: 1) when you have high-quality data which you want the user to explore, 2) when the audience has the capacity to effectively understand visualisations, and 3) when your organisation has the capacity to produce visualisations.

High-quality, complex data

Decisions about investments in data visualisation must begin with the quality and availability of data. Unless accurate and complete data is available, the visualisation will lack integrity and authoritativeness and could misinform readers. Hassel Fallas (Costa Rica) explains:

“If we are working with data, it should be done with rigour and accuracy, verifying the data [just] as [with] any other source. We need to ensure the precision and trustworthiness of data that will be presented as a fact to our readership”

Even if high-quality, accurate data is available, data visualisation is not necessarily the optimal form of research communication. For example, where a data set tells an uninteresting or unsurprising story that could be communicated in a few sentences, investment in a data visualisation is unlikely to pay off.

However, where an interesting and high-quality data set exists, visualisation can be an appropriate form of research communication. In particular, data visualisation can be an [optimal format](https://reutersinstitute.politics.ox.ac.uk/sites/default/files/Making%20Research%20Useful%20-%20Current%20Challenges%20and%20Good%20Practices%20in%20Data%20Visualisation.pdf) when the data set is complex (e.g. contains several variables showing a range of patterns and trends). In such cases, an interactive data visualisation that lets the user sort, filter and explore the data can be appropriate, as it communicates more information in a limited space than text can.
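To make this concrete, the sketch below shows one way such an interactive, explorable chart might be built, using Python with pandas and Plotly Express. The data file and column names (indicators.csv, country, region, gdp_per_capita, life_expectancy) are hypothetical placeholders, not anything taken from the report cited above.

```python
# A minimal sketch of an interactive, filterable chart.
# Assumes a hypothetical CSV with columns: country, region,
# gdp_per_capita, life_expectancy.
import pandas as pd
import plotly.express as px

df = pd.read_csv("indicators.csv")  # hypothetical data set

# Scatter plot with hover labels and a clickable legend: readers can
# zoom, hover over points to see country names, and click regions in
# the legend to filter what is shown.
fig = px.scatter(
    df,
    x="gdp_per_capita",
    y="life_expectancy",
    color="region",
    hover_name="country",
    title="Life expectancy vs GDP per capita",
)

# Export as a self-contained HTML file so the interactivity survives
# without a server, e.g. when embedded in a web article.
fig.write_html("life_expectancy_vs_gdp.html")
```

Even a simple export like this lets readers explore more dimensions of the data than a static image of the same size, which is the trade-off described above.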

Audience capacity

The appropriateness of data visualisation as a communication tool also depends on audience capacity, because visualisations demand certain skills of the reader. ‘Graphicacy’ (the ability to understand and use a map or graph) is often taken for granted, but not all audiences have the same degree of familiarity with or understanding of graphs and charts, particularly where such graphs and charts use advanced techniques.

The need to align data visualisations with audience capacity has been highlighted by the Seeing Data research project, led by Professor Helen Kennedy. Through focus groups, interviews and diary-keeping that assessed how readers engage with data visualisations, Kennedy found that confidence and skills related to language, statistics, visual literacy, computers and critical thinking all affected users’ engagement. Where visualisations were not aligned with the audience’s capacity, engagement was low, as participants’ reactions during the research demonstrated:

“It was all these circles and colours and I thought that looks like a bit of hard work; I don’t know if I understand” (Sara, 45, a part-time careers advisor)

Kennedy also found that the time the audience has available to explore a visualisation affects levels of engagement and has implications for its design (how complex the visual should be and how prominent the key messages should be). One participant explains:

“Because I don’t have a lot of time to read things […] if it’s kept simple and easy to read, then I’m more likely to be interested in it and reading it all and […] to have a good look at it” (J.C., 24, agricultural worker/engineer)

As well as skills and time, audience capacity extends to external factors including access to technology and bandwidth availability. Elaborate interactive data visualisations may have the potential to engage the audience and promote research uptake, but if technological limitations prevent access and use, the potential contribution of the data visualisation to the objectives of the publisher will not be realised. Shirona Patel, from South Africa, notes:

“In SA, more and more people have access to cell phones and smartphones. However, most websites are not built to be responsive so often the use of tables, graphs and data is discouraged as they ‘fall off the pages’ from websites that do not adapt to mobile phones”

Decisions regarding the production of data visualisation must therefore be guided by the audience’s capacity to access, use and understand visualisations.

Teamwork and organisational buy-in

Finally, decisions regarding investments in data visualisation are influenced by the degree of organisational buy-in. The production of data visualisation – particularly more-complex, interactive examples – requires an array of different skillsets, many of which are not traditionally associated with research communication. Among the skills required are data analysis (to clean data and draw out trends and relationships within the data), visual design (to create a visually appealing product), digital skills (to create the product) and storytelling or journalism skills (to explain the significance and implications of the data).

As well as investing in training and making space for experimentation, producing data visualisations often requires staff with different skillsets to work together. [Ricci Coughlan](https://twitter.com/riccicoughlan), senior designer in the UK Department for International Development’s Creative Content Team, explains:

“Having data scientists and statisticians who are data visualisation literate is a big challenge, as is finding graphic designers who are data literate. Creating high-quality visualisations tends to be the result of great teamwork between a number of different skillsets, no one individual can be expected to do everything … I think many organisations will likely already have these skills (designers, researchers, writers, statisticians), but they may have just not got them working as a team before on such a product. They may all be used to contributing their skills to a report but less experienced in working with visualisations together”

Teamwork is therefore essential to ensure that the data visualisation does not mislead or misinform the reader. If the integrity of the visualisation is compromised through poor data analysis, poor design or poor storytelling, its potential to contribute to research communication goals will not be realised.

This article was derived from SciDev.Net’s learning report Data visualisation: Contributions to evidence-based decision-making. Edited and published with permission.

Image: Gal.
