Tomorrow’s edition of Nature carries an opinion piece by Yale law professor Dan Kahan that urges the science communication community to take on board research showing how our cultural views shape the way we accept or reject technical information about risk.
It should come as no surprise to learn that Weltanschauung influences our reaction to factual information presented in the media. We are, after all, human, and scientists are no more immune to cultural prejudice than mere mortals. People recognise this, and form opinions about particular issues influenced in part by their cultural and ideological outlook. What is interesting is that the research discussed by Kahan shows how this occurs, with polarisation of opinion established by such seemingly trivial matters as the dress sense of experts interviewed on television.
Does this mean that scientific truth is in some way relative, as some postmodern ‘cultural theorists’ would have us believe? Of course not; it is a question of communication and marketing, which is as important in the dissemination of scientific knowledge as it is in political debate and commercial life. But with a proviso, as Kahan points out…
“It would not be a gross simplification to say that science needs better marketing. Unlike commercial advertising, however, the goal of these techniques is not to induce public acceptance of any particular conclusion, but rather to create an environment for the public’s open-minded, unbiased consideration of the best available scientific information.”
Kahan argues that science communicators routinely ignore such considerations, preferring instead to flood the public with hard facts in the hope that truth will prevail among the masses. This is a generalisation, of course, but the point is a valid one when one surveys science public outreach as a whole.
As Kahan says, deluge people with data, and you run the risk of hardening their resistance. This may be happening now with the debate on climate change, but there’s surely more to it than that. One must also consider privileged media access by groups with vested and sometimes hidden interests, and political power relationships.
Kahan’s essay presents no solutions, but it is nonetheless a valuable wake-up call. Calling for evidence-based policymaking, as many frustrated individuals in the science lobby do, is all very well, but making it work will require a theory of risk communication that takes into account the effect of culture on personal decision-making.
Dan Kahan, “Fixing the communications failure”, Nature 463, 296 (2010)