In search of consensus in a world of scientific uncertainty: An exploration of journalistic practices

We live in a world full of uncertainty and change. When it comes to scientific knowledge, there is a common misconception that it is fixed, representing a certain existing truth, when in reality science is an ever-changing landscape of puzzles and mysteries. Journalists have long been trusted agents in the dissemination of scientific information, but it is not clear how they deal with the ambiguity and shifting truths of the scientific literature. Since they have the power to shape the public's perception of scientific issues, it is important to understand how they handle this uncertainty and constant change.

Through a textual analysis of contradictory news coverage and in-depth interviews with both general assignment and science journalists, this study explores these questions, with a specific focus on how journalists approach research for an article, evaluate sources of information and deal with contradiction. The themes emerging from these conversations speak to professionals navigating a changing world, an evolving media environment and the need for clarity.

Impacted by pressures to publish quickly and attract maximum attention, science journalists struggle to stay true to their professional principles of conduct and to the science itself. This study found that journalists do not have a clear plan for how to deal with contradictory science messages. Instead, they aim to look past the contradiction and find scientific consensus, and then try to represent it in a way that is both true and understandable to the public. Various factors affect how successful they are, including their level of scientific training, the time and word limits allotted for content production, and their ability to resist exaggeration and oversimplification. By understanding how traditional media deal with contradictory information, we gain valuable insight into what shapes the nature of information online, which in turn shapes audience perceptions of a given subject.


With the rise of media technology in the recent decade, there has been an unprecedented increase in the speed and scope of the general public's access to health information (Hesse et al., 2005; Cline & Haynes, 2001; Panth & Acharya, 2015). In addition, the cost of entry into the realm of information creation for the previously more passive audience has become very low. Due to these factors, there is more information available than ever before, with professionals as well as amateurs serving the roles of creator, curator and consumer simultaneously. Thus, when one encounters contradictory information, it has become increasingly difficult to evaluate the credibility of each source and to resolve the contradiction. In no sector is this more apparent and important than in science and health, where misperceptions can affect beliefs and even behavior (Niederdeppe & Levy, 2007).

Though previous work has established the existence of contradictory information in science and health stories (Nagler & Hornik, 2012; Nagler, 2014; Greiner, Smith, & Guallar, 2010; Smith, Kromm, & Klassen, 2010; Squiers et al., 2011), as well as the effect it has on the general public (Nagler & Hornik, 2012; Niederdeppe & Levy, 2007; Greiner, Smith, & Guallar, 2010; Smith, Kromm, & Klassen, 2010; Squiers et al., 2011; Jensen et al., 2011), work on the effect contradictory information has on journalists is lacking. Journalists, serving as traditional gatekeepers, have long been trusted with the creation and curation of information (Lewin, 1951; Shoemaker & Vos, 2009). As media technologies have evolved, they have lost exclusivity in that realm (Barzilai-Nahon, 2008). Their role, though evolving, remains a vital part of the process of information dissemination. Thus, it is important to understand how they navigate the changing media landscape while also dealing with an ever-changing, constantly expanding and often contradictory scientific research environment.

This study seeks to help understand contradictory science information through (1) an illustrative case study of two contradictory articles from The Observer on coffee consumption and (2) a discussion of themes that emerged from in-depth interviews with a select group of four general assignment and specialized science journalists. The findings of the coffee consumption case study speak to a bigger trend of distrust in science, and of science in crisis (Jamieson, 2018). The contradiction evident in the comparison of the two articles is presented to readers as a false alarm claim; in plain terms: it doesn't matter, because science doesn't really know and might change its mind soon, so go about your day as usual. This illustrates a core misunderstanding of the fact that science is a constantly evolving process of discovery and nuanced discussion. That nuanced discussion, though, is a challenge to achieve even in professional journalism.

The journalists I interviewed expressed difficulties in navigating the constantly changing media environment. Faced with immense pressure to publish, journalists had minimal time for close reading of the research or fact checking their sources. The interviewees stated that it was not common to have time to read the entirety of the original source before using it to write an article for publication. In the discussion of how journalists chose and evaluated their sources, the general assignment journalists placed the most emphasis on personal judgement and industry connections, while the science journalists valued time to do the research themselves, with help from scientific experts. Science journalists seemed more skeptical of getting quick expert opinions to beef up a story, preferring instead to do their own research in search of the greater scientific consensus. But even when they understand the science and want to represent it accurately, the interviewees said, it is a struggle to find the right words. Both dumbing things down and using the correct scientific terminology pose issues for readers: if journalists use terminology that is too complex, they lose the readers; if they use terminology that is too simple, they lose the science.

This study as a whole illustrates that, though the internet has given a previously passive lay public access not only to content but also to content creation, it has vastly complicated journalistic norms and practice. The world of traditional journalism must adapt to the changing media environment and learn to find the balance between click appeal and scientific accuracy.

Journalists specializing in science-related stories in particular have their work cut out for them, since the world of science is in constant flux, with new studies published constantly, some of which may be highly technical and even seem contradictory. Though science is often romanticized as a flawless system of knowledge-building, this is not always the case (Matosin et al., 2014). And though uncertainty and contradiction are a normal, everyday feature of our lives, scientific uncertainty is not something commonly understood by the lay public:
Name the issue - AIDS, climate change, genetic engineering, whether the universe is a closed or ever-expanding system, cloning, the impact of nature versus nurture on behavior, the basic nature of matter, pesticide effects - and you will find it filled with areas where scientists do not always know what to expect or predict (Friedman, Dunwoody & Rogers, 1999).

In science, it is understood that hypotheses are never proven to be absolutely true. Truth in science can be defined as "the working hypothesis best suited to open the way to the next better one" (Lorenz, as quoted in PLOS, 2005). Science is constantly evolving, each little study adding to a puzzle that may never be fully solved, its edges constantly expanding. Falsifiability of hypotheses, reproducibility of methods, constructive criticism, humble uncertainty in findings and the acknowledgement of limitations are all vital parts of the process. In contrast, journalistic best practices seem to focus on gathering concrete facts from experts, which journalists then report to the public in short bursts of actionable information, "news you can use" (Hitlin & Olmstead, 2018). Nuance seems beyond the scope of the story, and oftentimes journalists do not even feel that it is their duty to educate the public or promote the benefits of science (Allan, 2011). This, coupled with the decrease in the number of specialized science journalists in the field, has led to the uncertainties typical of scientific studies, such as limitations of the research, being left out of media coverage (Brody, 1999). Furthermore, when scientists discuss nuance or possible problems with the research, the media are often quick to process that into narratives of science in crisis (Jamieson, 2018).

Through in-depth interviews with four general assignment and science journalists, I was able to gain additional insight into the workings of the journalism industry in general and science news reporting in particular. This insight is illustrated through a discussion of five key themes extracted from the four 45-60 minute interviews: (1) Journalistic Norms and Best Practices, (2) Time Constraint, Pre-publishing and Corrections, (3) Evaluating Source Credibility and Looking for Consensus, (4) Training Science Journalists and (5) Encoding/Decoding. Though overlapping in certain aspects, each of these key themes illustrates a constraint or incentive the interviewees identified as important in shaping their content creation.

Journalistic Norms and Best Practices

Information in the media is very rarely given as is; it is interpreted and framed, in other words transformed, by the constant curation of journalists. Though we may think of journalists' jobs as similar to reporters', there does not seem to be a strict consensus in the field over what the role of journalists is: to present facts, to provide interpretation and context for facts, or where the line between the two lies. Since journalistic know-how is not necessarily standardized, official practice, but more a matter of experience, personal preference and the location, scope, category and popularity of the media outlet, whether journalists are mere reporters of facts or active curators of information may depend on a variety of factors, one of which may be the expectations of their audience.

Journalists select and frame stories for publication using criteria like frequency, threshold and unambiguity (Galtung & Ruge, 1965), although some have offered modifications of these criteria for contemporary journalism (Harcup & O'Neill, 2001; Brighton & Foy, 2007). Identifying "real" news using specific criteria is a valuable but often problematic approach. Taxonomies of news values, like those offered by Galtung and Ruge or Harcup and O'Neill, may be accurate in explaining the selection of news after the fact, but it is not unusual, when asked how to define news, for a journalist to say something like, "I know it when I see it" (Harcup & O'Neill, 2017). This came up in conversation with the interviewees, who stated that when choosing stories they would use literal criteria and "just like kind of in your head like you know like intuitive criteria, that you would have about what is worthy of coverage." The effects of personal preferences, biases and the desire for clicks, though acknowledged, are seen as a violation of norms, whether it is using your own opinions to frame stories ("there is still, I think, the voice of journalists are still worthy of being heard, but it should be clearer on what is being done") or going against the idea of presenting both sides of the story ("there's this idea in journalism that we have to have a balanced view of both sides but there are clear situations where... In science, in everything, there will always be [...] someone who doesn't see things the way everyone does, oh I mean always. So, to give that 3 percent as much space as the 97 percent is like you know I don't think that that makes sense"). This tension between traditional norms and best practices mirrors the move from traditional gatekeeping to network gatekeeping, where norms and practices have to adjust not only to the vastly larger amount of information available for consumption, but also to the speed with which it needs to be produced.

Time Constraint, Pre-publishing and Corrections

The heightened speed of production that came along with the Internet age forces journalists to produce multiple stories a day, leaving minimal time for close reading of the research or fact checking their sources. The interviewees stated that it was not common to have time to read the entirety of the original source before using it in writing. The literature gives further support to the existence of this practice: in her book Making the News at the New York Times, Nikki Usher describes a typical day in the life of three journalists at the NYT. One of them, Graham Bowley, was covering what was (falsely) predicted to be a huge story over how much money Goldman Sachs had managed to earn in 2010. In an effort to keep up with the speed of the news cycle, Bowley published a story in the morning and then continued to update the already published story online throughout the day as new information and corrections came in from various sources, rushing to file the next addition to his work with every new source that called (Usher, 2014). The story went through twenty-six updates that day. With every other news outlet covering the story as breaking news, the NYT could not wait to publish, but the constant revisions and rush to publish opened up the opportunity for error, oversimplification and conjecture, regardless of how credible each source was.
The speed of the internet age of journalism creates the opportunity for a vast number of errors to be made. As discussed in the interviews, if a journalist is writing multiple stories per day, they simply do not have time to read the original scientific study, get all available consultants or expert informants on the phone, fact check all the information and have it edited before the story needs to go live on the website, preferably before any competitors. One of the science journalists in particular expressed frustration at the time constraint:
That's another one of my pet peeves, like don't tell me I have to turn around a 3,000-word article on a very important subject in 24 hours, that's just bad, that... you can't do a good job in that amount of time. Even a thousand-word article, some issues just need time you know, most issues.

The rush to publication also produces an interesting phenomenon of the internet age: the ability to publish a placeholder story until the real information is available to fill in. One of my interviewees even caught this happening live:
I saw them write an article about... like I had just watched this hearing about something and they had published an article about it. I was like, that was not what that woman said, you know. So, I emailed the author and they're like, oh we're changing that. Just like... like that was an editing error or something. And [...] there was this study recently that like false news travels farther and faster than truth, and it's like I don't know, [...] it makes me feel very sad sometimes to be part of that community because [...] I think we're losing track of what our job is in a lot of ways. Our job is to accurately inform the public, not to be the first to get a story out.

This theme also makes an appearance in Usher's book, in the story of reporter Javier Hernandez, who, just like Bowley, faced the pressure of the live news cycle. Hernandez was reporting the current unemployment numbers and, in the early morning, the placeholder headline at NYT online stated "36,000 Jobs Lost in February; Rate Steady at 9.7 Percent." As more research was done, the headline changed to "U.S. Job Losses in February Obscure View of Recovery." Later that morning, the headline changed drastically to "Jobless Rate Holds Steady, Raising Hopes of Recovery." What had happened was that another reporter had come into the newsroom and, with a quick look at the numbers, told the team working on the story that they had gotten it wrong: these numbers were good news for the American economy. The headline and lead live on the NYT website had been wrong all morning. Immediate is not always right, Usher argues. This is an important lesson, as readers may not read the story again; policy makers may begin to issue statements; and, in the case of business news, financial decisions may be affected (Usher, 2014). In the context of a scientific study, even if we assume the study had been conducted perfectly and the peer review process handled without any error, both of which are unlikely (Ioannidis, 2005), it seems unfortunate, to say the least, that the findings would be injected into the internet with a false interpretation because the journalists were in a rush to get the story out. A question naturally arises here: if there were mistakes, what is to be done?

The idea that emerged from the interviews was that if or when mistakes were caught in a story after publication, it was expected that they would be corrected. This, though, was not done with enthusiasm, unsurprisingly. Though fact-checking and corrections should be a necessary feature of all professional written content, admitting you made a mistake carries the fear of losing credibility as a source of information in the eyes of your readers. This fear is common to journalists and scientists alike, but while a voluntary retraction of a scientific study does not necessarily forever discredit a scientist (though it still carries a certain amount of stigma), self-correction may have a stronger detrimental effect on trust in the media.

Evaluating Source Credibility and Looking for Consensus

In the discussion of how sources are chosen and evaluated, the general assignment journalists placed the most emphasis on personal judgement and industry connections, while the science journalists valued time to do the research themselves, with help from scientific experts. It is a traditional journalistic practice to have informants on speed dial whom one can get on the phone and ask for comment on a story. Science journalists seemed more skeptical of getting quick expert opinions to beef up a story, preferring instead to do their own research, looking specifically for scientific consensus:
It starts with like just using Google Scholar and seeing what's around. I try to avoid it but I will also read like other articles that are written on the subject, like in the mainstream media, but I look for those more to see what they link to than what they are... [...] I spend a lot of time just reading, a lot, so I won't just read like one meta-analysis or one paper, I'll read a few of them and if they're all kind of the same thing then, I think OK, well then there really is consensus for this, it's not just this one study or even just this one meta-analysis. Because I know that science changes so fast, I often look for the most recent I can possibly find, like if I see anything that's like [...] older than 2010 I am skeptical of, I want like something to back that up, that's not enough for me. So, yeah, I read a lot, and then obviously also reach out to experts, but I will also say that I am skeptical of what experts say. I'm not saying that I don't believe them, I just know that they're human beings and that they have a certain perspective and that's not always the case, there are certain issues where [...] I couldn't rely on experts, I really had to use my own judgment.

This notion of looking for consensus seems to be at the heart of research for science journalists, and it was the only way they could explain how they deal with contradiction: looking at all the evidence and seeing what is credible, and what is in the minority, or outside the consensus. Beyond that, when questioned, the journalists struggled to explain how they deal with contradictory messaging, saying that it really depends on the situation. This is interesting, as it suggests there is no formal procedure they can rely on to get them through these types of situations, leaving it up to each individual's personal sense of truth, level of scientific understanding and experience to decide whom to believe.
This brings us to the theme of training and experience more generally. The divide seems to stem from journalistic versus scientific conventions over whether outside experts or personal expertise and judgement serve as the most important step in evaluating evidence. Either way, the story needs to be written so that the target audience, the general public, will be able to consume it. But there is concern that by trying to make science simpler, more actionable and more approachable for the general public, the essence of the scientific process is misrepresented and misapplied. Then, when a new study comes out with different findings, it seems as though science is losing its credibility as a source of facts and expertise.

Training Science Journalists

Though it now seems more common to expect journalists to have a degree in media or communication studies, this is not always the case. Science journalists in particular may be less likely now than before to have a specialized education, which can lead to misunderstandings of the basic ideas they are expected to report on. Conversely, graduates from science fields may enter the journalistic workplace with specialized knowledge but a lack of understanding of journalistic norms. For newer journalists and students, it seems especially difficult to begin navigating this constantly shifting environment where no standard, generalizable rules of conduct are overtly stated. When faced with either option, interviewees stated that scientific training was more important than journalistic training:
Anyone who wants to be a journalist, I would say, you know, I don't want to say you shouldn't get a degree in journalism. You can, it's one way to go about it and I do see how you can like more easily make contacts that way, but... Especially if you're going to be a science journalist, having a background in science... like there are so many people there that write about science but don't understand how the process works at all, and they still kind of think about it how they were taught it in high school, like oh this is a fact, there's no like nuance to that information, and I think that's in large part what leads to a lot of bad science journalism.
This leads us back to the idea of scientific knowledge and understanding the nuances of scientific inquiry. Scientific studies rarely, if ever, state definite facts, absolutely generalizable findings or actionable advice for consumers of that information. A journalist's interpretation when writing a story is what brings that perspective to it, and one source of misinformation in this realm could be journalists' misunderstanding of the practices of science. One interviewee (a science journalist) expressed her concern with nutrition studies being misrepresented:
[Sometimes] they're just bad studies, like a sample size of 15 people and you're going to generalize it like to the general, like everyone. That happens a lot and then those studies are reported as if they're like, as if you could just put in the word "may" and then it's OK. It's like, no, you just shouldn't report on that. You know, just stop reporting on coffee and wine.
Encoding/Decoding

Frustratingly for the interviewees, both dumbing things down and using the correct scientific terminology pose issues:
I use the language "supported by the evidence," "risk factor," but when people read it that's not how they interpret it. [...] They interpret "risk factor" as "cause," they interpret "supported by the evidence" as "proven," and they interpret "no evidence" just like "it's not happening," when it just truly means there is not enough evidence. That's something I've come across actually a lot recently, like this idea when there is no... if a study says there is no evidence for, or evidence is very weak for, a specific phenomenon, that's in large part because there are no studies done on it or the studies are conflicting, that will get changed into "it's not happening" in the mainstream media.
If journalists use terminology too complex, they lose the readers. If they use terminology too simple, they lose the science: "when you get into the nuance, it's less likely for you to keep a reader, [...] but the simplification of information like really just makes it wrong." The solution, then, seems to be for journalists to take on almost an educational, science-literacy approach, using terminology that both represents the science and aims to expand the public's understanding. But even when neutral, true terminology is used, it may be interpreted in ways not intended.
Conclusion
"Has all this proliferation and growth in science communication actually benefited society? I don't know," was how one of my interviewees ended our discussion. Certainly, we are exposed to more information than ever before and are able to access it faster than ever before. But has it become a question of quantity over quality? As media technologies have evolved, journalists have had to navigate the changing media landscape while also dealing with the ever-changing, constantly expanding and often contradictory world of scientific research. This study offered a discussion of themes that sheds some light on how journalists research information, evaluate sources and produce their own content in the new media era. Even with shifting roles and the loss of exclusivity in content creation, journalists remain a trusted source of information dissemination. With an increasingly complex terrain of constantly evolving scientific research, it is more important than ever to understand the constraints and incentives that govern their work.
The journalists I interviewed had concerns about increasing pressure to publish a higher quantity of content in an increasingly short amount of time. Contradictory information was not viewed as a difficulty in research; their focus was on understanding and being able to convey information, whether with help from scientific experts or through their own research skills. Journalists need to learn to find the balance between click appeal for readership purposes and scientific accuracy for educational purposes. Prior science training also seems to be an important factor standing between conveying true and false information, though more research needs to be conducted examining the de-professionalization of the field of science journalism and how it has affected the media environment. If scientific training were required, or at least recommended, for all those who cover science topics, we might see a decrease in seemingly contradictory popular media coverage overall.
The key finding of this study is that journalists do not have a clear, standardized game plan for dealing with contradictory messaging, and one needs to be established, as an agreed-upon, universal journalistic norm, for the sake of consistency and accuracy. In the absence of such a norm, journalists rely on finding scientific consensus and then try to convey it to the public with varying levels of success. With further research, this insight may help us form a better understanding of what factors shape the nature of the information we are all exposed to online, which in turn shapes our perceptions and our behavior.