Volume 3, Number 4—December 1997
Theme Issue: Foodborne
Identifying and Anticipating New Foodborne Microbial Hazards

Communicating Foodborne Disease Risk

Baruch Fischhoff and Julie S. Downs
Author affiliations: Carnegie Mellon University, Pittsburgh, Pennsylvania, USA


Table. Thought processes involved in decision-making

People simplify. Many decisions require people to deal with more details than they can readily handle at any one time. To cope with the overload, people simplify. People want to know whether foods are "safe," rather than treating safety as a continuous variable; they demand proof from scientists who can provide only tentative findings; and they divide the participants in risk disputes into good guys and bad guys. Such simplifications help people cope, but they also lead to predictable biases (7).
Once people's minds are made up, it is difficult to change them. People are adept at maintaining faith in their beliefs unless confronted with overwhelming evidence to the contrary. One psychologic process that helps people maintain their current beliefs is underestimating the need to seek contrary evidence. Another is exploiting the uncertainty surrounding negative information to interpret it as consistent with existing beliefs (8).
People remember what they see. People are good at keeping track of events that come to their attention (9,10). As a result, if the appropriate facts reach people in a credible way before their minds are made up, their first impression is likely to be the correct one. Unfortunately, it is hard for people to gain firsthand knowledge of many risks, leaving them to decipher the incomplete reports they get.
People cannot readily detect omissions in the evidence they receive. It is unusual both to realize that one's observations may be biased and to undo the effects of such biases. Thus people's risk perceptions can be manipulated in the short run by selective presentations. People will not know and may not sense how much has been left out (11). What happens in the long run depends on whether the missing information is revealed by other experiences or sources.
People may disagree more about what "risk" is than about how large it is. One obstacle to determining what people know about specific risks is disagreement about the definition of "risk" (12-15). For some risk experts, the natural unit of risk is an increase in the probability of death; for others, it is reduced life expectancy; for still others, it is the probability of death per unit of exposure. If lay people and risk managers use the term "risk" differently, they may agree on the facts of a hazard but disagree about its riskiness; the hypothetical sketch following this table illustrates how the choice of unit alone can reverse the apparent ranking of two hazards.
How much does the public know and understand? The answer to this question depends on the risks consumers face and the opportunities they have to learn about them. The next section discusses strategies for improving those opportunities.
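To make the point about units of risk concrete, here is a minimal sketch in Python, assuming two invented hazards with entirely hypothetical exposure rates, fatality probabilities, and life-years lost per death (none of these names or numbers come from the article). Under these assumptions, Hazard A looks riskier when risk is defined as the annual increase in probability of death, while Hazard B looks riskier per unit of exposure and in expected life expectancy lost, so experts who agree on every number can still disagree about which hazard is "riskier."

```python
# Hypothetical illustration: the same facts ranked under three definitions of "risk".
# All hazard names and numbers below are invented for this sketch.

hazards = {
    "Hazard A": {"exposures_per_year": 1000, "p_death_per_exposure": 1e-7, "life_years_lost_per_death": 5},
    "Hazard B": {"exposures_per_year": 2, "p_death_per_exposure": 2e-5, "life_years_lost_per_death": 40},
}

for name, h in hazards.items():
    # Definition 1: increase in probability of death (per year of exposure)
    annual_p_death = h["exposures_per_year"] * h["p_death_per_exposure"]
    # Definition 2: probability of death per unit of exposure
    per_exposure = h["p_death_per_exposure"]
    # Definition 3: expected reduction in life expectancy (years, per year of exposure)
    expected_years_lost = annual_p_death * h["life_years_lost_per_death"]
    print(f"{name}: annual death risk {annual_p_death:.1e}, "
          f"per-exposure risk {per_exposure:.1e}, "
          f"expected life-years lost {expected_years_lost:.1e}")

# Under these invented numbers, Hazard A ranks higher on annual death risk
# (1.0e-04 vs 4.0e-05), but Hazard B ranks higher per exposure (2.0e-05 vs 1.0e-07)
# and in expected life-years lost (1.6e-03 vs 5.0e-04).
```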


References
1. Fischhoff B, Bostrom A, Quadrel MJ. Risk perception and communication. In: Detels R, McEwen J, Omenn G, editors. Oxford textbook of public health. London: Oxford University Press; 1997. p. 987-1002.
2. Krimsky S, Golding D, editors. Social theories of risk. Westport (CT): Praeger; 1992.
3. Slovic P. Perception of risk. Science. 1987;236:280-5.
4. Lammerding AM. Quantitative microbial risk assessment. Emerg Infect Dis. 1997. In press.
5. National Research Council. Improving risk communication. Washington (DC): National Academy Press; 1989.
6. Carter MW, editor. Radionuclides in the food chain. Berlin: Springer-Verlag; 1988.
7. Kahneman D, Slovic P, Tversky A, editors. Judgment under uncertainty: heuristics and biases. New York: Cambridge University Press; 1982.
8. Gilovich T, editor. How we know what isn't so. New York: Free Press; 1993.
9. Hasher L, Zacks RT. Automatic and effortful processes in memory. J Exp Psychol Gen. 1984;108:356-88.
10. Peterson CR, Beach LR. Man as an intuitive statistician. Psychol Bull. 1967;69:29-46.
11. Fischhoff B, Slovic P, Lichtenstein S. Fault trees: sensitivity of assessed failure probabilities to problem representation. J Exp Psychol Hum Percept Perform. 1978;4:330-44.
12. Crouch EAC, Wilson R, editors. Risk/benefit analysis. Cambridge (MA): Ballinger; 1981.
13. Fischhoff B, Watson S, Hope C. Defining risk. Policy Sci. 1984;17:123-39.
14. Slovic P, Fischhoff B, Lichtenstein S. Rating the risks. Environment. 1979;21:14-20, 36-9.
15. Fischhoff B, Svenson O. Perceived risk of radionuclides: understanding public understanding. In: Harley JH, Schmidt GD, Silini G, editors. Radionuclides in the food chain. Berlin: Springer-Verlag; 1988. p. 453-71.
16. Fischhoff B. Risk perception and communication unplugged: twenty years of process. Risk Anal. 1995;15:137-45.
17. Schriver KA. Evaluating text quality: the continuum from text-focused to reader-focused methods. IEEE Trans Prof Commun. 1989;32:238-55.
18. Bursztajn HJ, Feinbloom RI, Hamm RA, Brodsky A, editors. Medical choices, medical chances: how patients, families, and physicians can cope with uncertainty. New York: Routledge; 1990.
19. Raiffa H, editor. Decision analysis: introductory lectures on choices under uncertainty. Reading (MA): Addison-Wesley; 1968.
20. Merz J, Fischhoff B, Mazur DJ, Fischbeck PS. Decision-analytic approach to developing standards of disclosure for medical informed consent. Journal of Toxics and Liability. 1993;15:191-215.
21. Burns WJ, Clemen RT. Covariance structure models and influence diagrams. Manage Sci. 1993;39:816-34.
22. Howard RA. Knowledge maps. Manage Sci. 1989;35:903-22.
23. Fischhoff B, Downs J. Accentuate the relevant. Psychol Sci. 1997;8:154-8.
24. Morgan MG, Henrion M, editors. Uncertainty. New York: Cambridge University Press; 1990.
25. Poulton EC, editor. Bias in quantifying judgment. Hillsdale (NJ): Lawrence Erlbaum; 1989.
26. Bostrom A, Fischhoff B, Morgan MG. Characterizing mental models of hazardous processes: a methodology and an application to radon. J Soc Issues. 1992;48:85-100.
27. Bruhn CM. Consumer education. Emerg Infect Dis. 1997. In press.
28. Fischhoff B, Riley D, Kovacs D, Small M. What information belongs in a warning label? Psychol Mark. 1998. In press.
29. Funtowicz S, Ravetz J, editors. Uncertainty and quality in science for policy. London: Kluwer; 1990.


The conclusions, findings, and opinions expressed by authors contributing to this journal do not necessarily reflect the official position of the U.S. Department of Health and Human Services, the Public Health Service, the Centers for Disease Control and Prevention, or the authors' affiliated institutions. Use of trade names is for identification only and does not imply endorsement by any of the groups named above.