Dangers When Doctors Ignore Science
In the 1980s, researchers began raising alarms about the lack of current scientific knowledge among practitioners in a profession clearly based on science: medicine. Although hundreds of medical studies were published every year to update our understanding of the human body, only a small percentage of doctors consulted that literature to help patients. The sheer number of studies was a major barrier at a time when they were available only in paper journals. But even now that this knowledge is digital and easily searchable, most doctors fail to consult it.
As a result, less effective or ineffective treatments continue to be used in the United States, harming patients and wasting millions of health care dollars every year. For example, The New York Times reported in 2018 that meta-analyses show heart stents in stable patients do not reduce heart attacks or deaths any better than medical treatment, or relieve chest pain better than sham procedures. (A “meta-analysis” pools the results of many studies and analyzes them as if they were one large study.) The stent procedure costs $10,000 and causes bleeding emergencies in some cases. Yet stents continue to be implanted routinely.
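For readers curious about the mechanics, a common way a meta-analysis pools studies is an inverse-variance weighted average: each study's effect estimate is weighted by its precision, so large, tight studies count for more. A minimal sketch, with invented effect sizes and standard errors purely for illustration:

```python
# Fixed-effect meta-analysis: pool study effects by inverse-variance weighting.
# All numbers below are invented for illustration, not real trial data.

def pool_fixed_effect(effects, std_errors):
    """Return the pooled effect estimate and its standard error."""
    weights = [1 / se**2 for se in std_errors]   # precision (1/variance) of each study
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = (1 / sum(weights)) ** 0.5        # standard error of the weighted mean
    return pooled, pooled_se

# Three hypothetical trials: risk difference for the outcome vs. a comparison treatment.
effects = [0.02, -0.01, 0.00]          # per-study effect estimates
std_errors = [0.03, 0.02, 0.01]        # smaller SE = larger, more precise study

pooled, se = pool_fixed_effect(effects, std_errors)
# The most precise third study dominates, pulling the pooled effect toward zero.
```

The point of the technique is visible in the weights: the noisiest study contributes roughly a tenth as much as the most precise one.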
The danger of relying on one’s current knowledge was highlighted by a review of 100 diagnostic errors made by internists, which revealed that 74% involved cognitive mistakes. The study reports that “the failure to continue considering reasonable alternatives after an initial diagnosis was reached was the single most common cause.”
In 1991 a medical school apparently coined the term “evidence-based practice” (EBP) to describe an approach for narrowing the gap between what medical researchers know and what practitioners do. Two business professors report that EBP has “the goal of promoting the more systematic use of scientific evidence in physician education and clinical practice… It arose out of recognition that physicians had tended to prioritize tradition and personal experience (over scientific evidence), giving rise to troubling variation in treatment quality.” The EBP concept has spread to fields as varied as conservation, psychotherapy, social work, nonprofits, and fitness—and played a key role in the TV show House, the professors note, where a team of diagnostic specialists regularly combed through research studies to find answers for difficult cases.
The EBP movement in medicine led to the development of easier search methods; training for doctors and nurses on using EBP; inclusion in some medical school curricula; and, most successfully, checklists for specific conditions based on best practices. Though EBP remains underused, awareness and institutional adoption are growing, driven by the desire to reduce costs and legal risks.
The Gap Spreads to Management
Management researchers have recognized that the same gap exists in their realm. Stanford professors Jeffrey Pfeffer and Robert Sutton published an oft-cited book on it, The Knowing-Doing Gap, in 2000.
“It is hardly news that many organizations do not implement practices that research has shown to be positively associated with employee productivity and firm financial performance…” wrote one team of scientists. “The gap between science and practice is so persistent and pervasive that some have despaired of its ever being narrowed.”
Writing in 2011, two industrial/organizational (I/O) psychologists said, “In the last 2 decades, considerable soul searching by management scholars over the research–practice ‘gap’ or ‘divide’ has raised difficult questions about why the gap exists, how to bridge it, and the value and purpose of management research itself.” Put another way, if managers are not learning and applying this knowledge to get better results, what is the point of the research?
In a large study of human resources (HR) leaders, the researchers reported, “our results suggest that there are in fact very large differences across companies in what their HR leaders know about best practices in HR and, furthermore, that the average level of knowledge does not appear to be very impressive.” One explanation was the sources these leaders consulted. HR Magazine, a trade publication with limited science reporting, was the only periodical their respondents read “usually” out of 17 research and trade sources listed. Fewer than 1% of the leaders “usually” read the three HR-relevant scientific journals on the list, and they rarely read Harvard Business Review, a blend of science and expert opinion. Respondents mostly turned to other HR people in their organization for help with HR problems, followed closely by the Society for Human Resource Management and other websites, which often present opinion more than scientific findings.
One guide on using evidence in business describes a study that “gathered 80,000 expert predictions and compared them to what actually happened. The results were devastating. Academics, government officials, journalists and other pundits performed worse than ‘dart-throwing monkeys’ in forecasting the future. Indeed, those specialists who had more detailed subject knowledge seemed to perform even worse than average.”
The problem spawned a corollary to EBP in the business world:
“Evidence-based management (EBM) is about making decisions through the conscientious, explicit, and judicious use of four sources of information: practitioner expertise and judgment, evidence from the local context, a critical evaluation of the best available research evidence, and the perspectives of those people who might be affected by the decision” [italics added].
Notice that scientific evidence isn’t the only source. Most of us already use evidence drawn from what we read; from advice offered by peers, classes, or conferences; and from our own successes and failures. However, failing to use good-quality evidence, and to recognize the nonrational forces at work when we evaluate the remaining “anecdotal” evidence, leads to worse outcomes.
EBM thus calls for a measured approach to critical decisions in which managers and consultants review the scientific literature as a starting point, prioritizing well-designed studies, ideally systematic reviews and meta-analyses that cover the entire body of research on a topic. The quantity and quality of evidence from each source must be assessed in choosing a direction. In some circumstances lesser-quality evidence may deserve greater weight; for example, you might choose a process based on safety concerns despite solid evidence that an alternative is more efficient.
Evidence for EBM
Although thinkers from Aristotle to Stephen Hawking have extolled the virtue of making decisions based on evidence, it is fair to ask if there is any evidence supporting the use of EBM. The academic Center for Evidence Based Management says research indicates:
- “Forecasts or risk assessments based on the aggregated (averaged) professional experience of many people are more accurate than forecasts based on one person’s personal experience (provided that the forecasts are made independently before being combined).
- “Professional judgments based on hard data or statistical models are more accurate than judgments based on individual experience.
- “Knowledge derived from scientific research is more accurate than the opinions of experts.
- “A decision based on the combination of critically appraised evidence from multiple sources yields better outcomes than a decision based on a single source of evidence.
- “Evaluating the outcome of a decision has been found to improve both organizational learning and performance, especially in novel and non-routine situations.”
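The first bullet has a simple statistical basis: when forecasters err independently, their errors partially cancel in the average. A toy simulation (all parameters invented) makes the point:

```python
# Wisdom-of-crowds sketch: averaged independent forecasts beat a single forecaster.
# TRUE_VALUE, the noise level, and the group size are all invented for illustration.
import random

random.seed(42)  # make the simulation reproducible

TRUE_VALUE = 100.0      # the quantity everyone is trying to forecast
N_FORECASTERS = 50
N_TRIALS = 1000

def forecast():
    # Each forecaster is unbiased but noisy.
    return TRUE_VALUE + random.gauss(0, 20)

solo_err = crowd_err = 0.0
for _ in range(N_TRIALS):
    forecasts = [forecast() for _ in range(N_FORECASTERS)]
    solo_err += abs(forecasts[0] - TRUE_VALUE)                      # one person's guess
    crowd_err += abs(sum(forecasts) / N_FORECASTERS - TRUE_VALUE)   # the group average

solo_err /= N_TRIALS
crowd_err /= N_TRIALS
# The average forecast's error shrinks roughly by a factor of sqrt(N_FORECASTERS).
```

Note the proviso from the bullet is baked in: each call to `forecast()` is independent. If forecasters anchor on each other, the errors correlate and the cancellation largely disappears.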
Combining Knowledge and Practice
None of this is meant to suggest that articles by consultants, managers, or journalists have no value—this is one, after all! You have to rely on them when no harder evidence exists, and even when it does, experience is needed to determine when and how to apply the results. Personal stories can also be very helpful in convincing doubters by putting a human face on the data. However, if there is a way, proven by objective evidence, to prevent or reduce a management problem you are facing, and you don’t try to learn what that way is, that choice is part of the problem. Why not invest a small portion of the time you currently waste on the problem into learning what we really know about it before choosing your next fix?
Please share this post at the bottom of the page.
Note: This is the first in a series of posts about EBM. In the last, I’ll show you how to do it!
 Aaron E. Carroll, “Heart Stents Are Useless for Most Stable Patients. They’re Still Widely Used.,” The New York Times, February 12, 2018, sec. The Upshot, https://www.nytimes.com/2018/02/12/upshot/heart-stents-are-useless-for-most-stable-patients-theyre-still-widely-used.html.
 Mark L. Graber, Nancy Franklin, and Ruthanna Gordon, “Diagnostic Error in Internal Medicine,” Archives of Internal Medicine 165, no. 13 (July 11, 2005): 1493–99, https://doi.org/10.1001/archinte.165.13.1493.
 Edward J. Mullen, Sarah E. Bledsoe, and Jennifer L. Bellamy, “Implementing Evidence-Based Social Work Practice,” Research on Social Work Practice 18, no. 4 (July 2008): 325–38, https://doi.org/10.1177/1049731506297827.
 Rob B. Briner and Denise M. Rousseau, “Evidence-Based I–O Psychology: Not There Yet,” Industrial and Organizational Psychology 4, no. 01 (March 2011): 3–22, https://doi.org/10.1111/j.1754-9434.2010.01287.x.
 Sara L. Rynes, Tamara L. Giluk, and Kenneth G. Brown, “The Very Separate Worlds of Academic and Practitioner Periodicals in Human Resource Management: Implications for Evidence-Based Management,” Academy of Management Journal 50, no. 5 (2007): 987–1008.
 Briner and Rousseau, “Evidence-Based I–O Psychology.”
 Sara L. Rynes, Amy E. Colbert, and Kenneth G. Brown, “HR Professionals’ Beliefs about Effective Human Resource Practices: Correspondence between Research and Practice,” Human Resource Management 41, no. 2 (2002): 149–74, https://doi.org/10.1002/hrm.10029.
 Nesta and Alliance for Useful Evidence, “Using Research Evidence for Success: A Practice Guide” (Nesta/Alliance for Useful Evidence), accessed March 21, 2018, https://www.alliance4usefulevidence.org/assets/Using-Research-Evidence-for-Success-A-Practice-Guide-v6-web.pdf.
 Rob B. Briner, David Denyer, and Denise M. Rousseau, “Evidence-Based Management: Concept Cleanup Time?,” Academy of Management Perspectives 23, no. 4 (2009): 19–32.
 Listed below.
 Denise M. Rousseau and Brian C. Gunia, “Evidence Based Practice: The Psychology of EBP Implementation,” Draft, Annual Review of Psychology, 2015, https://www.cebma.org/wp-content/uploads/Rousseau-Gunia-ARP-2015.pdf.
 Eric Barends, Denise M. Rousseau, and Rob B. Briner, “Evidence-Based Management: The Basic Principles” (Amsterdam, The Netherlands: Center for Evidence Based Management, 2014).
- Eric Barends, Denise M. Rousseau, and Rob B. Briner, “Evidence-Based Management: The Basic Principles” (Amsterdam, The Netherlands: Center for Evidence Based Management, 2014).
- Briner and Rousseau, “Evidence-Based I–O Psychology.”
- Anthony R. Kovner, “Evidence-Based Management: Implications for Nonprofit Organizations: Evidence-Based Management,” Nonprofit Management and Leadership 24, no. 3 (March 2014): 417–24, https://doi.org/10.1002/nml.21097.
- Edward E. Lawler, “Why HR Practices Are Not Evidence-Based,” Academy of Management Journal 50, no. 5 (2007): 1033–1036.
- Mullen, Bledsoe, and Bellamy, “Implementing Evidence-Based Social Work Practice.”
- Rynes, Giluk, and Brown, “The Very Separate Worlds of Academic and Practitioner Periodicals in Human Resource Management.”