Are two heads better than one when radiologists are making diagnostic decisions? Doctors are human, and humans are prone to mistakes -- a point we all understand, even those of us who never read the 2000 Institute of Medicine report To Err is Human.
Humans are not machines that can be refined by Six Sigma methodologies to an error rate of 0.00034%. Doctors make mistakes. This is why they carry professional liability insurance. This is also why some patients pursue a second opinion from a different doctor. Getting a second opinion takes time and often costs extra money since most health insurance plans will not routinely cover second-opinion visits for every aspect of one's medical care.
In the world of radiology, patients can get copies of their X-ray, CT scan, MRI or other films and send them to another radiologist for a second opinion. In the past, this was very difficult since patients did not have easy access to their own radiology films. However, digital radiology technology has made it easier than ever for patients to transport their radiology films.
Digital radiology has also made it easier for radiologists to get a second opinion from their radiology colleagues. Once again, these formal or informal second-opinion interpretations may not always lead to reimbursement from health insurance plans, but they can lead to improved patient care if they are performed at the appropriate time. Previous research from the Radiological Society of North America (RSNA) shows that some radiologists consider peer review controversial because of the variance between interpretations of what constitutes an "abnormal imaging finding." In effect, defining accuracy itself is subjective in some cases.
As U.S. radiologists convene for RSNA 2013 in Chicago, some of their Canadian counterparts in Ontario have already forged ahead in using digital platforms to facilitate radiology peer review. The Integrated Department of Diagnostic Services at Hamilton Health Sciences (HHS) and St. Joseph's Healthcare Hamilton (SJHH) are early adopters of using digital radiology technology in a framework that combines a radiology peer review and quality assurance platform working in real time.
In October, four hospitals in Canada started using this platform, called DiaShare, from Real Time Medical. The project is the first demonstration of a cross-institutional, cross-system, prospective peer-review platform. In essence, these hospitals and radiologists are working collaboratively to catch human error and mistakes in diagnostic reporting.
With DiaShare, hospital executives are able to view the quality of radiology diagnostic reports in real time and measure changes and improvements over time. "The pilot that we've undertaken with Real Time Medical is a real game changer around the quality and safety agenda. Our experience with other systems we've used in the past is that they are static and retrospective," said David Wormald, integrated assistant vice president for diagnostic services at HHS and SJHH, in a written statement.
I can see how this type of collaboration can work in Canada. The approach is innovative -- but most importantly, the culture of the healthcare system in Canada permits this type of approach. This could also work in the United States if all the participating hospitals and radiologists were part of the same integrated system.
For example, I could see this happening within a delivery system like Kaiser Permanente, since the hospitals belong to a single network and all the radiologists are employed by the system. Or maybe this type of collaboration could work within the setting of an accountable care organization, where reimbursement is tied to quality of care, not fee for service. I could also see this working in the Veterans Administration health system. Two heads put together are better than one when you're actively trying to reduce human error.
Unfortunately, so many medical systems in the United States remain highly fragmented, and many are highly competitive. There is no way that some of these systems would ever consider a similar cooperative, collaborative approach like the one we're seeing in Canada.
If we are to improve care quality, we must be willing to adopt a framework that provides a cross-system, prospective peer-review platform. I think I would be more inclined to receive my care at a hospital where I know there are several layers of checks and balances to minimize human error rates even further.
Hospitals have come a long way in reducing medical errors since they started instituting checklists, quality improvement initiatives and other forms of clinical-process error detection based on Six Sigma methodologies, but they still have room for improvement. Perhaps if the culture starts to shift in one department, such as the radiology department, the rest of the organization will adopt that culture across other areas.
Joseph Kim is a physician technologist who has a passion for leveraging health IT to improve public health. Dr. Kim is the founder of NonClinicalJobs.com and is an active social media specialist. Let us know what you think about the story; email firstname.lastname@example.org or contact @SearchHealthIT on Twitter.