I started to cry. What a shock! It was the first time I had been officially evaluated since making partner at my first group, and I did not like it -- not at all. I knew I was good, a hard worker, and a team player. Doctors work hard, and we do a good job; we are leaders who do good and heal. That was more than 10 years ago, and how the landscape has changed.
We are all constantly evaluated on our productivity, accuracy, and customer satisfaction. It is impossible to be at the top across the board, all the time. It was a kinder, gentler world when I grew up in medicine, a world without data, where the quality of your work was judged by your colleagues without attention to "metrics." Maybe it wasn't always good: a temper tantrum by a doctor was excused as brilliance under stress. Seldom were doctors reported for bad behavior, and that was good -- right?
Well, maybe not. When the team is afraid of us, we can't really be effective. If someone is afraid to tell us that we are about to make a mistake, and we do harm, we all lose!1
So how do we do it better? How do we allow our egos to step back so we can hear what we need to improve our practice?
Recently, I thought that if we surveyed the doctors who refer to our department to better understand their referral patterns, maybe we could gain some insight into how we could all do better: the hospital, the technologists, registration, and -- closest to my heart -- the radiologists.
Well, we got more than 100 responses, which was not too bad. From the feedback, we learned that access to service was a major issue, but the department's head of operations doubted the findings, saying, "This is not true." Yet 87% of the physicians who responded said they felt it was a problem. The radiologists loved hearing that they were smart and engaging, but they denied that they were perceived as unavailable. Surprised?
The survey is a powerful tool if used well, and our own board suggests that it be used as a project for our maintenance of certification (MOC).2 Denial of the results is all too easy, but it's not useful! We often hear only what we want to hear, and opportunities for improvement are ignored because, as doctors, we already know we are doing our best, and that is more than enough.
The Joint Commission has instituted a process for the ongoing evaluation of all physicians, called Ongoing Professional Practice Evaluation (OPPE), which has been adopted by hospitals nationwide. Yet how many of us pay attention? We have to start paying attention, because the metrics are there, and they're being used to evaluate us whether we are aware of it or not.
So how can we make this meaningful and use it to drive quality in our work? There are six competencies we must address:
- Patient care
- Medical/clinical knowledge
- Practice-based learning and improvement
- Interpersonal and communication skills
- Systems-based practice
- Professionalism
The American Board of Radiology (ABR) addresses each of these as well,3 but how many of us really pay attention?
What should be examined, not only to rubber-stamp the necessary paperwork, but to really drive quality? The process should be designed to identify star performers, not just poor performers. How do we do this -- and do it fairly? How do we make it work?
We have no time to add another layer of busywork to our day, so the challenge is to develop and use a data-driven system with automatic data feeds to gather the information. The information must be analyzed and some sort of meaningful conclusion reached -- all before it is fed to the committee responsible for granting our hospital privileges.
Having a robust automated tool to continually monitor us would go a long way toward creating meaningful evaluations of our work, and it would let us show how much we really contribute to our community -- and prove how good we are. We can drive this process if we are proactive and create meaningful ways to evaluate ourselves.
Case review is only one piece of this puzzle and only one way to perform peer review; reviews of significant events and perception data can also feed into a thorough and complete peer review.4 We have to take this seriously, because if we don't, others will drive our evaluations. Radiologists need to work together nationally to make sure the evaluation tools are appropriate, and then we need to use this information to improve our care delivery. The insurance companies are already evaluating us, and we need to stay ahead of them.
Engagement is key -- and also very hard to get from radiologists. Objectivity is also very important. Peer review can be used as a weapon if it isn't done objectively; under a microscope, many flaws can be seen. Who reviews the peer review results? What is fair? Culturally, we don't like to do peer review, even when it is easy to do.5 We don't like to "waste time" or to "correct a peer."
Accurate data are also essential; once we have robust metrics, the hospital can use them as a report card, both for us and for our department.
How are we doing? We can say, "We are fantastic!" And we can prove it.
References
- Frankel AS, Leonard MW, Denham CR. Fair and just culture, team behavior, and leadership engagement: The tools to achieve high reliability. Health Serv Res. 2006;41(4 Pt 2):1690-1709. doi: 10.1111/j.1475-6773.2006.00572.x.
- PQI project guidelines and templates. American Board of Radiology website. http://theabr.org/moc-vir-pqiguides. Accessed April 4, 2014.
- Risk management, communication, professionalism: ABR perspective (March 21, 2012). American Board of Radiology website. http://www.theabr.org/sites/all/themes/abr-media/doc/Becker-Bosma%202012%20AUR%20presentation.pdf. Accessed April 4, 2014.
- Kaewlai R, Abujudeh H. Peer review in clinical radiology practice. Am J Roentgenol. 2012;199(2):W158-W162. doi: 10.2214/AJR.11.8143.
- Keen C. Automated radiologist peer review offers workflow benefits. AuntMinnie.com. July 3, 2012. http://www.auntminnie.com/index.aspx?sec=sup&sub=ris&pag=dis&ItemID=99854. Accessed April 4, 2014.
Dr. Mary Morrison Saltz is a board-certified diagnostic radiologist. She is currently chief clinical integration officer for community practice initiatives at Stony Brook Medicine and a member of their department of radiology. She is also CEO of Imagine Image Innovation, a company that uses big data to improve delivery of radiology services.
She has also served as chief medical officer for Hospital Radiology Partners and as radiology chair and service chief at hospitals in Florida, Ohio, and Georgia. Dr. Saltz is a member of the American College of Physician Executives, the American College of Radiology, and RSNA, and she sits on the Citizens Advisory Council of the Duke Cancer Institute.
A hands-on leader, Dr. Saltz works hand in hand with hospital administration to guide radiology teams to success. She has led quality assurance programs in Florida and Ohio, and she served as chief quality officer for community practice initiatives at Emory Healthcare. She also has more than 20 years of private-practice experience.
She is a graduate of McGill University, with a Bachelor of Science in Human Genetics, and Duke University, where she obtained her Doctor of Medicine. Her postgraduate education included a residency at Boston University, where she served as chief resident, and a fellowship in interventional abdominal radiology at Massachusetts General Hospital.
The comments and observations expressed herein do not necessarily reflect the opinions of AuntMinnie.com.