It used to be that an automobile dashboard consisted of a few simple, circular instrument displays -- a fuel gauge, a speedometer, an engine temperature gauge. Nowadays, drivers get detailed information about how their cars are performing via the dashboard panel, which can include everything from interior temperature controls and miles driven per trip to global positioning systems and, if it's a hybrid, electricity-to-gas ratios.
But dashboards aren't just for cars anymore. Hospitals can create their own information matrix to assess performance and determine areas that need improvement within any department via a "dashboard" -- a computer program that gathers information about the department and shapes it into reports, according to Joni Schott, business operations manager at the University of Wisconsin Hospital and Clinics in Madison. Schott offered an overview of how hospitals can create a radiology dashboard at the 2007 American Healthcare Radiology Administrators (AHRA) meeting in Orlando, FL.
Information included in the dashboard can be presented in a number of ways, from a visually styled chart to a "scorecard" that lists performance objectives related to an area of focus and lays out actual achievements against goals.
"A dashboard allows hospital departments to track data in visual form," Schott said. "And it consists of scorecards that give staff an overview of how the department is doing in particular areas -- a timely and accurate snapshot of overall performance."
Developing the dashboard
Putting together a dashboard that will allow the department to track its progress toward its goals and communicate that information to its staff via scorecard reports takes time, Schott said. But it can be broken down into discrete phases. In phase one, the group developing the dashboard (which at the University of Wisconsin included two people from the department's decision support team, the quality and education manager, Schott, an administrative assistant, and the director of radiology) defines and plans its scope; in phase two, the group designs the dashboard; in phase three, the group builds it; in phase four, the group introduces the tool to the department and trains its users; and in phase five, the group puts the dashboard to work and refines it.
In phase one, identify the project leads and contributors, set a timeframe for completing the dashboard -- three months or less is best, according to Schott -- set target dates for each step, and identify the audience for the tool. Prepare to conduct a SWOT (strengths, weaknesses, opportunities, and threats) analysis of the department, and select the key issues that will be reflected in the dashboard and assessed via the scorecard.
A good scorecard has the following characteristics, Schott said (a sketch of what one scorecard entry might look like follows the list):
- It is outcome-focused and linked to objectives.
- It is balanced along all perspectives.
- It mixes leading and lagging indicators.
- It has targets associated with the facility's objectives, and the targets are set to trigger action.
- Data is dynamic, not stale.
- Responsible parties are identified.
- Scorecard results are communicated to those who can effect change.
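To make those characteristics concrete, the sketch below shows what a single scorecard entry might look like as a data structure. This is a minimal illustration in Python with assumed field names (objective, indicator type, target, actual value, data date, responsible party); Schott did not publish her team's actual scorecard format.

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class ScorecardMeasure:
        """One scorecard entry; field names are illustrative, not Schott's."""
        objective: str          # departmental objective this measure links to
        indicator_type: str     # "leading" or "lagging"
        target: float           # target tied to the facility's objectives
        actual: float           # most recent observed value
        as_of: date             # when the data was last refreshed (keeps it from going stale)
        responsible_party: str  # who is expected to act when the measure slips

        def meets_target(self) -> bool:
            # Simplistic "higher is better" comparison; some measures run the other way
            return self.actual >= self.target

    # Example entry: a lagging patient-satisfaction measure
    satisfaction = ScorecardMeasure(
        objective="Improve patient satisfaction",
        indicator_type="lagging",
        target=90.0,
        actual=86.5,
        as_of=date(2007, 8, 31),
        responsible_party="Modality manager",
    )

A record like this keeps each measure tied to an objective, a target that can trigger action, and a named owner -- the same properties Schott lists above.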
"A dashboard can really communicate progress and empower department workgroups to make improvements," Schott said. "It also provides opportunities to integrate a department's efforts with quality and IT departments."
Schott and the team did some brainstorming to identify issues to include in the dashboard, and ended up with the following: patient satisfaction, employee growth and development, clinical effectiveness, quality, safety, market position, operational efficiency, and financial health.
Establishing responsibility
In the second phase of the dashboard development process, Schott and her team asked each modality manager within the department to conduct a SWOT analysis for his or her area. They then carefully went over the list of key measures and objectives, ensuring that they aligned with their organization's overall goals. They set criteria for "red lines" -- the baseline values that would trigger "variance reports," or alerts that performance in a particular area is lagging and needs to be addressed.
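As a rough illustration of how a "red line" can trigger a variance report, the sketch below compares a measure's current value against its baseline and produces an alert only when performance lags. The threshold values, report fields, and function names are hypothetical; the article does not describe how Schott's team implemented the check.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class VarianceReport:
        measure: str
        actual: float
        red_line: float
        assigned_to: str
        note: str

    def check_red_line(measure: str, actual: float, red_line: float,
                       assigned_to: str, lower_is_worse: bool = True) -> Optional[VarianceReport]:
        """Return a variance report if the measure crosses its red line, else None."""
        lagging = actual < red_line if lower_is_worse else actual > red_line
        if not lagging:
            return None
        return VarianceReport(
            measure=measure,
            actual=actual,
            red_line=red_line,
            assigned_to=assigned_to,
            note=f"{measure} at {actual} crossed red line {red_line}; follow-up needed.",
        )

    # Example: report turnaround has slipped below its internal baseline
    alert = check_red_line("Reports finalized within 24 hours (%)",
                           actual=78.0, red_line=85.0,
                           assigned_to="CT modality manager")
    if alert:
        print(alert.note)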
"We wanted our dashboard to help us improve our own internal scores, rather than to compare our performance against another facility," Schott said.
A key component of this phase was figuring out exactly who would be responsible for acting on the variance reports for particular measures, and identifying where the data for the dashboard would come from (Did it already exist? How often would it be reported?). Schott developed a data worksheet to help her team gather the information to be included in the dashboard.
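Such a worksheet might be no more than a table with one row per measure. The sketch below shows one hypothetical way to record those rows; the column names and example entries are assumptions for illustration, since the actual worksheet was not published.

    # Hypothetical data worksheet rows: where each measure's data comes from,
    # whether it already exists, how often it is reported, and who owns the
    # variance report for it.
    data_worksheet = [
        {
            "measure": "Patient satisfaction score",
            "data_source": "Patient survey",
            "already_collected": True,
            "reporting_frequency": "quarterly",
            "variance_owner": "Quality and education manager",
        },
        {
            "measure": "Exam-to-report turnaround",
            "data_source": "RIS timestamps",
            "already_collected": False,   # would need a new extract from IT
            "reporting_frequency": "monthly",
            "variance_owner": "Modality manager",
        },
    ]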
In phase three of the dashboard development process, the team worked with their facility's IT department to set up user accounts, enter historical and current data, and set up a variance report process. Phases four and five covered training for department staff and beginning to use the dashboard to assess the department's level of professionalism and the efficiency of its initiatives, as well as to welcome new ideas into the dashboard project.
Schott's advice to administrators building their own radiology dashboard? Automate as much manual data entry as possible, commit to completing the project within a concrete time window, and, most importantly, build it right the first time.
"You can always adjust the data, but it's harder to change the framework," Schott said. "Do it right at the outset and you'll find it's more successful."
By Kate Madden Yee
AuntMinnie.com staff writer
September 10, 2007
Related Reading
Study finds radiologists happy with their work -- sort of, August 3, 2007
Achieving the Zen of PACS administration, July 23, 2007
Boost revenue by tapping an untapped resource: Doctors, July 20, 2007
The 12 habits of highly successful radiology groups, June 4, 2007
Governance means enabling practice policy, not coddling personalities, March 2, 2007
Copyright © 2007 AuntMinnie.com