A new era in residency training has begun. In July, diagnostic radiology became one of the first medical specialties to implement the Next Accreditation System (NAS), an outcomes-based educational system that by 2014 will cover the accreditation of all medical specialties.
Favoring educational outcomes over process and function over form, NAS aims to recognize educational gaps in time to correct them, while giving residency programs real freedom to innovate. An article describing the new program was published online September 13 in the Journal of the American College of Radiology.
In development since 2009, NAS features a continuous accreditation process focusing on an annual review of a program's attrition and changes, scholarly activity, board-exam pass rates, clinical experience, a resident survey, a faculty survey, and milestones data, along with less frequent site visits called clinical learning environment reviews (CLERs), wrote Dr. Brandi Nicholson, Dr. Angelisa Paladin, Dr. Sandra Oldham, and colleagues on behalf of the Association of Program Directors in Radiology's (APDR) residency structure committee.
The JACR article describes the structure of NAS and its new elements, including CLER visits, milestones, faculty-based clinical competency committees (CCCs), and self-study visits.
"One of the main differences with the new system is that it provides more flexibility to program directors to be creative and innovative in how they teach the basics of radiology to our residents," said Nicholson, associate director of the residency program at the University of Virginia, in an interview with AuntMinnie.com. "In the past it was much more structured."
A farewell to PIF
The old accreditation system called for periodic reviews and the completion of a program information form (PIF), as well as a lengthy, all-encompassing site visit by field staff from the Accreditation Council for Graduate Medical Education (ACGME).
The PIF, prepared by the program director in advance of the site visit, helped ACGME evaluate training by assessing how well a program was meeting its common and specialty-specific requirements. Included in the ACGME documentation were demographic data about trainees, exam volumes, methods of evaluating residents or fellows, program evaluations submitted by trainees and faculty, and other specific information regarding the curriculum and educational environment.
However, the PIF process and the site visit were burdensome and inefficient, requiring large amounts of staff time to present an enormous amount of data in a single visit, Nicholson said.
Shorter, more-frequent updates
In contrast, NAS handles the conveyance of this information in smaller chunks with more frequent updates. As a result, less time will be required during CLER site visits, which will occur only every 18 months -- and with as little as 10 days' notice. NAS aims to improve the peer-review system's ability to prepare residents for the practice environment by offering more flexibility, according to the study authors.
Under NAS, "if programs are demonstrating that they're excelling or are above average in teaching residents what they need to know about radiology and how to apply it in clinical care, and they can prove their residents are accomplishing the milestones required on each rotation ... as well as specifics such as knowing how to interpret a chest x-ray ... then programs have flexibility in how they teach the residents that information," Nicholson said. "They can institute creative educational tools in order to make those things happen."
The flexibility also requires that goals be met by the program as a whole, not just by individual residents, Nicholson added: collectively, a program's residents must do well on tests, including the core exam and the physics exams.
Of course, many components of NAS will be familiar: program surveys, case log data, resident scholarly activity, and demographic data are all elements that will live on in the new system, the authors noted. Other elements, such as CLER, are novel and likely unfamiliar to many program directors.
Under NAS, for example, milestone assessment for each resident will be reported every six months, so it won't need to be conveyed as part of CLER visits.
Oversight on a CLER day
CLER comprises six elements related to education: patient safety, quality improvement, care transitions, appropriate resident and fellow supervision, duty hour oversight and fatigue management, and enhanced professionalism. ACGME will use the outcome measurements from CLER to evaluate a program's quality and safety standards, the authors wrote.
A CLER evaluation includes the site visit, the ACGME CLER evaluation committee meeting, and faculty development support in the areas CLER emphasizes, the authors wrote. For now, the CLER program is still in its infancy, running as a pilot at only a handful of institutions, but the goal is to create a national database that drives further improvement in graduate medical education.
The CLER evaluation committee is different from the traditional ACGME review committee, and although they will work together, they will have different goals, the authors noted. While CLER focuses on its six elements, providing feedback to the institutional review committee (IRC), the specialty residency review committees (RRCs), and residency and fellowship programs, ACGME will continue to assess the ability of programs to adhere to established accreditation requirements.
The clinical competency committees, another new aspect of NAS, will consist of faculty members representing key subspecialties or participating institutions. CCCs will meet at least semiannually, working with ACGME to fulfill their role in the evaluation of residents.
Some efficiencies built into the new process will help pare program directors' preparation time, the authors noted. For example, ACGME will require curricula vitae only from program directors, and scholarly activity will be entered as PubMed ID numbers rather than in free-form style. Of course, conference presentations, posters, exhibitions, and book chapters still count as scholarly activity as well.
The annual resident survey was traditionally completed by only a few trainees, but since 2012 all residents and fellows have been required to complete it, providing data on duty hours, faculty evaluations, and more. Beginning in January, "core" program faculty members will also be required to complete a survey.
More objective milestones
Under NAS, residents' learning progress over time will be tracked using so-called milestones, a series of objective performance goals that are rooted in the six core competencies (patient care, practice-based learning, professionalism, etc.) that all residents are familiar with. The milestones are a key part of ACGME's plan to move toward a learner-centered, outcomes-based evaluation of training programs, according to the authors.
Milestones replace a far more subjective system for evaluating residents, one that could seem capricious: it was often based on Likert-style scales of 1 to 5 and a written evaluation that together frequently failed to discern whether residents really had the skills they needed.
"You were either excellent or above average, and most of the time residents were not identified as failing or below average," Nicholson said. "So by putting more objective criteria in the milestone aspect, it's very difficult to slant yourself toward the top of the scale. It's not subjective, it's, 'Here's a skill: Can you do it or not? And if you can't, let's get you to the point where you can.' "
The result is more reproducible, more objective, and fairer to everyone, according to Nicholson.
Freedom and flexibility are other pluses of the milestone approach, the authors noted. For example, the specific time when each level can be achieved can vary, depending on the milestone and the program structure.
Fellowship milestones for diagnostic radiology are still being developed and will go into effect by July 2014.
More demands on staff and trainees?
As NAS has unfolded in residency programs, some program directors have expressed frustration with what can seem like a daunting list of new requirements. Will the new efficiencies really leave everyone better off, or will the new regime, however improved, end up taking more time and effort than the old evaluation system?
"I think the process will require additional time input, and those who are passionate and involved in CCCs are going to embrace it and put in the additional time, and we're going to see improvement in how we evaluate residents," Nicholson said. "Hopefully, the ability to have flexibility in educational initiatives and freedom would be what you get out of putting in the extra work."
Taken together, educator resources like journal articles, the RSNA meeting, and the Association of University Radiologists (AUR) meeting will all contribute toward better understanding of the new requirements among educators, especially in these early years of the new program, Nicholson said.
"Overall, it may not require as much time as people fear it will because there will be plenty of resources out there and people [residents] can talk to to make this process as efficient as possible," she said. "And I think everything gets better with time and practice. So yes, it will be more time-intensive and a little harder in the first year or two, but as we figure out how to do it best, it will end up being not too burdensome."