Results of a national survey of radiologists that aimed to assess improvement opportunities for current peer review systems were published online in advance of the March issue of the American Journal of Roentgenology. The article arrives at an opportune time for Strategic Radiology practices as the coalition embarks on a Peer Learning collaborative pilot program.
A 21-question multiple-choice questionnaire was developed by the American Roentgen Ray Society Quality Improvement subcommittee and emailed to 17,695 ARRS members, with two email reminders. A response rate of 4.2% (742 responses) was achieved, with 547 responses meeting the inclusion criteria. Fifty percent of respondents were in private practices of between 11 and 50 radiologists; about 43% did not use peer review as a group education tool, but close to 68% were notified of their results. Fewer than half used RADPEER, the dominant solution; the remainder used a variety of commercial software.
In opening their report, Lee et al note the widespread use among radiologists of peer review, “a system that uses accuracy as a surrogate marker for competency.” Use of RADPEER counts toward American Board of Radiology MOC Part IV requirements, and accreditation by the ACR is accepted as meeting the Joint Commission requirement for ongoing professional practice evaluation (OPPE).
Yet evidence that peer review systems result in actual improvement and learning has not materialized, as the systems themselves focus on radiologist error rates rather than continuous quality improvement. Problems associated with peer review systems include the awkwardness of rating practice members, potential bias, and threats to job security.
In addition to querying radiologists on what changes they’d like to see in peer review systems, the survey probed the status quo. The largest percentage of respondents (32.1%) reported that 10–20 cases per month are reviewed; 29.1% reported that >20 were reviewed; and 21.7% said <10 were reviewed.
A bit more than half of all respondents (54.5%) reported satisfaction with their peer review system. Quick compliance (100%), ease of use (100%), and confidentiality of results (“results are confidential and cannot be used against me,” 95.6%) were the three most common sources of satisfaction.
Slightly less than half (44.5%) of all respondents reported dissatisfaction with current systems, primarily because of the lack of learning opportunity (94%) and unfair or inaccurate assessment of performance (75.5%). About 75% of dissatisfied respondents cited a cumbersome system as a source of dissatisfaction. Embarrassment (57.2%) and punitive consequences (53.9%) were other sources of dissatisfaction.
Almost half of all respondents reported submission bias and under-reporting, with about the same number reporting that their practice discrepancy rates are significantly lower than the literature suggests.
Almost 75% of respondents reported that their practice’s cultural values support treating mistakes as learning opportunities. Yet only half (52.4%) reported that they feel valued and supported by practice members, and even fewer (37.3%) reported a lack of complacency within the practice, in the form of a continuous focus on outcomes improvement.
What do radiologists most want from a peer review system? The following features were described as “most desired”:
In their discussion of the results, the authors noted a great deal of variety in the way practices implement and manage the peer review process, including the technology used and how results are managed. “The most important element of the peer-learning program may be the peer-learning conference, in which cases are shared with the group and opportunities for improvement are discussed,” they wrote. “Peer-learning conferences usually function better with specific ground rules, including a focus on learning, reliance on principles of just culture, and use of anonymized case discussion relying on evidence-based medicine.”
The authors suggest opportunities to standardize peer review nationally, including determining a minimum number of cases to review and methods of communicating results to radiologists that protect against medico-legal discoverability.
Hub is the monthly newsletter published for the membership of Strategic Radiology practices. It includes coalition and practice news as well as news and commentary of interest to radiology professionals.
If you want to know more about Strategic Radiology, you are invited to subscribe to our monthly newsletter. Your email will not be shared.