Assessment of radiology physicians by a regulatory authority

Radiology. 2008 Jun;247(3):771-8. doi: 10.1148/radiol.2473071431. Epub 2008 Mar 28.

Abstract

Purpose: To determine whether it is possible to develop a feasible, valid, and reliable multisource feedback program for radiologists.

Materials and methods: Surveys with 38, 29, and 20 items were developed to assess individual radiologists by eight radiologic colleagues (peers), eight referring physicians, and eight co-workers (eg, technologists), respectively, by using five-point scales along with an "unable to assess" category. Radiologists completed a self-assessment on the basis of the peer questionnaire. Items addressed key competencies related to clinical competence, collegiality, professionalism, workplace behavior, and self-management. The study was approved by the University of Calgary Conjoint Health Research Ethics Board.
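The study does not publish its data format; purely as an illustration, the ratings for one radiologist could be organized as a respondents-by-items matrix on the 1-to-5 scale, with the "unable to assess" option stored as missing. The names (peer_i, item_j) and shapes in this Python sketch are hypothetical:

```python
import numpy as np
import pandas as pd

# Hypothetical peer ratings for one radiologist: 8 peers x 38 items on a
# 5-point scale; NaN stands in for the "unable to assess" category.
rng = np.random.default_rng(1)
ratings = pd.DataFrame(
    rng.integers(1, 6, size=(8, 38)).astype(float),
    index=[f"peer_{i + 1}" for i in range(8)],
    columns=[f"item_{j + 1}" for j in range(38)],
)
ratings.iloc[0, 5] = np.nan        # one "unable to assess" response

item_means = ratings.mean()        # per-item mean; missing values are skipped
print(item_means.head())
```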

Results: Data from 190 radiologists were available. The mean numbers of respondents per physician were 6.6 of eight (1259 of 1520, 83%), 7.0 of eight (1337 of 1520, 88%), and 7.5 of eight (1420 of 1520, 93%) for peers, referring physicians, and co-workers, respectively. Internal consistency reliability was high, with a Cronbach alpha of more than 0.95 for every instrument. The generalizability coefficient analysis indicated that the peer, referring physician, and co-worker instruments achieved generalizability coefficients of 0.88, 0.79, and 0.87, respectively. The factor analysis indicated that four factors on the peer questionnaire accounted for 70% of the total variance: clinical competence, collegiality, professional development, and workplace behavior. For the referring physician survey, three factors accounted for 64.1% of the variance: professional development, professional consultation, and professional responsibility. Two factors on the co-worker questionnaire accounted for 63.2% of the total variance: professional responsibility and patient interaction.
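For readers unfamiliar with these statistics, the sketch below shows how the two headline figures might be computed from a respondents-by-items rating matrix: Cronbach's alpha as k/(k-1) * (1 - sum of item variances / variance of total score), and total variance explained by the leading components. The data are synthetic, and principal components are used as a rough stand-in for the study's factor analysis; none of this reproduces the authors' actual procedure.

```python
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = ratings.shape[1]
    item_vars = ratings.var(axis=0, ddof=1)       # variance of each item
    total_var = ratings.sum(axis=1).var(ddof=1)   # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def variance_explained(ratings: np.ndarray, n_factors: int) -> float:
    """Share of total variance carried by the top n_factors principal
    components (a crude proxy for the factor-analysis figures above)."""
    eigvals = np.linalg.eigvalsh(np.cov(ratings, rowvar=False))[::-1]
    return eigvals[:n_factors].sum() / eigvals.sum()

# Synthetic ratings: 190 respondents x 38 items on a 5-point scale, with a
# shared per-rater tendency so that items correlate (as real ratings would).
rng = np.random.default_rng(0)
base = rng.integers(2, 6, size=(190, 1))
ratings = np.clip(base + rng.integers(-1, 2, size=(190, 38)), 1, 5).astype(float)

print(f"alpha                         = {cronbach_alpha(ratings):.2f}")
print(f"variance explained, 4 factors = {variance_explained(ratings, 4):.2f}")
```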

Conclusion: Psychometric examination of the data suggests that the instruments developed to assess radiologists offer a feasible means of assessing radiology practice and provide evidence of validity and reliability.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Clinical Competence*
  • Communication
  • Factor Analysis, Statistical
  • Feasibility Studies
  • Humans
  • Peer Review, Health Care*
  • Physicians
  • Psychometrics
  • Radiology / standards*
  • Reproducibility of Results
  • Self-Assessment*
  • Surveys and Questionnaires