Refining assessment: Rasch analysis in health professional education and research
Keywords: Rasch analysis, assessment, measurement, learner ability, scale development
Educators want to assess learners using processes that provide valid measures of learner ability. An ideal assessment tool would include items appropriate for assessing the target attributes. Ideal assessment results would accurately differentiate learners across the spectrum of ability, determine which learners satisfied the required standard and enable comparison between learner cohorts (e.g., across different years). Similar considerations apply to researchers who are designing or revising methods used to gather other kinds of assessment data, such as participant responses to surveys or clinical measurements of performance. Analysing assessment scores using Rasch analysis provides information about the scores and the nature of each assessment item, and the analysis output guides refinement of the assessment. However, few health professional educators have published research that includes Rasch modelling methods. It may be that health professional educators find the language used to describe Rasch analysis somewhat impenetrable and that this has, to date, limited engagement with its applications. In this paper, we provide an overview of the potential benefits of Rasch analysis in health professional education and research.
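The Rasch analysis referred to above rests on a simple probabilistic model: in its dichotomous form, the probability that a person answers an item correctly depends only on the difference between the person's ability and the item's difficulty, both expressed on a common logit scale. The sketch below, written for illustration and not drawn from the paper itself, shows this relationship; the function name and example values are assumptions.

```python
import math

def rasch_probability(ability: float, difficulty: float) -> float:
    """Probability of a correct response under the dichotomous Rasch model.

    P(X = 1) = exp(theta - b) / (1 + exp(theta - b)),
    where theta is person ability and b is item difficulty (in logits).
    """
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# When ability exactly matches item difficulty, the model gives a 50% chance
# of success; a more able person facing the same item has a higher chance.
print(rasch_probability(0.0, 0.0))  # 0.5
print(rasch_probability(1.0, 0.0))  # ~0.73
```

Because only the difference (ability minus difficulty) enters the model, persons and items can be placed on the same interval scale, which is what allows Rasch output to differentiate learners and flag poorly functioning items.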
On acceptance for publication in FoHPE, copyright of the manuscript is signed over to ANZAHPE, the publisher of FoHPE.
Any reproduction of material published in FoHPE must have the express permission of the publisher.
Articles published in Focus on Health Professional Education (FoHPE) are available under a Creative Commons Attribution-NonCommercial-NoDerivatives licence (CC BY-NC-ND 4.0).