Adaptation of Direct Observation of Procedural Skills (DOPS) for assessments in podiatry
DOI: https://doi.org/10.11157/fohpe.v19i1.198

Keywords: Direct Observation of Procedural Skills, workplace-based assessment, podiatry

Abstract
Background: The Direct Observation of Procedural Skills (DOPS) is a workplace-based assessment tool widely used in medicine to assess a learner’s ability to execute a technical skill. The aim of this paper is to report on the development phase of the adaptation of the DOPS for the assessment of podiatry learners’ procedural skills. Podiatry learners are required to practise and demonstrate a variety of procedural skills in the management of foot complaints. Such skills include the use of scalpel blades, needles and local anaesthetic applied to a variety of disorders. The DOPS provides an avenue by which a learner’s procedural skills can be assessed and timely feedback provided in the workplace or in simulated environments.
Methods: The DOPS was initially adapted for podiatry by a faculty team consisting of a podiatry educator, a clinical education specialist and a clinical educator from another allied health discipline. The first iteration was circulated among podiatry faculty at three other Australian universities. The second iteration was reviewed by clinical supervisors from Southern Cross University (SCU). The third iteration was administered by two clinical supervisors at SCU working with 12 learners during real-time clinical events. Eleven learners used the DOPS to assess their peers during five real-time and six simulated learning events.
Results: A new tool, the Direct Observation of Procedural Skills in Podiatry (DOPS-P), has emerged from this process. Face and construct validity have been confirmed, and faculty and students consider that the DOPS-P contributes to learning.
Conclusions: Further research is necessary to confirm the validity and reliability of the DOPS-P to support assessment decisions about students’ achievement of podiatry competencies.
License
On acceptance for publication in FoHPE, the copyright of the manuscript is signed over to ANZAHPE, the publisher of FoHPE.