
Discussion Paper: Social accountability for students in a machine learning era

Logan Zane John Williams, Rebecca Grainger

Abstract


Over the last 30 years, there have been repeated calls to integrate health informatics into undergraduate health professional curricula, in recognition of the integral role computing plays in medicine. The rise of big data sets in health, and the application of advanced computer algorithms to interrogate them, has prompted yet another call for health professionals to receive appropriate training in these technologies.

Machine learning (ML) algorithms can learn tasks or make decisions without each specific behaviour being pre-programmed. High-impact literature has described ML approaches to clinical problems such as achieving more accurate and timely diagnoses, increasing the precision of prognosis and guiding treatment. Despite the promise of ML in healthcare, there are risks of adverse outcomes, unanticipated consequences, misuse and even abuse of ML technologies. For health professionals to advocate for patients and hold those developing ML algorithms in healthcare accountable, they must feel comfortable discussing the fundamental concepts and limitations of ML in healthcare.

Healthcare professionals are uniquely positioned to identify problems that could be solved by ML and related technologies. Yet there is inadequate coverage of ML, or of the wider field of health informatics, in most medical curricula. To prepare future health professionals to advocate for positive change and ensure that patients remain at the centre of ML applications in healthcare, we must equip them with an understanding of how ML will change healthcare delivery and the doctor–patient dynamic, as well as of the new ethical challenges that arise with the digital healthcare revolution.


Keywords


machine learning; social accountability; medical education; curriculum



DOI: https://doi.org/10.11157/fohpe.v21i1.363
