What I Learned About Education Research by Reviewing >250 Manuscripts on Education
Research in education is increasingly common and plays an important role in the development of our specialty. This growing interest is driven principally by two factors: 1) a desire to establish evidence-based best practices in education;1 and 2) an opportunity for clinician educators to produce scholarship as part of a promotion pathway. Despite advancements in technology, medical students and residents spend the majority of their time training to be doctors in roughly the same way as they have for the past 100 years.2 The importance of developing evidence to support new techniques and practices (especially when these new techniques are more time-intensive or expensive than traditional approaches) is self-evident.1 Unfortunately, the methods of many education researchers resemble stamp collecting more than hypothesis-driven science. Having read, reviewed, and edited no fewer than 250 manuscripts over the last decade, I offer the following ten suggestions to those interested in a career that includes research in education.
- Just like any other academic endeavor, mentorship is critical; indeed, it is the single most important factor for success.3-5
- Hypothesis-driven research that includes a power analysis and a clearly measurable outcome is a good place to start. While this may sound like basic advice to seasoned clinical researchers, many educational research projects are missing these fundamental requirements. I speculate that many educators develop a novel (and often enormously creative) program and, after its successful implementation, someone tells them, “This is great – you should write this up!” Unfortunately, this is not an effective way to conduct research, and what results is not hypothesis-driven research but an educational case report describing a novel program. Instead, work with a mentor to develop a hypothesis and a strategy to test it in a meaningful and measurable way; a minimal power-calculation sketch follows this list.
- Work with a statistician who understands psychometrics to plan your study and analyze your results. A surprisingly common misconception among inexperienced educational researchers is that consultation with a statistician can wait until the study is complete and the analysis is about to begin. Waiting that long is like asking an architect to design your dream home after construction has begun. A good statistician will help you design your study so that you have the greatest possible chance of effectively testing your hypothesis. A statistician who is facile with psychometric analysis may be particularly helpful, as psychometricians specialize in the objective measurement of skills, knowledge, abilities, attitudes, and educational achievement.
- In order to have meaningful results, you may need to conduct a multicenter trial. After developing a hypothesis and a prospective study to test it in a meaningful and measurable way, many education researchers perform a power analysis only to realize that completing the study would require enrolling more residents or students than they have access to. This is understandably common, as most residency programs are relatively small – few programs have more than 60 (CA1-CA3) residents – which significantly limits the kind of trial that can be completed at a single site. When this happens, I would encourage the reader to consider a multicenter trial. Meetings of the Society for Education in Anesthesiology (SEA) are a terrific place to meet like-minded educators who might be interested in participating in your study. Working with a second (or third) center is never easy, but novel trial designs such as a cluster randomized trial or stepped-wedge trial may help overcome certain challenges; a brief cluster-randomization sketch follows this list.
- Learner satisfaction should not be the primary outcome of your study. When pressed to develop an outcome that is meaningful and measurable, many education researchers fall back on “learner satisfaction” as a primary outcome. Superficially, this makes sense: we all want our learners to be satisfied and happy with their educational experiences. Unfortunately, learner satisfaction – especially if evaluated immediately following the educational intervention – is often inversely associated with efficacy.6,7 Collecting data on learner satisfaction is important, but it should not be the primary outcome.
- Think twice before conducting a survey study (but, if you must, make sure the questions have established validity and the results will be highly relevant to the specialty, with enduring value). The ready availability of online survey tools such as SurveyMonkey (www.surveymonkey.com) has vastly simplified the mechanics of conducting a large survey. Nevertheless, most journals are extremely selective about the number of manuscripts they publish that report the results of a survey. In general terms, a survey must be highly relevant to the specialty, use questions with established validity, employ appropriate analyses, and enroll enough participants to be generalizable. Further, there must be a sense that the results of the survey will be of enduring value to the readership. For perspective, the Medical Education section of Anesthesia & Analgesia has received 18 submissions since 2016 that report the results of a survey; of these, 16 were rejected and 2 were accepted (an acceptance rate of roughly 11%).
- Don't forget about using large databases to measure outcomes. Large databases such as ASPIRE and MPOG can passively track the impact of an educational intervention on clinically relevant outcomes,8 and those outcomes can then be analyzed with segmented regression, difference-in-difference, or like techniques.9 A brief segmented-regression sketch follows this list.
- Think long term. Just as in clinical research, we value long-term outcomes more than short-term outcomes.10 While demonstrating improved performance immediately following an educational intervention is admirable, demonstrating sustained improvement weeks to months (or years) later is more meaningful.11
- Remember that all core competencies are important. Most research in education focuses on improving the medical knowledge of students, residents, or practitioners; however, there is more to being a physician than the accumulation of medical knowledge. To succeed as an anesthesiologist, one must possess both the medical knowledge and the humanistic desire to help and communicate with others: to have one without the other is to be incomplete.12 Research into these other areas of training is critically important yet, with a few notable exceptions, under-investigated.13
- Team science can work for education. Several departments have created education research laboratories (eLabs) that bring together interested individuals who can present work in progress, conceive new projects, discuss relevant literature, and cultivate and sustain a community of educational scholars.14
Obviously, with the exception of mentorship, not every suggestion above will apply to every study.
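To make the power-analysis suggestion concrete, below is a minimal sketch of a sample-size calculation for a hypothetical two-arm study comparing mean post-test scores. It assumes a two-sided t test, an anticipated medium effect size (Cohen's d = 0.5), a significance level of 0.05, and 80% power; all of these planning values are illustrative assumptions, not recommendations.

```python
# Minimal sample-size sketch for a hypothetical two-arm educational study.
# Planning values (effect size, alpha, power) are illustrative assumptions.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.5,          # anticipated Cohen's d for the post-test score
    alpha=0.05,               # two-sided significance level
    power=0.80,               # desired power
    alternative="two-sided",
)
print(f"Residents required per arm: {n_per_group:.0f}")  # approximately 64
```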
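For the multicenter suggestion, the sketch below builds on the calculation above and illustrates one standard way to inflate a per-arm sample size for a cluster randomized design using the design effect 1 + (m - 1) x ICC. The cluster size and intraclass correlation shown are hypothetical and would need to be justified for a real trial.

```python
# Illustrative design-effect calculation for a cluster randomized multicenter
# trial; the cluster size and intraclass correlation are hypothetical.
n_individual = 64          # per-arm sample size under individual randomization
m = 20                     # assumed residents enrolled per program (cluster)
icc = 0.05                 # assumed intraclass correlation coefficient

design_effect = 1 + (m - 1) * icc        # 1.95 with these assumptions
n_cluster_arm = n_individual * design_effect
programs_per_arm = n_cluster_arm / m

print(f"Per-arm sample size: {n_cluster_arm:.0f} residents "
      f"(about {programs_per_arm:.1f} programs per arm)")
```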
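Finally, for database-derived outcomes, the sketch below shows the general shape of a segmented regression (interrupted time series) model of the kind described by Mascha and Sessler.9 The data are simulated, and the variable names and effect sizes are assumptions chosen only for illustration.

```python
# Segmented-regression (interrupted time series) sketch on simulated monthly
# data; variable names and effect sizes are assumptions for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
month = np.arange(24)                        # 12 months before, 12 after
post = (month >= 12).astype(int)             # 1 after the educational intervention
time_since = np.where(post == 1, month - 12, 0)

# Simulated compliance rate (%) with a baseline trend, a level jump at the
# intervention, and a modest change in slope afterward.
compliance = (70 + 0.2 * month + 8 * post + 0.3 * time_since
              + rng.normal(0, 2, month.size))
df = pd.DataFrame({"month": month, "post": post,
                   "time_since": time_since, "compliance": compliance})

# Coefficients estimate the baseline slope (month), the immediate level
# change (post), and the change in slope after the intervention (time_since).
model = smf.ols("compliance ~ month + post + time_since", data=df).fit()
print(model.params.round(2))
```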
References
1. Martinelli SM, Isaak RS, Schell RM, Mitchell JD, McEvoy MD, Chen F. Learners and Luddites in the Twenty-first Century: Bringing Evidence-based Education to Anesthesiology. Anesthesiology 2019; 131: 908-928.
2. Nemergut EC. Then & Now: Education in Anesthesia. Anesthesia & Analgesia 2012; 114: 5-6.
3. Chopra V, Arora VM, Saint S. Will You Be My Mentor? Four Archetypes to Help Mentees Succeed in Academic Medicine. JAMA Intern Med 2018; 178: 175-176.
4. Beech BM, Calles-Escandon J, Hairston KG, Langdon SE, Latham-Sadler BA, Bell RA. Mentoring programs for under-represented minority faculty in academic medical centers: a systematic review of the literature. Academic Medicine 2013; 88: 541-549.
5. Nafiu OO, Haydar B. Mentoring Programs in Academic Anesthesiology: A Case for PROFOUND Mentoring for Underrepresented Minority Faculty. Anesthesia & Analgesia 2019; 129: 316-320.
6. Kornell N, Hausman H. Do the Best Teachers Get the Best Ratings? Frontiers in Psychology 2016; 7: 570.
7. Bjork EL, Bjork RA. Making things hard on yourself, but in a good way: Creating desirable difficulties to enhance learning. In: Gernsbacher MA, Pomerantz J, eds. Psychology and the real world: Essays illustrating fundamental contributions to society. 2nd ed. New York: Worth; 2014: 59-68.
8. Forkin KT, Chiao SS, Naik BI, Patrie JT, Durieux ME, Nemergut EC. Individualized Quality Data Feedback Improves Anesthesiology Residents’ Documentation of Depth of Neuromuscular Blockade Prior to Extubation. Anesthesia & Analgesia (in press).
9. Mascha EJ, Sessler DI. Segmented Regression and Difference-in-Difference Methods: Assessing the Impact of Systemic Changes in Health Care. Anesthesia & Analgesia 2019; 129: 618-633.
10. Sessler DI. Long-term consequences of anesthetic management. Anesthesiology 2009; 111: 1-4.
11. Kleiman AM, Forkin KT, Bechtel AJ, Collins SR, Nemergut EC, Huffmyer JL. Generative Retrieval Leads to Increased Learning and Retention of Cardiac Anatomy Using Transesophageal Echocardiography. Anesthesia & Analgesia 2017; 124: 1440-1444.
12. Huffmyer JL, Kirk SE. Professionalism: The "Forgotten" Core Competency. Anesthesia & Analgesia 2017; 125: 378-379.
13. Mitchell JD, Ku C, Diachun CAB, DiLorenzo A, Lee DE, Karan S, Wong V, Schell RM, Brzezinski M, Jones SB. Enhancing Feedback on Professionalism and Communication Skills in Anesthesia Residency Programs. Anesthesia & Analgesia 2017; 125: 620-631.
14. Schwengel DA, Toy S. Innovation in Education Research: Creation of an Education Research Core. Anesthesia & Analgesia 2019; 129: 520-525.