What Can the Legal Profession Learn from the Medical Profession About the Next Steps?

University of St. Thomas Law Journal, Jun 2018

By Dr. Eric Holmboe* and Dr. Robert Englander**, Published on 06/05/18



This presentation summary from the February 17, 2017, Symposium is a synthesis of Dr. Eric Holmboe's and Dr. Robert Englander's combined thirty years of experience with both medical education's movement toward competency-based education and the most useful lessons learned by medical educators applicable to legal education's current move toward competency-based education.

Medical education is in the midst of major change and transformation.1 While there are a number of factors contributing to this change, a primary driver has been the recognition that the quality, safety, and costs of care in the United States healthcare system are problematic. Early signals and evidence of problems in quality and patient safety began to surface in the 1960s and 1970s. Archibald Cochrane in the United Kingdom focused on issues of effectiveness and efficiency,2 while Jack Wennberg first detected unwarranted variations in practice among physicians in Vermont.3 Robert Brook, of UCLA, and others began to recognize the pernicious issues of poor quality, suboptimal outcomes, and medical errors.4 Finally, in the late 1970s the World Health Organization published an important white paper arguing that medical education around the globe should adopt a competency- and mastery-based approach to produce proficient healthcare professionals who could better meet the needs of populations in a local context.5 They noted, "The intended output of a competency-based programme is a health professional who can practise medicine at a defined level of proficiency, in accord with local conditions, to meet local needs."6

Concerning signals about quality and safety continued to percolate throughout the healthcare system and medical education over the next twenty years, culminating in the release of two seminal reports from the Institute of Medicine (IOM): To Err is Human (2000)7 and Crossing the Quality Chasm (2001).8 These two reports demonstrated the magnitude and seriousness of poor quality of care, medical errors, and inadequate patient safety. To Err is Human estimated that 98,000 people per year were dying from medical errors.9 This statistic was very controversial at the time, and sadly it was likely an underestimate.
A recent study published in 2016 suggested that medical errors could be the third leading cause of death in the United States.10 Paralleling the IOM reports, the medical education community began to recognize that, as useful as the structured Flexnerian model of medical education had been, it was no longer sufficiently preparing students and residents to meet the challenges of a changing and dynamic healthcare system. In response, medical educators began the journey to an outcomes-based education system, using competency frameworks developed in the late 1990s.11 In 1999, the Accreditation Council for Graduate Medical Education (ACGME), together with the American Board of Medical Specialties (ABMS), approved a framework of six general competencies: patient care, medical knowledge, professionalism, interpersonal and communication skills, practice-based learning and improvement, and systems-based practice. It is important to note that the ACGME and ABMS comprise the two predominant organizations of physician self-regulation: the ACGME accredits training programs, and the twenty-four specialty boards of the ABMS certify individual physicians using the general competency framework.

The Outcomes Project was formally launched in 2001 by the ACGME to enable the implementation of the six general competencies in all United States post-graduate medical specialty training programs.12 Two of these competencies presented new concepts that have been challenging to implement, requiring further explanation. Practice-based learning and improvement (PBLI) focuses on self-directed learning and improvement, the use of performance data to drive improvements in quality and safety, and the effective application of evidence-based practice. Both the self-directed learning and assessment component and the evidence-based practice component of PBLI recognize that students and residents will not come out knowing everything. Systems-based practice is another critical new competency; it emphasizes that all physicians, including students and residents, work and learn in healthcare systems with other healthcare professionals and in inter-professional teams. They are required to navigate the system to benefit their individual and collective patients, to identify and correct system errors, and to coordinate care across a diversified set of resources and healthcare professionals.

As a result of these changes, the medical education community recognized the need to re-conceptualize the approach to educational design. The early models emphasized building a curriculum, usually through expert consensus, that focused on linking educational objectives to meaningful assessment. Furthermore, medical educators realized that the focus of assessments was predominantly medical knowledge and failed to address the other competencies vital to clinical practice. In essence, medical education's approach to assessment was "if you're really smart cognitively, you'll be fine." The medical education community eventually realized that this overemphasis on cognitive skills is insufficient to meet patient and population needs. As highlighted by Frenk and colleagues, the medical education enterprise must start with the health and healthcare needs of the systems and the population served.13 The critical competencies (i.e., individual physician abilities) flow from those needs and must align with both clinical and educational outcomes.
The general competencies are designed to meet the health and healthcare needs of systems and populations, and, it is hoped, to reduce errors, improve quality, and ensure patient safety. Curriculum and assessment should follow from the desired outcomes, in direct contrast to the old model in which curriculum and assessment drove the outcomes. Discomfort still abounds around assessment, but any good educational program must have a robust, multifaceted assessment program. It is also essential that curriculum and assessment are integrated; assessment drives learning (curriculum), but the learning should also drive the choice of an appropriate assessment.

In 2002, Carol Carraccio and colleagues pushed the medical education community to transition from a structure/process and time-based model and to think critically about what it is learners need to be able to do (i.e., outcomes) at each stage of their career.14 Medical educators began to ask what continuous professional development needs to look like for those coming out of residency and fellowship. Medical education needs to get out of what many call the tea-steeping model of competence. The so-called tea-steeping method refers to placing a tea bag in a cup of hot water for just the right amount of time so as to brew a good cup of tea.15 A dwell-time model is no longer adequate for health professions education. Medical educators and the public need to know what a student and resident can actually do. This focus on outcomes is the paradigm shift medicine is trying to realize. For example, the typical training program for a general internist consists of thirty-six months of training, of which twenty-four months must be in direct patient care, one month in the intensive care unit, one month in the emergency department, and so forth. If a resident's faculty evaluations were acceptable, and no critical incidents occurred, the resident was deemed by a program director to have successfully completed the program. In an outcomes-based world, the right questions are: what must a student or resident demonstrate before he or she leaves? What are those abilities (competencies), comprised of integrated knowledge, skills, and attitudes, that are necessary to do the actual work?

In addition to a focus on outcomes rather than structure and process, competency-based education shifts the driving force from the teacher to the learner. While structure- and process-based education focuses on the teacher disseminating information and being the "sage on the stage," with competency-based education the responsibility to achieve the learning outcome becomes a partnership, with the teacher and the learner working together as co-educators. The assessment construct also shifts significantly in moving from a structure/process to a competency-based model. In the time-based education model, there was, and still is, an overreliance on tests, generally written and multiple choice in nature. In a competency-based model, assessments are more "in the trenches" and require direct observation of the learner's performance when caring for actual patients. The learner may also keep a log of what he or she does and then reflect on those experiences and assessment data. This allows the learner to become more self-directed. The timing of assessments will be ongoing and continuous. Formative assessment, assessment that drives future learning forward and helps catalyze the learner, predominates in a competency-based assessment program.
Applying this logic to a potential legal education example such as Moot Court produces the following questions. First and foremost, prior to assessing learner performance, it is essential to ensure the learner understands the desired outcome of the Moot Court experience. From an assessment perspective, if a student is participating in Moot Court, does a structured debrief occur with formative feedback to help the learner improve? How does the student break down their actual performance in order to improve at the next competition? Does the student understand how to make their legal argument with more skill?16

In addition to an emphasis on direct observation, formative assessment, and multiple assessment methods, competency-based education shifts the assessment model from norm-referenced (i.e., comparative) to criterion-referenced. In the former model, faculty working with learners in the clinical setting would mostly compare a learner's performance against either themselves ("how I would do it") or other learners they have encountered. What the faculty should be asking is whether the learner demonstrates the desired outcome. For example, can the learner actually do a procedure? Can she actually break bad news to a patient effectively? Can she manage patients with chronic diseases like diabetes, heart failure, and hypertension all at the same time? Thus, the comparison is not to one's peer group, but rather to a set of pre-determined outcomes judged by the discipline to be important to the healthcare system, patients, and populations.

Two additional points merit emphasis with respect to learning outcomes. First, the desired outcomes or competencies needed to meet the needs of the public will change over time. As an example, two fields are changing dramatically: radiology and pathology. Most people in these fields have been trained to interpret an image, such as a digital radiologic image on a computer screen, or to read a microscopic slide. However, these activities are now increasingly being performed by artificial intelligence through machine learning.17 Some software programs are now able to read millions of films in a very short period of time, often at a much lower cost.18 The actual work of these specialties is beginning to shift. Radiologists and pathologists are becoming information specialists, shifting from image interpretation to interpretation of the findings in the context of each patient's specific needs. This will likely require these specialists to spend more time interacting and consulting with patients and other healthcare professionals, thereby elevating the importance of competency in communication and interprofessional teamwork. This type of change requires revisiting the competencies needed for these specialties, and then revisiting the performance levels and standards, the assessment framework, and the curriculum on an ongoing basis.

Second, for too long medical educators have treated the attainment of competence as a ballistic function, such as a rocket. Too often, the goal has been to ensure the launch angle of the learner is aligned with, at a minimum, competence at graduation, and to get the learner high enough into orbit that they retire before they burn up on reentry. Clearly this is not an optimal model.
The goal should be to create a professional developmental trajectory in which the graduate's learning continues over their career. The focus on general competencies in medicine has helped to create a shared mental model around the core abilities needed by physicians for twenty-first century practice. The next stage of evolution in the thinking of the medical education community, after defining the core competencies, was to develop a model of how the learner should proceed through a series of developmental stages in each competency. The resultant strategy was to adjust curriculum and assessment to facilitate that developmental progression. Medical educators realized learners do not progress in a straight line. The new paradigm focuses on what advances a student from being a novice to competent, then proficient, and then perhaps toward mastery over the course of a career. The medical education community recognized the need to build a narrative mental model of how development through stages occurs. This led to the creation of Milestones.19 Milestones embody a simple concept: each one represents a significant point in a learner's development, described using narrative and key terms. We used the developmental framework for expertise created by the Dreyfus brothers.20 Milestones describe what a trajectory should look like so that learners can track their own progress toward an outcome, and they help programs recognize advanced students or those who need extra help. In the end, educators want to ensure learners are developing an individualized learning plan as part of the professional developmental process. Each specialty has its own set of Milestones describing the six general competencies in terms pertinent to that specialty.21 These specialty-specific Milestones now serve as a mental model to guide training. The importance of a shared mental model cannot be overemphasized, as it is a cornerstone to implementing (i.e., operationalizing) a competency-based approach.

The general architecture starts with one of the general competencies, which is further described by a limited set of sub-competencies. The sub-competencies are described using behavioral narratives for each of the five Dreyfus levels: novice, advanced beginner, competent, proficient, and expert.22 Program directors do not expect most of their residents to achieve the level of expert at the time of graduation. The recommended target is proficiency or, for most specialties, level four on the Milestones rubric. The Milestones give residents a framework that helps to define what the training phase of their career should be. Describing expertise or higher-order skills at levels four and five gives resident learners target goals for early practice. The first set of Milestones was published in 2009 by the internal medicine community after two years of effort, with all specialties beginning work in 2010. By 2013, seven specialties were reporting Milestones data twice a year to the ACGME.
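
The architecture just described (a general competency, a limited set of sub-competencies, and a narrative for each of the five Dreyfus levels, with level four as the usual graduation target) lends itself to a simple structured representation. The following Python sketch is purely illustrative and is not drawn from the ACGME Milestones themselves; the sub-competency labels, narratives, and ratings are hypothetical placeholders invented for this example.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# The five Dreyfus stages used in the Milestones rubric, indexed 1 through 5.
DREYFUS_LEVELS = {1: "novice", 2: "advanced beginner", 3: "competent",
                  4: "proficient", 5: "expert"}
GRADUATION_TARGET = 4  # recommended level at graduation for most specialties

@dataclass
class SubCompetency:
    name: str                   # hypothetical label, e.g. "PC-1"
    narratives: Dict[int, str]  # Dreyfus level (1-5) -> behavioral narrative

@dataclass
class Competency:
    name: str                                  # one of the six general competencies
    sub_competencies: List[SubCompetency] = field(default_factory=list)

# A toy slice of the structure (content invented for illustration only).
patient_care = Competency(
    name="patient care",
    sub_competencies=[
        SubCompetency(
            name="PC-1 (hypothetical)",
            narratives={1: "requires direct supervision for routine care",
                        4: "manages complex patients with minimal guidance"},
        )
    ],
)

def at_graduation_target(ratings: Dict[str, int]) -> bool:
    """True when every rated sub-competency is at or above the target level."""
    return all(level >= GRADUATION_TARGET for level in ratings.values())

# Hypothetical ratings a committee might assign to one resident.
ratings = {"PC-1": 4, "PC-2": 3, "MK-1": 4}
print(at_graduation_target(ratings))  # False: PC-2 is still at "competent"
```

The point of the sketch is only that the Milestones pair a qualitative narrative for each stage with a quantitative level that can be tracked over time, which is what makes the trajectory analyses described below possible.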
We now collect information on 133,000 trainees spread across 10,000 programs every six months, which we use in collaboration with the educational community to promote continuous quality improvement in medical education.23 Analysis of this information has led to important new discoveries. For example, we have developed a model based on learning curves (Figure 1).24 Law students likely feel that the steep part of this figure describes how their life looks over the three years of law school. We have used the Milestones to guide this journey. The Milestones help to integrate curriculum and assessment activities to ensure medical educators are assessing and teaching these critical competencies. Learners move at different rates. For example, look at Figure 1.25 Student A is doing fine. That student will still have ups and downs, as education is not a simple linear process. Conversely, learner B is having difficulty and needs an intervention.26 The Milestones allow educators to identify and measure when a learner is off trajectory.27 Educators can step in and say a learner is struggling, whether it be in the area of professionalism, communication skills, knowledge, and so on.

The graduate medical education system also now uses a group process, called the Clinical Competency Committee (CCC), to make developmental judgments using the Milestones framework.28 CCCs meet to review all the assessment data and determine where the learner is developmentally. The Milestones judgments are then fed back to the learner, who in turn uses that information to generate an individualized learning plan. The goal is for the learner to understand their trajectory toward later stages of development. This information is also used to improve the overall program. We may learn that we have a group of residents not doing well in a particular area, which could indicate a curricular problem. In fact, our research has already revealed such problems in certain areas.29

This data allows for deeper analysis. For example, Figure 2 shows data from OB/GYN.30 We can now look at every single resident in the country, look at distributions, and determine where a particular resident is in their development relative to the population. For example, one Milestone is called "patient care five," which entails stabilization of the newborn.31 The residents' developmental distributions reveal a tremendous spread.32 This data enables a re-visitation of the national curriculum in obstetrics and gynecology training to ensure this competency is being effectively taught and assessed. Without the comprehensive Milestone data, it would not have been possible to spot a possible national curricular weakness.

Distribution curves of educational outcomes also prove useful. For example, Figure 3 shows neurological surgery.33 Just fifty-four percent of graduating neurosurgery residents in 2014 (who had spent seven years learning their craft) had met level four in all eight domains of neurosurgery.34 At first this may look very concerning, but a deeper look into the data revealed very helpful lessons that have guided the community as it works collectively to advance the discipline of neurosurgery. First, residents in neurosurgery training programs have variable access to various types of patients and neurosurgical procedures depending on such factors as region of the country, size of institution, and so forth.
Second, it is perfectly logical that not every resident is going to reach level four proficiency in all neurosurgical procedures during a residency.35 Residents graduate as "unique packages" based on where they train and their own individual interests within the field of neurosurgery.36 In fact, it is actually good news that neurosurgery programs have a better sense of what those packages look like. This information can guide programs in developing curricula and assessments and provide more specific career coaching for learners.

Having developed a framework and a mental model with Milestones, the logical progression shifted the focus to how this information can be used to improve assessment. The Milestones created a conversation around a shared mental model of competence; the next step was to determine how educators could build on the Milestones to decide when a learner no longer needs supervision and can be trusted to do the professional work on their own. Medical educators needed an integrative construct to support supervision and progression decisions based on the work, or activities, of the specialty. Exciting work originally developed in the Netherlands by Olle ten Cate, called the entrustable professional activity (EPA), is also helping to transform medical education.37

What is an EPA? Olle ten Cate formally defined an EPA as a unit of professional practice that can be entrusted to a sufficiently competent learner or professional.38 An EPA requires the integration of several competencies; for example, a general internist, in order to provide high-quality diabetic care, needs abilities in all of the general competencies, not just knowledge. Such a learner must work in an interprofessional team, coordinate numerous tests and referrals for the patient, and demonstrate professionalism in their work. Entrustment means that a resident learner is ready to care for these patients without supervision. This entrustment decision is significant because when the learner is entrusted, they assume full responsibility for caring for people and no longer enjoy a safety net.

There are some important distinctions between competencies and EPAs. With a competency, the unit of assessment is the ability of an individual on that competency. With an EPA, the unit of assessment is the activity itself. EPAs are always embedded in the clinical context. Competencies tend to be more context-independent, granular, and specific; they help faculty and learners understand the parts of a clinical activity. Think of competencies as a type of telephoto lens. Educators can zoom in with competencies, which proves very helpful with a struggling learner, by defining the critical components of a clinical activity. Competencies can help determine why the learner may be struggling. Is it a knowledge issue? Is it a communication issue? EPAs, on the other hand, allow for the integration of the competencies, and they are more holistic. EPAs can also be very important with respect to professional identity formation, because so much of professional identity is determined by the activities the learner does and how well the learner performs those activities.
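
Because an EPA integrates several competencies, an entrustment decision can be thought of, in its simplest form, as a conjunction over the milestone levels a learner has demonstrated on each competency the activity requires. The sketch below is only an illustration of that idea under assumed inputs: the EPA name, the required competencies, and the thresholds are hypothetical, and real entrustment decisions are made by Clinical Competency Committees weighing far richer evidence than a table of numbers.

```python
from typing import Dict, List

# Hypothetical EPA: caring for patients with uncomplicated diabetes without supervision.
# Maps each required general competency to an assumed minimum milestone level.
EPA_DIABETES_CARE: Dict[str, int] = {
    "patient care": 4,
    "medical knowledge": 4,
    "interpersonal and communication skills": 4,
    "systems-based practice": 3,
    "professionalism": 4,
}

def entrustable(levels: Dict[str, int], requirements: Dict[str, int]) -> bool:
    """Entrust the activity only when every required competency meets its threshold."""
    return all(levels.get(comp, 0) >= minimum for comp, minimum in requirements.items())

def gaps(levels: Dict[str, int], requirements: Dict[str, int]) -> List[str]:
    """Competencies still below threshold; useful input to an individualized learning plan."""
    return [comp for comp, minimum in requirements.items()
            if levels.get(comp, 0) < minimum]

# A hypothetical resident's current milestone levels by competency.
resident = {"patient care": 4, "medical knowledge": 4, "professionalism": 4,
            "interpersonal and communication skills": 3, "systems-based practice": 3}
print(entrustable(resident, EPA_DIABETES_CARE))  # False
print(gaps(resident, EPA_DIABETES_CARE))         # ['interpersonal and communication skills']
```

The same all-domains logic underlies the neurosurgery observation above: the fifty-four percent figure is simply the share of graduates whose ratings reached level four in every one of the eight domains.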
For a medical student, entrustment refers to the ability to effectively and safely perform a professional activity without direct supervision; for a resident, this ultimately means without supervision of any kind. The ultimate decision whether to entrust learners keeps the authors of this text up at night. Many times, the authors would leave the hospital in the evening with a resident on call without direct supervision. The key question was always whether the resident could be entrusted to perform a good medical history and physical examination and to make good treatment decisions. More importantly, the authors asked themselves whether they could trust the resident to know when to call them, correctly identifying when he or she really did not know what was going on, or when a patient was deteriorating. Thus, trusting learners to be honest and trustworthy was a major aspect of the entrustment decision.

The medical education community also recognized that EPAs make intuitive sense to faculty because entrustment is the kind of decision faculty routinely make. When a faculty member is working with a learner during clinical rotations, the faculty member can add trust to the conversation as an explicit criterion. The EPA concept forces the faculty member to consider whether the learner is truly ready to be trusted with additional responsibility. Faculty want to base that decision on a mental model of outcomes and good assessment data.

Englander and colleagues highlight how EPAs relate to the domains of competence, their competencies, and their milestones.39 The Milestones provide the narrative description of the competencies and lay out the trajectory of an EPA. Taken together, EPAs, competencies, and Milestones can tell the story of what an early learner looks like and what a learner should look like when they graduate from medical school and are entrusted to move on to residency. There are thirteen core EPAs for medical school as preparation for any residency.40 These EPAs are now being piloted in a series of medical schools across the country.41

In the end, the authors recommend that legal education explore the use of competencies, Milestones, and EPAs as part of the educational process. Milestones and EPAs are critical for assessment. We are beginning to pull these two concepts together, and we can now describe in rich narrative language what it looks like to develop into a professional in internal medicine, surgery, radiology, or whatever specialty a learner may pursue. This journey has been a real opportunity to rethink what it means to be a physician. It has been incredibly helpful because of the many conversations within the profession; perhaps the most important aspect of the Milestones was the iterative conversations in the specialty communities. The Milestone and EPA initiatives brought the specialty and medical school communities together to talk about what they expected a learner to be able to do at the end of their training. Finally, measurement must be built into the process from the very beginning. Measurement is important because some initiatives will not work. These authors implore educators to recognize when initiatives are failing, get rid of them, and try something else. This will be important as we continually strive to learn, revise, adapt, and improve medical education.
Notes

* Dr. Eric Holmboe, a general internist, is currently Senior Vice President for Milestones Development and Evaluation at the Accreditation Council for Graduate Medical Education.
** Dr. Robert Englander is Associate Dean for Undergraduate Medical Education and Professor of Pediatrics at the University of Minnesota Medical School. He has been actively engaged in the paradigm shift to competency-based medical education at the UME and GME levels.

1. See Carol Carraccio et al., Shifting Paradigms: From Flexner to Competencies, 77 ACAD. MED. 361 (2002).
2. See generally A.L. COCHRANE, EFFECTIVENESS & EFFICIENCY (1972).
3. See generally John E. Wennberg, Forty Years of Unwarranted Variation—and Still Counting, 114 HEALTH POL'Y 1 (2014).
4. See generally Robert H. Brook et al., Assessing the Quality of Medical Care Using Outcome Measures: An Overview of the Method, 15 MED. CARE, Sept. 1977, at i.
5. William C. McGaghie et al., Competency-Based Curriculum Development in Medical Education, 68 PUB. HEALTH PAPERS 1 (1978).
6. Id. at 18.
7. INST. OF MED., TO ERR IS HUMAN (Linda T. Kohn et al. eds., 2000) [hereinafter TO ERR IS HUMAN].
8. INST. OF MED., CROSSING THE QUALITY CHASM (2001).
9. TO ERR IS HUMAN, supra note 7, at 26.
10. Martin A. Makary & Michael Daniel, Medical Error—the Third Leading Cause of Death in the US, BMJ, May 2016, at 1.
11. Victor R. Neufeld et al., Educating Future Physicians for Ontario, 73 ACAD. MED. 1133 (1998); Paul Batalden et al., General Competencies and Accreditation in Graduate Medical Education, 21 HEALTH AFF. 103 (2002).
12. See Susan R. Swing, The ACGME Outcome Project: Retrospective and Prospective, 29 MED. TCHR. 648, 648 (2007); see also Carraccio et al., supra note 1.
13. See Julio Frenk et al., Health Professionals for a New Century, 376 LANCET 1923, 1950 (2010).
14. Carraccio et al., supra note 1.
15. See Brian David Hodges, A Tea-Steeping or i-Doc Model for Medical Education?, 85 ACAD. MED., Sept. Supp. 2010, at S34.
16. ANDERS ERICSSON & ROBERT POOL, PEAK: SECRETS FROM THE NEW SCIENCE OF EXPERTISE (2016).
17. See generally ERIK BRYNJOLFSSON & ANDREW MCAFEE, THE SECOND MACHINE AGE (2014).
18. Saurabh Jha & Eric J. Topol, Adapting to Artificial Intelligence: Radiologists and Pathologists as Information Specialists, 316 JAMA 2353 (2016).
19. ERIC S. HOLMBOE ET AL., THE MILESTONES GUIDEBOOK (2016), https://www.acgme.org/Portals/0/MilestonesGuidebook.pdf [hereinafter MILESTONES].
20. See Batalden et al., supra note 11.
21. Milestones by Specialty, ACCREDITATION COUNCIL FOR GRADUATE MED. EDUC., http://www.acgme.org/What-We-Do/Accreditation/Milestones/Milestones-by-Specialty.
22. MILESTONES, supra note 19, at 11; see also Batalden et al., supra note 11, at 106 (explaining the five Dreyfus levels).
23. STANLEY J. HAMSTRA ET AL., ACCREDITATION COUNCIL FOR GRADUATE MED. EDUC., MILESTONES ANNUAL REPORT 2017 (Oct. 2017), http://www.acgme.org/Portals/0/PDFs/Milestones/MilestonesAnnualReport2017.pdf?ver=2018-02-09-074057-013; Eric S. Holmboe et al., Reflections on the First 2 Years of Milestone Implementation, 7 J. GRADUATE MED. EDUC. 506, 506 (2015).
24. Martin V. Pusic et al., Learning Curves in Health Professions Education, 90 ACAD. MED. 1034, 1040 fig.6 (2015).
25. Eric Holmboe, Address at the University of St. Thomas School of Law Journal Symposium (Feb. 17, 2017) (using Figure 1 in PowerPoint Presentation) (on file with author).
26. Id.
27. Eric S. Holmboe et al., Milestones and Competency-Based Medical Education in Internal Medicine, 176 JAMA INTERNAL MED. 1601 (2016).
28. KATHRYN ANDOLSEK ET AL., ACCREDITATION COUNCIL FOR GRADUATE MED. EDUC., CLINICAL COMPETENCY COMMITTEES: A GUIDEBOOK FOR PROGRAMS (2d ed. 2017), https://www.acgme.org/Portals/0/ACGMEClinicalCompetencyCommitteeGuidebook.pdf.
29. Holmboe et al., supra note 27, at 1601-02.
30. Eric Holmboe, Address at the University of St. Thomas School of Law Journal Symposium (Feb. 17, 2017) (using Figure 2 in PowerPoint Presentation) (on file with author).
31. Id.
32. Id.
33. Eric Holmboe, Address at the University of St. Thomas School of Law Journal Symposium (Feb. 17, 2017) (using Figure 3 in PowerPoint Presentation) (on file with author).
34. Id.
35. Id.
36. Id.
37. Olle ten Cate & Fedde Scheele, Competency-Based Postgraduate Training: Can We Bridge the Gap Between Theory and Clinical Practice?, 82 ACAD. MED. 542 (2007).
38. Id.
39. Robert Englander et al., Toward a Common Taxonomy of Competency Domains for the Health Professions and Competencies for Physicians, 88 ACAD. MED. 1088 (2013).
40. The Core Entrustable Professional Activities (EPAs) for Entering Residency, ASS'N AM. MED. C., https://www.aamc.org/initiatives/coreepas/ (last visited Jan. 28, 2018).
41. See Kimberly Lomis et al., Implementing an Entrustable Professional Activities Framework in Undergraduate Medical Education: Early Lessons From the AAMC Core Entrustable Professional Activities for Entering Residency Pilot, 92 ACAD. MED. 765 (2017).

