How has med school evolved over the last century—and where will it go next?
Let’s go back, shall we? Way back—all the way to 1910.
At the turn of the 20th century, there was no standardization for medical schools in North America. Institutions and organizations that claimed to train physicians kinda just did whatever they wanted. The result: wildly different programs that produced doctors with wildly different understandings of how medicine was supposed to be practiced.
So, in 1910, the American educator Abraham Flexner was hired to visit every one of the 155 med schools operating in Canada and the United States. He summarized his findings in the landmark Flexner Report, which would change the face of medical education for the next 100+ years.
Abe didn’t pull any punches. While a few schools were praised for their excellence and used as models for reform (McGill and the University of Toronto among them), many more were the subject of brutal criticism; he described the fourteen med schools then operating in Chicago as “indescribably foul” and “a disgrace to the State whose laws permit its existence.”
The changes that resulted from the Flexner Report included the closure of more than 120 schools, a dramatic increase in the prerequisites students needed to enter med school (yeah, you can thank him for those GPA minimums), and the standardization of curricula to ensure that all doctors learned the same material and, as a result, practiced the same way.
With that, the modern med school was born: competitive admissions, studying textbooks at home and learning collaboratively in the classroom, clinical rotations, etc. But today, as technology advances at an exponential rate, as demographics continue to shift, as new health care challenges emerge, the world of medical education is undergoing another evolution—and it’s evolving fast. What does that mean for pre-meds who are trying to figure out where to apply (or whether to apply at all) and med students who are just beginning their education?
Well, for starters, it has never been harder to get into med school. More people than ever are applying to study medicine (from 8,386 in 1985 to 13,929 in 2018), while the percentage of applicants who receive at least one offer keeps falling (from 23% to 19% over that same period).
The way students are recruited and admitted is changing, too. Anyone who has already made it through the application process knows how rigorous (and often disheartening) it can be. Med schools have traditionally focused on the more objective aspects of a candidate’s qualifications: grades, MCAT scores, awards, achievements, and, eventually, face-to-face interviews about all those scores and awards and achievements. Which is all fine and good, except that those particular methods of assessment have been shown to disadvantage racialized and lower-income applicants.
Recently, after analyzing their admissions data, the University of Manitoba discovered that high-income white students from urban centers were more likely to be invited for interviews and more likely to receive an offer of admission. Why? Well, when you have the financial flexibility to pay for professional MCAT and interview preparation (and aren’t spending your time working a full- or part-time job to pay your application fees) the advantages are pretty obvious.
But if med schools in Canada are truly searching for the best and brightest to join the next generation of physicians, they’re going to have to look at how other industries recruit the most promising candidates into their organizations. Job interviews at big tech companies like Google and Facebook are famous for ignoring an applicant’s curriculum vitae and focusing on their innate creativity, communication, and critical-thinking skills.
The Multiple Mini-Interview method is an example of how some of the assessment processes in med schools are moving in that direction. The MMI was developed right here in Canada, at McMaster University, specifically to evaluate med school candidates. The format requires interviewees to run a circuit of stations where they have brief one-on-one interactions with interviewers. This approach has been shown to better assess non-cognitive, interpersonal, and ethical aptitude, and, even more importantly, to better predict not just an applicant’s performance as a med student, but their future performance as a practicing physician. Since it was first introduced in 2004, the MMI format has been adopted by medical, dental, pharmacy, and even veterinary schools across the world.
And what about med school itself? How has the way medicine’s been taught changed over the years? Well, the structure of med school has evolved quite a bit in the last few decades. As technology makes data easier to access and analyze, less emphasis is being put on rote memorization of facts in favour of hands-on, real-world experience. There’s an increased emphasis on case-based learning, and clinical experience is introduced earlier and earlier. Students are being exposed to a far broader range of settings, situations, and learning approaches. In short: med students are expected to build competency rather than build up a repository of facts—it’s not what you know, but how you apply what you know.
Why is this important? Because the pace at which medicine is advancing is so fast that the method you’re using one year is very likely to be replaced the next—you can’t just learn a particular protocol and practice it for years and years.
Technological advancement is shaking things up, too. Just this past year, several med programs in the U.S. abandoned the traditional practice of cadaver dissection, replacing it with virtual-reality tools that allow students to probe a three-dimensional rendering of the human body—all without getting their hands dirty (or that, uh…unique smell).
This gives students a clearer, more comprehensive view of human anatomy; not only can they use these virtual-reality tools to see an organ from every possible angle, they can also see how it’s connected to the rest of the body’s vital systems, and even watch it while it’s at work (something you definitely can’t do with a dead body).
While this might seem like something straight out of a utopian sci-fi TV series, there are drawbacks. Virtual reality can show you a lot of things, but it can’t give you a true sense of depth and texture, and it definitely can’t account for the natural variations in human anatomy physicians will see throughout their careers. There’s also a certain level of detachment from the reality of interacting with actual, tangible human bodies—something every physician will eventually have to reckon with.
Back in the early 1900s, most med schools in North America were in a dangerous state of disorder (just imagine, for a moment, being treated by a doctor who had graduated from one of those Chicago institutions Abraham Flexner described as “indescribably foul”). Reform and standardization helped guide them—and the millions of doctors they produced—throughout the 20th century. Today, med schools face much different challenges—ones that may require equally radical changes. The good news is, students like you are going to be the ones who initiate those changes.
In our recent interview with Dr. Brian Goldman, he noted that the world of medicine is undergoing a significant shift, and he had an idea about why that was happening. “More women than men are now entering med school, and soon, hopefully, will be entering positions of power. There may be a tipping point coming soon, a major change in the culture of medicine. Younger physicians have a greater willingness to question the status quo—the question I would pose to this new generation is: will you continue to do that, or will you be silenced?”