Is “progress” in medical education backfiring?

by Karan Chhabra

A good bit of fuss surrounds the evolution of medical education. The way we’re taught sometimes seems as susceptible to trends as the clothes we wear: one decade trumpets the “integrated,” systems-based curriculum, the next moves to the “flipped classroom,” and the cycle continues. Of course, these changes are based on educational theory, and their outcomes are often studied rigorously. But every now and then, when I rotate with a doctor trained the old-fashioned way, I wonder what has really been gained, and whether something has been lost, since the days they were taught.

I’ve seen how older physicians recollect tidbits from their preclinical years (decades ago) that my classmates and I can’t seem to remember for more than a few months. I wonder how much of this is the result of the way they were taught. Of course, there are many confounders here: our own intelligence versus those physicians’, our generation’s perpetual state of distraction, and perhaps the volume of material we’re expected to retain. The “old way” of teaching by discipline (anatomy, pathology, pharmacology, etc.) seems far less intuitive than the way we’re currently taught, by organ system (cardiovascular, gastrointestinal, etc.). To mentally switch from psoriasis to anti-arrhythmics in the same day seems like work.

But that may in fact be the secret to its success. I’m referring to research on “massed practice” (learning a subject en bloc) as opposed to “interleaved” or “spaced practice” (switching between learning tasks and revisiting the same topics day after day). A recent article (thanks, Skeptical Scalpel) offers a relevant example:


Consider this study of thirty-eight surgical residents. They took a series of four short lessons in microsurgery: how to reattach tiny vessels… Half the docs completed all four lessons in a single day, which is the normal in-service schedule. The others completed the same four lessons but with a week’s interval between them.


…The difference in performance between the two groups was impressive. The residents who had taken all four sessions in a single day not only scored lower on all measures, but 16 percent of them damaged the rats’ vessels beyond repair and were unable to complete their surgeries.


Why is spaced practice more effective than massed practice? … Rapid fire practice leans on short-term memory. Durable learning, however, requires time for mental rehearsal and the other processes of consolidation. Hence, spaced practice works better. The increased effort required to retrieve the learning after a little forgetting has the effect of retriggering consolidation, further strengthening memory.


This effect isn’t limited to technical skills. The article also references cognitive tasks, like geometry problems. I wonder if it’d extend to preclinical medical education as well. I can personally relate to how a disease seen in one organ system, say in November, may literally never be seen again under an “integrated” curriculum. It’s far easier to learn the pathogenesis of strokes and their treatment in the same week, but easier is not always better. Perhaps a trickier, thornier learning process is also stickier in the long run.


Karan is a student at Rutgers Robert Wood Johnson Medical School and a Duke graduate who previously worked in strategic research for hospital executives.

Follow him on Twitter @KRChhabra or subscribe to the blog.


5 thoughts on “Is “progress” in medical education backfiring?”

  1. RJ says:

    Good post. Some thoughts – I think the interesting question here is: how to broadly (across the whole range of talent and ability) teach the fundamentals in such a way that they become intuitive? The progression in medical education styles seems like a slow, organic, chaotic march to a flexible, adaptive medical education that serves all students. The trickier, thornier process might be ‘stickier’ in the long run, but it may also box out students with less raw computational brainpower or students who learn in more wild, asystematic ways…and those are the students that more than anyone need the fundamentals and discipline on which the practice of medicine is built.

  2. Alexander Janke says:

    I’m a student at Wayne State University School of Medicine in Detroit. Our curriculum is not systems-based. We have subject classes: anatomy, physiology, pharmacology, et cetera. But there’s much chatter about switching to systems-based. When we do, we could evaluate the impact of that curriculum change with a simple before-and-after comparison. What happens to Step 1 scores? The ‘stickiness’ (to use your word) of our learning could be similarly evaluated: do strong Step 1 scores become less or more predictive of success in clinical rotations / on Step 2 after the curriculum change?

    Surely schools that have already made the transition from the “old-way” to systems-based learning have access to this data. The answer to your question is out there. (And if no one has done the work to find out, then we should.)

    • Karan Chhabra says:

      I’ll be honest: I did a cursory search of the literature and didn’t find any such comparisons as I wrote this, though it’s entirely possible I missed something. The problem with a pre-post design like the one you’re describing is that it introduces a whole host of confounders: “growing pains” as faculty and students transition, topics added and subtracted, and other simultaneous changes to teaching methods. I worry that such a study wouldn’t be apples-to-apples. But I’d love to see a “modern” discipline-based curriculum (with all the other trappings of today’s methods) compared head-to-head with a “modern” systems-based curriculum.


      • Alexander Janke says:

        Out in the real world, we care what the implications of change are for students. Some of the things you call confounders are part of the package deal for institutions in transition, and they need to be part of the decision calculus for a school like mine. (Sometimes the growing pains make change prohibitively expensive in the short run. See Dvorak.) In addition, a difference-in-differences design can partially account for some of the issues you cite.

        Is your point that no empirical evidence can provide hints about the effect of curriculum change? Should we just speculate by analogy to other studies, as you do here? Rely on expert intuition? How precisely do you suggest we compare a “modern” discipline-based curriculum to a “modern” systems-based curriculum, modulo the confounders you cite? The evaluation would surely be tricky; a pre-post comparison would be an easy, evidence-based start on a tough problem (namely, how to teach medical students well).
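        The difference-in-differences idea mentioned above can be sketched in a few lines. This is purely an illustration: the school names and Step 1 score figures below are invented, and a real analysis would need student-level data and statistical inference. The logic, though, is exactly as described: compare the before-to-after change at the school that switched curricula against the change at a school that didn’t, so that any trend shared by both schools cancels out.

```python
# Hypothetical difference-in-differences sketch. All numbers are invented
# for illustration; no real Step 1 score data is used.

def diff_in_diff(treated_before, treated_after, control_before, control_after):
    """Change at the curriculum-switching school minus change at the
    comparison school; the shared secular trend cancels out."""
    return (treated_after - treated_before) - (control_after - control_before)

# Mean Step 1 scores (hypothetical):
# School A switches to a systems-based curriculum; School B does not.
school_a_before, school_a_after = 228.0, 233.0
school_b_before, school_b_after = 229.0, 231.0

effect = diff_in_diff(school_a_before, school_a_after,
                      school_b_before, school_b_after)
print(effect)  # 3.0: School A's gain (5 points) net of the shared trend (2 points)
```

        In this toy example, both schools improved, but the switching school improved by 3 more points than the comparison school, and that difference is the estimate of the curriculum effect under the (strong) assumption that the two schools would otherwise have trended in parallel.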
