There’s an AI tool out there for radiology practices that helps “curate” studies: it reviews a virtual “stack” of head CTs and prioritizes which ones should be reviewed the soonest. Presumably, the AI identifies findings that it thinks merit prompt review. For example, a head CT that might show signs of intracranial hemorrhage (obviously a life-threatening situation) should get reviewed immediately so that the doctors taking care of the patient can start treatment or surgery as quickly as possible.
The virtual stack: today’s radiologists call it a queue, the line-up of studies waiting to be interpreted by a diagnostic radiologist. I call it a stack because when I trained, imaging studies were still printed on x-ray film and kept in individual patient folders, some thicker than others. These folders were then stacked on top of one another, and as you read studies, your stack got shorter and shorter (hopefully).
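For readers curious about the mechanics, the triage behavior described above is essentially a priority queue rather than a plain first-in, first-out queue. Here is a minimal sketch, with hypothetical study names and made-up "suspicion" scores standing in for whatever probability of a critical finding the AI model actually produces:

```python
import heapq

# Hypothetical worklist: (study description, AI suspicion score).
# Scores are invented for illustration, not from any real tool.
studies = [
    ("CT head, patient A", 0.05),
    ("CT head, patient B", 0.92),  # possible intracranial hemorrhage
    ("CT head, patient C", 0.40),
]

# heapq is a min-heap, so we push negated scores to pop the
# highest-suspicion study first.
worklist = []
for name, suspicion in studies:
    heapq.heappush(worklist, (-suspicion, name))

# Studies now come off the list in order of suspicion, not arrival.
while worklist:
    neg_score, name = heapq.heappop(worklist)
    print(f"{name}: suspicion {-neg_score:.2f}")
```

The key difference from the old physical stack is exactly the one discussed below: the reading order is no longer the arrival order, so the reader never works through an unsorted mix of normal and abnormal studies.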
As a side note, when I interpreted a CT of the abdomen and pelvis back in the day, there were a total of 48 images, printed on four full sheets of x-ray film. The exact same imaging study these days has over 1000 images thanks to multiplanar reconstructions, thinner slices, lung and bone windows, etc. Of course, current studies are all interpreted on a computer screen (no more printed studies), and you can scroll through hundreds of contiguous images pretty quickly. The only "stacks" that exist now are a few small stacks of outside imaging CDs (and even those are disappearing thanks to online imaging exchanges).
Where was I? Rambling on about stacks…ah yes: AI radiology tools and resident education. Here’s the challenge: again, back in the day, I’d have a stack of head CTs to interpret. I had to work through them, from top to bottom. I didn’t have an AI tool to prioritize which studies in the stack needed to float to the top for me to look at sooner. If a resident today uses this AI tool, he or she is alerted to which studies to look at immediately. This is, in theory, a very positive thing for the patient: a faster diagnosis should (and that’s a big should, since today’s healthcare system isn’t always as efficient as it should be, but that’s a blog post for another time) lead to faster treatment and better outcomes (again, a pretty big and unproven assumption).
The downside of AI-assisted outcomes improvement: the resident hasn’t learned what a normal head CT looks like. I think a trainee needs to work through the stack methodically, putting together the imaging and the clinical history, in order to really develop the interpretive skills necessary to be a good diagnostic radiologist. Having an AI tool effectively tell you which studies are abnormal (assuming the AI is accurate, which is a huge if, and worthy of its own blog post in the future) deprives the resident of that experience of not knowing, of being uncertain, of having to make difficult diagnostic decisions.
Taking it a step further, what if the resident graduates and goes on to practice in a setting where the AI tool isn’t available, or isn’t functioning at any given time? Now they have to rely on their own wherewithal to figure out which studies are abnormal. It is difficult, if not impossible, to walk unaided after relying on crutches for a long time.
I'll take it even a step further: experienced radiologists will lose "muscle memory" from using an AI tool. They are unlearning the hard-earned diagnostic skills that took years to accumulate.
The ethics of this are murky. On one side, you’ve (at least theoretically) got improved outcomes. On the other side, you have an undefined but undeniable impact on residency training as well as experienced radiologists. I really think that if the diagnostic radiology specialty were smart (and actually had capable leadership to act decisively), they would stop and very carefully consider the implications of these tools on residency training. Otherwise they may in fact be making a very strong case for their own replacement by AI tools by hobbling their own trainees.
NB: in 2016, Geoffrey Hinton predicted that AI would entirely replace diagnostic radiologists within five years.