If you haven’t yet tried using an AI assistant in your clinical practice, now is the time to start.
We are standing at the threshold of a shift in how we work. The rise of large language models (LLMs), text-based AI systems like ChatGPT that can interpret, generate, and summarize content, offers clinicians a remarkable opportunity: to work faster, think broader, and document smarter. I want to be clear that these tools are still evolving, but their usefulness in the day-to-day reality of musculoskeletal ultrasound is already tangible and, in some workflows, substantial.
In my own sports medicine practice, AI has become a quiet but powerful assistant. It’s not replacing clinical expertise; it’s extending it. Over time, I’ve found its sweet spot: not making decisions for me, but helping me think more clearly. One of the most practical ways I use LLMs is for differential generation. I paste in my ultrasound findings and impression and ask for a list of possible differential diagnoses. The results are consistently thought-provoking. Typically, the list includes five or six diagnoses I already had in mind, a couple I disagree with outright, and two or three that surprise me and deserve a closer look. Especially in complex or uncertain cases, that prompt to pause and consider something new can be invaluable.
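To make that concrete, here is the kind of prompt I use, with wording that is illustrative rather than a fixed formula:

“Below are the de-identified findings and impression from a musculoskeletal ultrasound exam. Generate a ranked differential diagnosis list, briefly noting which findings support or argue against each possibility.”

Pasting the report text beneath a prompt like this is usually all it takes.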
Some mainstream AI platforms even promise image interpretation. My experience? These are not yet ready for prime time: results are inconsistent and accuracy is highly variable. But for text-based assistance, where language rather than pixels is the primary input, LLMs can make a real difference.
One area where AI shines is in reducing the friction of tedious or repetitive tasks. Prior authorizations, for example, used to eat up valuable time and mental bandwidth. Now, I can copy a de-identified clinical summary and the insurance denial into an LLM and request a short appeal letter. It generates a polished draft that often needs only light editing. Occasionally, I’ll even ask the AI why it thinks the request was denied—it often gives helpful insight I can use in peer-to-peer calls.
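A typical request, again with illustrative wording, looks something like this:

“Below is a de-identified clinical summary followed by the insurer’s denial letter. Draft a concise, professional appeal letter that addresses the stated reason for denial and cites the clinical findings supporting medical necessity.”

One rule applies here and everywhere: strip all patient identifiers before pasting anything into a general-purpose LLM.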
The same applies to documentation templates. I’ve built standard templates for common joints, but what about when a patient presents with something less routine, a region I haven’t scanned often enough to have built a template for, such as the sternoclavicular joint? I give the model an existing template and ask it to adapt it to the new joint. The results? Fast, accurate, and easy to refine (a sample prompt follows the list below). Here’s a quick look at how I use AI in daily practice:
- Differential support: Expands my diagnostic horizons, especially in unusual or complex cases.
- Template generation: Adapts existing templates to less common regions or patient types with minimal effort.
- Prior auths & letters: Speeds up appeal writing; reduces emotional exhaustion from repetitive documentation.
- Note polishing: Transforms shorthand findings into clean, communicative notes for specialists or patients.
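And here is the template-adaptation prompt promised above, using the shoulder as an illustrative starting point:

“Below is my shoulder reporting template. Adapt it for a sternoclavicular joint exam, keeping the same structure and section headings, and flag any joint-specific structures I should remember to assess.”

A quick review and a few edits are usually all the output needs.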
But let’s be clear: none of this replaces the responsibility we carry as clinicians. AI is a powerful tool, but it must be used wisely. A recent study from MIT (“Your Brain on ChatGPT”) found that users writing essays with AI support showed lower brainwave activity on EEG, suggesting a reduction in active cognitive processing. The lesson here is sharp: when we outsource too much thinking, our ability to reason, synthesize, and create diminishes.
We cannot allow that to happen in medicine. What we document, what we diagnose—these remain our responsibility. AI can offer suggestions, but only we can make decisions. Every recommendation must be filtered through our personal, sound clinical judgment.
So yes—use AI to sharpen your workflow, expand your thinking, and save time. But use it with intention. Let it challenge your thinking, not do your thinking. Let it shape your creativity, not replace it. When used well, AI doesn’t flatten our clinical voice; it amplifies it. It helps us become more precise, more efficient, and, most importantly, more present with the people we serve.
Reference: Kosmyna N, Hauptmann E, Yuan YT, et al. Your brain on ChatGPT: accumulation of cognitive debt when using an AI assistant for essay writing task. arXiv. Preprint posted June 10, 2025. Accessed July 8, 2025. https://arxiv.org/abs/2506.08872
James Wilcox, MD, RMSK, is a family medicine and sports medicine physician in the United Arab Emirates, where he is the Director of the ProMotion Sports Medicine Clinic at Specialized Rehabilitation Hospital in Abu Dhabi, and Assistant Professor of Family Medicine at UAE University.
This posting has been edited for length and clarity. The opinions expressed in this posting are the author’s own and do not necessarily reflect the view of their employer or the American Institute of Ultrasound in Medicine.