
A clinician-centered approach to solving healthcare's most persistent workflow challenge
Ken Dec, Chief Marketing Officer, mTuitive
Every surgeon knows the paradox intimately: the more meticulously we document our work, the less time we have to actually do it. Yet incomplete documentation doesn't just create billing headaches; it undermines clinical decision-making, obscures quality metrics, and leaves cancer registries struggling to track the outcomes that matter.
At mTuitive, we've spent years watching this tension play out in operating rooms and oncology clinics. And we've learned something crucial: the problem isn't that clinicians resist structured documentation. It's that most documentation tools were never designed with the clinician's cognitive workflow in mind.
What Oncology Quality Initiatives Taught Us About Documentation
Recent quality improvement initiatives in oncology have revealed something remarkable. When structured fields for malignant diagnosis and staging are integrated directly into clinical closure workflows, rather than bolted on as afterthoughts, capture rates can jump from roughly 65-70% to over 90%. More importantly, these improvements sustain over time because they align with how clinicians actually think and work.
The key wasn't adding more clicks or creating longer forms. It was meeting surgeons at the moment of clinical synthesis: that critical point when they've completed the procedure, formed their assessment, and are ready to communicate findings. By embedding structured capture into this natural cognitive transition, documentation stops feeling like an administrative burden and starts feeling like clinical clarity.
The Design Philosophy: Start With the Surgeon's Eye
Here's where most AI documentation tools get it wrong: they start with the registry requirements, billing codes, or quality metrics, then try to extract that data from clinical notes. This approach treats the surgeon as a data entry clerk who happens to hold a scalpel.
It's time to flip that model entirely.
Start with how the surgeon actually sees the case, then build structured capture around that clinical reality, layering in AI assistance and registry requirements in ways that feel natural to the surgical workflow.
Consider a gynecologic oncologist completing a staging procedure. Their mental model isn't organized around AJCC codes or SEER registry fields. They're thinking about tumor location, depth of invasion, nodal involvement, and extent of resection. Interfaces need to mirror that clinical reasoning pattern, presenting structured fields in the sequence a surgeon would naturally articulate findings to a colleague.
The AI then works in the background, suggesting appropriate staging based on documented findings, flagging potential inconsistencies, and pre-populating downstream fields. The surgeon experiences this not as additional burden, but as intelligent assistance that makes documentation faster and more accurate.
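To make the idea concrete, here is a minimal sketch of what "fields in clinical-reasoning order, registry mapping in the background" could look like. The field names, options, and mappings below are illustrative placeholders, not our product's schema and not real AJCC or FIGO logic.

```python
# Minimal sketch (illustrative only, not mTuitive's implementation):
# structured fields declared in the order a surgeon would naturally
# report findings to a colleague, with registry mapping done quietly
# in the background. All names and rules here are placeholders.
from dataclasses import dataclass

@dataclass
class CaptureField:
    key: str            # machine-readable identifier for registry export
    prompt: str         # what the surgeon actually sees
    options: list[str]  # constrained choices keep downstream data clean

# Presented in clinical-reasoning order, not registry-schema order.
STAGING_CAPTURE_SEQUENCE = [
    CaptureField("tumor_location", "Primary tumor location",
                 ["ovary", "fallopian tube", "peritoneum"]),
    CaptureField("depth_of_invasion", "Depth of invasion",
                 ["superficial", "deep"]),
    CaptureField("nodal_involvement", "Nodal involvement",
                 ["none", "pelvic", "para-aortic"]),
    CaptureField("extent_of_resection", "Extent of resection",
                 ["complete", "optimal", "suboptimal"]),
]

def suggest_downstream_fields(findings: dict) -> dict:
    """Background assistance: pre-populate registry fields from what the
    surgeon already documented. Mappings are illustrative, not AJCC/FIGO."""
    suggestions = {}
    if findings.get("nodal_involvement") not in (None, "none"):
        suggestions["registry_node_status"] = "positive"
    if findings.get("extent_of_resection") == "complete":
        suggestions["residual_disease"] = "R0 (no gross residual)"
    return suggestions
```

The point of the sketch is the ordering and the direction of the mapping: the interface follows the surgeon's reasoning, and the registry fields are derived from it, not the other way around.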
Real-Time Correction: The Difference Between Assistance and Annoyance
Here's a critical distinction that separates helpful AI from frustrating AI: timing and context.
AI that interrupts the surgeon mid-procedure with suggestions is annoying. AI that waits until the moment of documentation, then quietly highlights potential inconsistencies or missing elements, is genuinely helpful.
For example, if a surgeon documents tumor size and nodal status that would typically indicate Stage IIB disease but selects Stage IIA from a dropdown, our system flags this discrepancy in real time. Not with alarm bells or modal dialogs, but with a subtle indication and a one-click option to review the staging logic. The surgeon maintains complete control while benefiting from an intelligent second check.
This approach respects clinical autonomy while reducing the cognitive load of mentally cross-checking staging criteria against documented findings, which is particularly valuable in complex cases or across tumor sites with different staging systems.
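For readers who like to see the mechanics, here is a simplified sketch of that kind of cross-check. The thresholds and stage labels are invented for illustration; a real implementation would encode the published staging criteria for the specific tumor site and edition.

```python
# Minimal sketch of a quiet, real-time staging cross-check.
# The cut-offs below are placeholders, NOT real AJCC/FIGO criteria.
from typing import Optional

def implied_stage(tumor_size_cm: float, nodes_positive: bool) -> str:
    # Illustrative thresholds only; a production system would encode
    # the published staging rules for the relevant tumor site.
    if nodes_positive:
        return "IIB"
    return "IIA" if tumor_size_cm <= 4.0 else "IIB"

def staging_flag(selected_stage: str, tumor_size_cm: float,
                 nodes_positive: bool) -> Optional[str]:
    """Return a dismissible hint if the selected stage disagrees with the
    documented findings; return None when everything is consistent."""
    expected = implied_stage(tumor_size_cm, nodes_positive)
    if selected_stage != expected:
        return (f"Documented findings are more consistent with Stage "
                f"{expected}. Review staging logic?")
    return None  # no interruption when selection and findings agree

# Example: size and nodal status imply IIB, but the dropdown says IIA.
print(staging_flag("IIA", tumor_size_cm=5.2, nodes_positive=True))
```

Note what the function does not do: it never blocks the surgeon or rewrites the selection. It surfaces a hint that can be accepted or dismissed in one click.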
Intelligent Suggestions Without Adding Clicks
The "without adding clicks" part of this equation is non-negotiable. Any AI assistance that requires additional navigation, form submission, or interface interaction will be abandoned, no matter how intelligent the underlying algorithm.
The key design principle is simple: intelligence should reduce friction, not add steps.
When a thoracic surgeon documents a lung resection with specific anatomic findings, the system can intelligently suggest relevant structured fields without requiring explicit action. If the documentation mentions "complete visceral pleurectomy," the system knows to present fields for pleural staging and completeness of resection. If it doesn't mention lymph node dissection, the system presents a single-click option to document that nodes weren't sampled, rather than leaving ambiguity.
These micro-interactions, powered by AI but designed around clinical workflow, compound into significant time savings while dramatically improving data completeness.
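As a rough illustration, a stripped-down version of this kind of rule might look like the following. The trigger phrases and field names are hypothetical, and a production system would rely on proper language understanding rather than simple keyword matching.

```python
# Minimal sketch of phrase-triggered field suggestions, assuming plain
# substring matching for clarity. Triggers and field names are illustrative.
SUGGESTION_RULES = [
    # (phrase found in the note, structured fields to surface)
    ("visceral pleurectomy", ["pleural_staging", "completeness_of_resection"]),
    ("wedge resection",      ["margin_status"]),
]

def suggest_fields(operative_note: str) -> dict:
    note = operative_note.lower()
    suggested = []
    for phrase, fields in SUGGESTION_RULES:
        if phrase in note:
            suggested.extend(fields)
    # If nodes were never mentioned, offer a one-click "not sampled" entry
    # instead of leaving the registry field ambiguous.
    one_click = []
    if "lymph node" not in note and "nodal" not in note:
        one_click.append(("lymph_nodes_sampled", "No"))
    return {"show_fields": suggested, "one_click_defaults": one_click}

# Example: pleurectomy mentioned, no node dissection documented.
print(suggest_fields("Left lower lobectomy with complete visceral pleurectomy."))
```

Each suggestion surfaces fields or defaults the surgeon can accept with a single click; none of them adds navigation or a new form.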
From Billing Form to Clinical Tool: A Fundamental Shift
The ultimate measure of success for structured documentation isn't capture rates or registry completeness, though those improve dramatically. It's whether clinicians experience the tool as supporting their clinical thinking rather than extracting data from it.
When we get this right, surgeons report that the documentation interface helps them organize their thinking, ensures they haven't missed important findings, and produces reports they're proud to sign. The structured data that registries and quality programs need becomes a byproduct of good clinical documentation, not a separate administrative task.
This represents a fundamental shift in how we think about clinical documentation tools. Instead of treating structured capture as a necessary evil to be minimized, we should reimagine it as an opportunity to provide intelligent clinical support at the moment of synthesis.
The Path Forward: Clinician-Centered AI Design
As AI capabilities in healthcare continue to advance, the temptation will be to add more features, more automation, more intelligence. We believe the key to success lies in the opposite direction: ruthless focus on reducing cognitive burden while increasing clinical value.
That means:
Starting every design decision from the clinician's perspective, not the registry's requirements
Making AI assistance invisible until it's needed, then making it frictionless
Measuring success by documentation time reduction, not just data capture rates
Designing for the complexity of real surgical cases, not idealized scenarios
The surgeon's paradox, in which better documentation demands more time, isn't inevitable. With thoughtful, clinician-centered design and truly intelligent AI assistance, we can finally deliver on the promise of structured reporting: higher-quality data captured in less time, with tools that feel like clinical enhancement rather than administrative burden.
The question isn't whether AI can improve documentation. It's whether we're willing to design AI tools that genuinely serve clinicians, rather than just extract data from them.
At mTuitive, we're building the next generation of surgical documentation and decision support tools with one core principle: technology should amplify clinical excellence, not add friction to it. Learn more about our clinician-centered approach to structured reporting at [mtuitive.com].

