Product

Caption AI | Guidance Software

Role

Product Design Lead
Collaborated closely with PM, engineers, the clinical team, and RA/QA

Caption AI uses artificial intelligence to provide real-time guidance and diagnostic quality assessment of ultrasound images, empowering healthcare providers, even those without prior ultrasound experience, to capture diagnostic-quality images.


The software was authorized by the FDA in February 2020 via the De Novo pathway, a regulatory designation reserved for novel technologies. This milestone set a precedent for future AI-based medical device systems and created guardrails for the safe and effective use of artificial intelligence in the field. It is part of Caption Health's mission to bring healthcare institutions closer to standardizing quality of care by empowering more providers with capabilities that will benefit patients.

The journey to FDA authorization wasn't a linear path. I joined Caption Health (then called Bay Labs) in 2016 as their first designer, tasked with creating a unique user interface to guide clinicians to successfully perform a specialty diagnostic exam.


Here's the process I led.

DISCOVERY

I immersed myself in the world of echocardiography* and its real-world clinical applications to better understand the field. To wrap my mind around the company's mission to empower clinicians with this capability, I attended a week-long training course for point-of-care ultrasound with internal medicine residents at Abbott Northwestern Hospital in Minneapolis. This early hands-on context was essential for establishing the UX roadmap that would help Caption Health achieve its vision.

*A fancy term for ultrasound images of the heart—echo for short.

INSIGHTS
  • Scanning Protocols: An assumption while building v1 was that a user would scan one view at a time, deliberately stepping through each study. In practice, an ultrasound exam contains multiple views, so the ability to progress through the entire protocol without returning to the main menu was crucial.

  • Persistent Instructions: The "tutorial" diagram associated with each view was helpful and necessary, even after a user placed the probe correctly and started scanning. The residents requested that the diagrams persist onscreen as they optimized each view.

  • Automate Actions: Onscreen actions that required the user to take their hand off the probe while scanning proved problematic because of the tendency to slip, lose pressure, or disrupt the view. We needed to consider more ways the system could automate image recording and progression.

  • Skepticism in AI: The algorithm qualitatively underperformed for one view, the inferior vena cava (IVC), which it would mistake for the aorta in this early version. This instilled a sense of mistrust in the system as a whole, even though all the other views performed well: an important lesson for user adoption.

DEFINING THE USER & BUSINESS OPPORTUNITIES

In addition to learning about the mindsets and motivations of medical students, the company wanted to expand its use case beyond "training wheels" to support clinical decision-making in critical settings. Our target users broadened to include:

  • Oncologists

  • ED Physicians

  • Intensivists

  • Hospitalists

  • Anesthesiologists

  • Nurse Practitioners

UX RESEARCH ARTIFACTS

For more user research artifacts, such as the user personas and per-segment workflows developed from contextual inquiries and observational studies, please follow this link.

ONCOLOGY WORKFLOW AS-IS VS. TO-BE

ITERATE & REPEAT

As the product continued evolving and the algorithms kept improving, UX kept up by testing constantly. I often tagged along with the clinical team to medical conferences and product demos to get valuable feedback on user flows and designs.

KEY USER TESTING MILESTONES
  • Q1 2017 | In-House Study: 2 RNs, 2 NPs, 4 sonographers

  • Q2 2017 | In-House Study: 2 RNs, 1 NP, 1 sonographer

  • Q3 2017 | Minneapolis Heart Institute: 3 sonographers

  • June 2018 | ASE Conference: 8 cardiologists, 2 sonographers

  • June 2018 | Stanford Hospital: 2 oncologists, 2 oncology nurses

  • Q3 2018 | In-House Study: 5 RNs

  • Q4 2018 | In-House Study: 5 RNs

  • Jan 2019 | Rwanda: 16 local medical providers

  • March 2019 | Northwestern Hospital: 4 hospitalists, 4 physician assistants, 6 medical residents, 5 certified medical assistants

  • April 2019 | Multi-site Clinical Trial: 9 nurses (scanning performance and clinical endpoint results can be found here)

  • Jan 2020 | Pricing Study in SD, Chicago, NYC: 31 stakeholders


NEW FEATURES FOR IMPROVING USABILITY

  • Home meter & secondary meter: The quality meter that shows a user whether they're close to or far from a diagnostic image is the single most useful feature, based on user feedback. To leverage that, we developed a dual guidance meter UI that shows multiple quality meters at once, which helps when obtaining views that are anatomically very close to each other.

  • Real-time prescriptive guidance: Another key feature is the turn-by-turn directions the system gives users to obtain the optimal view. The next project in this portfolio dives into the development of this guidance system.

  • "Save Best Clip": Through observation, I learned that users could sometimes land in the right position but then lose it, which caused frustration. The system now records all the frames within the study and saves the best loop in the background (a minimal sketch of this logic follows this list). Patent pending.

  • Capture views in any order: The flexibility to toggle through the entire list of views in the protocol and scan out of order, or skip views, was a significant improvement. This is important in time-sensitive clinical settings where patient variability may prevent a thorough exam.
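
To make the "Save Best Clip" behavior concrete, here is a minimal sketch of the underlying idea in Python. It assumes a hypothetical per-frame quality model (quality_score, the same kind of score that could drive the onscreen quality meter) and a fixed clip length; the names are illustrative and not Caption AI's actual implementation.

    from collections import deque

    CLIP_LEN = 60  # hypothetical loop length, e.g. ~2 s of frames at 30 fps

    def save_best_clip(frames, quality_score, clip_len=CLIP_LEN):
        """Scan a stream of frames and keep the contiguous clip
        whose mean quality score is highest."""
        window = deque(maxlen=clip_len)   # rolling buffer of recent frames
        scores = deque(maxlen=clip_len)   # matching per-frame quality scores
        best_clip, best_mean = None, float("-inf")
        for frame in frames:
            window.append(frame)
            scores.append(quality_score(frame))  # e.g. an AI image-quality estimate
            if len(window) == clip_len:
                mean = sum(scores) / clip_len
                if mean > best_mean:             # a better loop: save it in the background
                    best_mean = mean
                    best_clip = list(window)
        return best_clip

Because the buffer updates on every frame, a user who lands on the right view and then loses it never has to re-find it: the best loop they passed through is already saved.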

Categorization and prioritization of feedback heard from user testing.

*I promise, this is the one and only photo of post-its that you will see on this site.

FINAL VISUALS & DESIGN SYSTEM

WHAT'S NEXT
  • Hardware Vendor & Form Factor Exploration: Caption AI is currently integrated with one ultrasound hardware vendor, with plans to partner with multiple hardware vendors across both traditional cart-based and handheld devices.

  • Additional Automated Interpretations: A clinically validated Ejection Fraction estimate is the company's first cleared interpretation algorithm and part of Caption AI's product offering. Development of more automated interpretations, such as valvular and volume assessments, is underway.

  • Scalable Training Program: To roll out the product to multiple healthcare institutions across the U.S., I am part of the team developing a training program to validate user readiness.

  • Expansion to Other Organs: Cardiac imaging is one of the trickiest specialties in sonography because the heart is a moving structure. Now that we have attained clearance for echocardiography, the company plans to expand the guidance system to other organs and areas such as lung, abdominal, and obstetric imaging.
