Joint iQ

An educational platform that helps non-specialist clinical providers diagnose hip and knee arthritis.
Roles
Lead Designer
Researcher
Skills
UX Design
Instructional Design
UX Research
Brand Design
Collaborators
Product Manager: DJ Vaglia
Project Manager: James Bucki
Software Engineer: Zach Kaigler
Medical Director: Dr. Tony DiGioia, MD

Background on our Mission

Joint iQ’s vision is to provide an educational resource on accurately diagnosing and treating knee and hip arthritis for non-orthopaedic clinical providers and those training for orthopaedic practice.
Potential systemic impacts on clinical workflows:
  • Avoid or defer 15-30% of hip and knee replacement surgeries
  • Decrease the overall cost of uncoordinated patient care pathways by 10x.
  • Reduce specialist appointment wait times by half in high-volume clinics.

Problem

The beta version of the Education flow was built and designed by engineers and our medical director, and I was brought in to research and improve the existing user experience of the overall flow. The initial flow and features included:
  • General knee and hip arthritis education modules
  • Introduction to how our ML algorithm can assist with reading X-rays
  • Patient evaluation simulation with X-ray uploads and survey entries
  • Machine Learning assisted evaluations

Research

10 participants with healthcare backgrounds took part in an extensive observational use-case study. Participants included a mix of nurse practitioners, data analysts, physician assistants, and surgery schedulers.
Methods of Research:
  • Observation of product in use
  • Think-aloud protocol
  • Whiteboard flow mapping
After having users complete the education and training modules once from start to finish, we had them map out their ideal structure for the education content. We sketched on a whiteboard as they described it to help visualize their flow. Below are some notable comments from our users:
“Inputting all this patient demographic data is so time consuming and doesn’t contribute to my learning.”
“I dislike the process of having to download sample X-rays onto my computer and upload them again.”
“What is the difference between training 1 and 2?”

Target Personas

Our research helped us identify the key traits, goals, and educational needs of the two target audience groups that would benefit most from Joint iQ:
New Orthopedic Providers
Key Traits:
  • Overwhelmed during their first week on the job
  • Often shadowing other in-clinic providers
  • Defers many questions to surgeons or PAs
Goals with Education:
  • Foundational skills training for new ortho-providers
  • Simulate in-clinic patient triaging as closely as possible
  • Brush up on X-ray reading skills
Learning Preferences:
  • Soft learning curve
  • Repeatability in training exercises to build confidence
  • Easily accessible
Non-Orthopedic Providers
Key Traits:
  • Very busy, seeing patients back to back throughout their shift
  • Limited patience
  • Needs an efficient workflow
Goals with Education:
  • Gain a high-level understanding of bone and joint health
  • Simple knee and hip arthritis triaging
  • Quick care pathway determination
Learning Preferences:
  • Low time commitment
  • Easily accessible
  • Self-paced learning

Key Takeaways

  • Users completed all 4 modules from beginning to end within the allotted 1-hour test session.
  • Users found unnecessary repetition and data entry between the 2 training modules.
  • Users felt the existing brand language, layout, and information hierarchy made for a visually disengaging experience.
  • Users struggled to keep track of their progress and performance as they completed the Testing module.
How might we minimize drop-off points?

1

Create a cohesive brand experience.

2

Eliminate redundant touch points in the Training modules.

3

Implement a feedback system for better performance tracking.

Improved Features

1

Create a cohesive brand experience.
X-ray Evaluation Page Old Design:
Joint iQ (Step by Step) Home Page Old Design:
Establishing a New Brand Language
New Design:
  • Instructions are shorter and easier to read.
  • Patient X-ray and Comparison X-ray labels are centered above the X-rays to make them more apparent.
  • Removed the AI prediction tag; when users first arrive at this page, nothing is selected.
  • Added a step-by-step walkthrough guide under the "Help" icon button.

2

Eliminate redundant touch points in the Training modules.
70% of users found the process of entering patient data and downloading and re-uploading X-rays to be useless to their learning and a huge time sink.
Joint iQ Education V1 Module Flow Map (Old)
Joint iQ Education V2 Module Flow Map (New)
  • Training 1 and 2 were combined into a single training module with a care pathway selection exercise.
  • The two modules shared the same flow in their first half; some repetition is retained to reinforce learned concepts.
Instead of having users copy and paste patient info from an external source, we designed patient personas that represent common chief complaints seen at an orthopedic clinic.
In the old design, users were required to download a PDF containing the patient persona’s survey responses and copy the responses from the PDF. Testers found this step unnecessary, as it didn’t benefit their learning objective of X-ray evaluation in any way.
Instead of having users download a ZIP file of X-rays and re-upload them into the system, the new design features an X-ray selection screen that only requires users to select the appropriate X-ray views for the patient. Many clinical testers struggled with downloading, locating, and uploading files, as this is not part of their daily workflow.

3

Implement a feedback system for better performance tracking.
90% of users struggled to keep track of their progress and performance as they completed the Testing module.
Old “Report Card” Screen Design
  • The contrast between right and wrong answers is not clear enough.
  • Users found this information useful for future review, but there was no visual indicator tracking their progress and errors.
New "Report Card" Designs
We focused on implementing a scoring system after important exercises such as X-ray Evaluation and Care Pathway Selection:
We added a percentage score breakdown of performance on each "patient" to the main navigation page of the Test Your Knowledge section. The score lets users quickly determine which joints they may need more practice on, and the score breakdown gives a detailed view of which exercises were evaluated incorrectly (see "Overall Evaluation Report Card" above).
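The per-patient score and its breakdown can be sketched as simple arithmetic over exercise results. This is a hypothetical illustration only; the names, data shapes, and exercise list are assumptions, not Joint iQ's actual implementation.

```python
# Hypothetical sketch of the report-card scoring logic; all names and
# data shapes are assumptions, not Joint iQ's actual implementation.
from dataclasses import dataclass


@dataclass
class ExerciseResult:
    name: str        # e.g. "X-ray Evaluation", "Care Pathway Selection"
    correct: bool


def patient_score(results: list[ExerciseResult]) -> float:
    """Percentage of exercises answered correctly for one patient persona."""
    if not results:
        return 0.0
    return 100.0 * sum(r.correct for r in results) / len(results)


def score_breakdown(results: list[ExerciseResult]) -> list[str]:
    """Names of exercises evaluated incorrectly, for the detailed report card."""
    return [r.name for r in results if not r.correct]


results = [
    ExerciseResult("X-ray Evaluation", True),
    ExerciseResult("Care Pathway Selection", False),
]
print(patient_score(results))    # 50.0
print(score_breakdown(results))  # ['Care Pathway Selection']
```

Surfacing only the percentage on the navigation page, with the incorrect-exercise list one tap deeper, matches the two levels of detail described above.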
Old Care Pathway Selection Exercise Feedback Design
We implemented better feedback that explains each wrong answer and encourages users to understand the process of reaching the correct one.