Get Access


We are currently in private access.
Meet with us virtually to:

Request access
See a live demo
Ask us anything

ULTRA-metrics

Ultrasound competency metrics

ULTRA-metrics is a framework that measures experience, skills, and autonomy for any ultrasound user and application.

Delphi-derived with a diverse panel of ultrasound experts worldwide

Alberto Goffi
Emma Chung
Suean Pascoe
Jessica Solis-McCarthy
Mark Foster
Chris Gelabert
Emanuele Pivetta
Colin Bell
Erica Clarke Whalen
Janeve Desy
Mike Smith
Simon Hayward
Hannah Latta
Hayley Israel
Marcus Peck
Adrian Wong
Tanping Wong
Chris Yap
[Anonymous]
Framework Developer: contributed original metrics or concepts to the framework

Integrates commonly used scoring metrics

ACEP 5 Point Scan Quality Scale
Modified Ottawa Entrustability Score

* Peer-review in progress

Framework

Design principles, key components and how they work together.

Metrics

Core metrics and criteria for data collection.

Toolkit

Tools and guides for implementing ULTRA-metrics.

Data Registry

Competency curves, innovation benchmarks, and user feedback.

Why ULTRA-metrics was created

To grow ultrasound capacity by addressing four areas of need.

1) Guideline variability

Competency guidelines vary widely and overly rely on exam counts rather than direct measures of skill or autonomy.

2) Expert capacity

Expert shortages limit ultrasound training capacity, yet learners are typically held to fixed supervision and exam requirements regardless of how quickly they progress.

3) Innovation validation

Promising tools like virtual mentorship and AI could expand training capacity, but slow validation hinders their adoption.

4) Multi-center research

Inconsistent competency metrics across programs make it difficult to benchmark performance or conduct meaningful multicenter research.

Framework Overview

Designed for any ultrasound program - focused on four design principles.

1) Holistic Measurement

Measure competency holistically (beyond exam counts) with additional metrics to quantify experience, autonomy, and skills.

2) Broad Adoptability

Ensure wide applicability across ultrasound applications, regions, and training programs by using simple, robust, and generalizable measurements.

3) Cross-Program Interoperability

Facilitate comparison of datasets through development of framework common data elements (CDEs).

4) Versionable Modules

Ensure the framework is modular so it can evolve over time as new ultrasound competency metrics and training methods become available.

Simple and Flexible

Only 3 metrics and 6 metadata items are required to start - optional add-ons are available as needed.

Metric Overview

Explore core metrics below - the toolkit allows you to download the full list of metrics including metadata and skill trees.

Competency Domain

Experience

Quantity of ultrasound exams performed and diversity of clinical scenarios.

Metric

Exam Counts

Number of exams performed of each type (e.g. Cardiac, Lung)

Metric (optional)

Findings Counts

Number of findings interpreted by exam type (e.g. Lung B-lines)

Competency Domain (optional)

Autonomy

Degree of independence performing ultrasound.

Metric (optional)

Entrustability

Trust in a learner's readiness to perform certain clinical tasks independently based on demonstrated competence.

Answer Set

Modified Ottawa Entrustability Score

Score | Classifier | Criteria
1 | Dependent | Supervisor did it: trainee required complete guidance or was unprepared; supervisor had to do most of the work
2 | Guided | Supervisor talked through it: trainee was able to perform some tasks but required repeated directions
3 | Prompted | Supervisor needed to prompt: trainee demonstrated some independence and only required intermittent prompting
4 | Monitored | Supervisor needed to be there just in case: trainee functioned fairly independently and only needed assistance with nuances or complex situations
5 | Independent | Supervisor did not need to be there
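To illustrate, an answer set like the one above maps naturally to a simple lookup structure in software. This is a hypothetical sketch for illustration only, not part of any official ULTRA-metrics tooling:

```python
# Hypothetical encoding of the Modified Ottawa Entrustability Score
# as a score -> (classifier, criteria) lookup. Illustrative only.
OTTAWA_ENTRUSTABILITY = {
    1: ("Dependent", "Supervisor did it: trainee required complete guidance"),
    2: ("Guided", "Supervisor talked through it: repeated directions needed"),
    3: ("Prompted", "Supervisor needed to prompt: intermittent prompting only"),
    4: ("Monitored", "Supervisor needed to be there just in case"),
    5: ("Independent", "Supervisor did not need to be there"),
}

def classify(score: int) -> str:
    """Return the classifier label for a recorded entrustability score."""
    if score not in OTTAWA_ENTRUSTABILITY:
        raise ValueError(f"Score must be 1-5, got {score}")
    return OTTAWA_ENTRUSTABILITY[score][0]
```

Storing the classifier and criteria alongside each score keeps assessment records self-describing when datasets are shared across programs.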

Competency Domain

Skills

Level of skill in performing ultrasound.

Measured across specific skill domains.

Skill Domain (optional)

Indication

Assessing if the ultrasound exam is appropriate based on the patient’s history, symptoms, and other findings, with a clear clinical question to guide its use.

Metric (optional)

Reasoning

Can the practitioner articulate a focused clinical question that ultrasound can address for this patient?

Answer Set

3 Point Scale, Binary Scale

Score | Classifier | Description
1 | Poor | No relevant clinical question formulated
3 | Adequate | Relevant clinical question formulated, but it lacks specificity to the patient's condition
5 | Ideal | Specific, relevant clinical question formulated to guide the exam

Skill Domain

Acquisition

Acquiring and optimizing ultrasound images.

Metric

Quality and Completeness

Were the acquired ultrasound scans of enough completeness and quality to answer the clinical question?

Two options are available to capture this metric.

Option 1

Quality and Completeness Together

Uses a scoring method common in North America; however, it may not distinguish scenarios of high quality but low completeness, or vice versa.

Answer Set

ACEP 5 Point Scale

Score | Classifier | Description
1 | Poor | No recognizable structures
2 | Fair | Minimally recognizable structures but insufficient for diagnosis
3 | Adequate | Minimal criteria met for diagnosis, recognizable structures but with some technical or other flaws
4 | Good | Minimal criteria met for diagnosis, all structures imaged well
5 | Ideal | Minimal criteria met for diagnosis, all structures imaged with excellent image quality

Option 2

Quality and Completeness Separate

Enables independent measurement of quality and completeness.

Answer Set

Completeness 3 point scale

Score | Classifier | Description
1 | Poor | One or more required scan views missing
3 | Adequate | Required scan views are complete, one or more optional scan views missing
5 | Ideal | Required and optional scan views are complete

Answer Set

Quality 3 point scale

Score | Classifier | Description
1 | Poor | Scans do not allow for accurate interpretation
3 | Adequate | Scans allow for accurate interpretation, but scan quality could be improved
5 | Ideal | Scans allow for accurate interpretation with excellent diagnostic scan quality

Skill Domain

Interpretation

Distinguishing normal from abnormal findings.

Metric

Accuracy

Were relevant findings and pathology accurately interpreted?

Answer Set

3-point scale

Score | Classifier | Description
1 | Poor | Major inaccuracies
3 | Adequate | Minor inaccuracies only
5 | Ideal | No minor or major inaccuracies

Skill Domain (optional)

Clinical Integration

Integrating ultrasound findings into clinical management.

Metric (optional)

Appropriateness

Were ultrasound findings appropriately integrated into clinical decision-making?

Answer Set

3-point scale, binary scale

Score | Classifier | Description
1 | Poor | Fails to appropriately apply ultrasound findings to clinical decisions
3 | Adequate | Applies ultrasound findings appropriately but may overlook subtle details or additional relevant information
5 | Ideal | Appropriately incorporates all ultrasound findings into clinical decisions without errors or omissions

Implementation Toolkit

Capture and monitor ULTRA-metrics with your ultrasound program

Three tools to make implementation a breeze - compatible with almost any assessment workflow

Common Data Elements

Full list of metrics, answer sets, score criteria, metadata, skill trees, and which ones are required vs. optional.
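As an illustration, a common data element for one metric could be recorded as a small structured object. The field names below are assumptions made for this sketch, not the published CDE schema:

```python
from dataclasses import dataclass

@dataclass
class CommonDataElement:
    """Hypothetical shape of one ULTRA-metrics common data element."""
    competency_domain: str   # e.g. "Skills"
    metric: str              # e.g. "Accuracy"
    answer_set: str          # e.g. "3-point scale"
    allowed_scores: tuple    # e.g. (1, 3, 5)
    required: bool           # required vs. optional in the framework

# Example record for the Interpretation skill domain's Accuracy metric
interpretation_accuracy = CommonDataElement(
    competency_domain="Skills",
    metric="Accuracy",
    answer_set="3-point scale",
    allowed_scores=(1, 3, 5),
    required=True,
)
```

Defining each metric this way is what lets datasets from different programs be compared field-by-field.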

Software Platform

Capture, monitor, and analyze ULTRA-metrics with a web-based software platform.

Data Capture Guide

Real-time, virtual async, and retrospective data capture workflows with data privacy guidance.

The simplest way to get started

Presuna software has everything you need

Data Capture

Mobile and web-based software for simple integration with your assessment workflows.

Metric Monitoring

Cohort- and user-based analytics with longitudinal and snapshot views.

Data Compliance

Comply with GDPR, HIPAA, and other data privacy regulations.

Scan Archiving

Use alongside your scan archiving software, or use Presuna as your scan archive.

Data Registry

Datasets about ULTRA-metrics and ultrasound competency

Collected by ultrasound programs and researchers

Datasets

User Feedback

Quantitative survey responses from trainees, reviewers, and administrators on the usability and value of ULTRA-metrics in various contexts.

Competency Curves

Anonymized longitudinal cohort data mapping exam counts to ULTRA-metrics scores across various cohorts.

Innovation Benchmarks

Competency curves quantifying the impact of training innovations like virtual mentorship and artificial intelligence.

Seeking Dataset Contributors

The Data Registry aims to improve competency guidelines, share best practices across ultrasound programs, and accelerate training innovations.

We're actively seeking contributors to grow the Data Registry.
Once it reaches a usable initial size, we'll grant public and private access to it.

To learn more or contribute, click "Get Access" below.