Docket #: S20-050
Assessing Cardiac Function from Ultrasound Videos Using Deep Learning Algorithms
Stanford researchers have developed a deep learning algorithm that predicts cardiac function (ejection fraction) and traces the endocardium of the left ventricle from ultrasound echocardiogram videos. Current methods rely on human interpretation of ultrasound videos, leading to variability in assessment. This algorithm assesses cardiac function faster, more accurately, and more reliably than humans.
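The approach operates on full echocardiogram video clips rather than single still frames. As a rough illustration of this kind of pipeline, the sketch below regresses ejection fraction from a video clip with a spatiotemporal (3D) convolutional network; the architecture (torchvision's r2plus1d_18), clip length, and resolution are illustrative assumptions, not necessarily the implementation covered by this docket.

```python
# Illustrative sketch: video-based ejection fraction regression with a
# spatiotemporal CNN. Architecture and input sizes are assumptions for
# illustration only, not necessarily the model covered by this docket.
import torch
import torch.nn as nn
import torchvision

def build_ef_regressor() -> nn.Module:
    # 3D video network; swap the classification head for a single
    # regression output (ejection fraction, in percent).
    model = torchvision.models.video.r2plus1d_18(weights=None)
    model.fc = nn.Linear(model.fc.in_features, 1)
    return model

model = build_ef_regressor()
# A batch of echocardiogram clips: (batch, channels, frames, height, width).
clips = torch.randn(2, 3, 32, 112, 112)
predicted_ef = model(clips).squeeze(1)  # two ejection-fraction estimates
```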
Stage of Research
Applications
- Algorithmic assessment of cardiac function (ejection fraction) across multiple heart beats from ultrasound videos
Advantages
- Automated and efficient
- Performance (standard metric definitions are sketched below):
  - Segments the left ventricle with a Dice similarity coefficient of 0.92
  - Predicts ejection fraction with a mean absolute error of 4.1%
  - Reliably classifies heart failure with reduced ejection fraction with an AUC of 0.97
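For reference, the reported metrics follow their standard definitions. The minimal sketch below computes a Dice similarity coefficient on binary left-ventricle masks and a mean absolute error on ejection-fraction estimates; it is a generic illustration, not code from the cited study.

```python
import numpy as np

def dice_coefficient(pred_mask, true_mask, eps=1e-7):
    # Dice = 2 * |A intersect B| / (|A| + |B|) for binary segmentation masks.
    pred = np.asarray(pred_mask, dtype=bool)
    true = np.asarray(true_mask, dtype=bool)
    intersection = np.logical_and(pred, true).sum()
    return (2.0 * intersection + eps) / (pred.sum() + true.sum() + eps)

def mean_absolute_error(predicted_ef, true_ef):
    # MAE in ejection-fraction percentage points (4.1 means 4.1%).
    predicted_ef = np.asarray(predicted_ef, dtype=float)
    true_ef = np.asarray(true_ef, dtype=float)
    return float(np.mean(np.abs(predicted_ef - true_ef)))
```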
Publications
- D. Ouyang et al., "Video-based AI for beat-to-beat assessment of cardiac function," Nature, March 25, 2020.
Related Links
Patents
- Published Application: 20210304410
- Issued: 11,704,803 (USA)
Similar Technologies
- TrueImage: Better Images for Telemedicine (S21-262)
- State-of-the-Art Graph Diffusion Transformer for Natural Language Processing (S20-271)
- Using Supervised and Unsupervised Learning to Infer Diagnostic Codes on Veterinary Clinical Text (S19-041)