Docket #: S20-050
Assessing Cardiac Function from Ultrasound Videos Using Deep Learning Algorithms
Stanford researchers have developed a deep learning algorithm that predicts cardiac function (ejection fraction) and traces the endocardium of the left ventricle from echocardiogram (cardiac ultrasound) videos. Current methods rely on human interpretation of these videos, which introduces variability in assessment. The algorithm assesses cardiac function faster, more accurately, and more reliably than human readers.
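For orientation only, below is a minimal sketch of how ejection fraction can be regressed from an echocardiogram clip with a spatiotemporal video CNN, broadly in the spirit of the published work. The backbone choice (torchvision's r2plus1d_18), clip dimensions, and loss are illustrative assumptions, not the licensed implementation.

```python
# Minimal sketch: regressing ejection fraction (EF) from an echocardiogram clip
# with a spatiotemporal CNN. All sizes and training details are assumptions.
import torch
import torch.nn as nn
from torchvision.models.video import r2plus1d_18

class EFRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        # Video backbone; final classification layer replaced with a single
        # regression output for ejection fraction (in percent).
        self.backbone = r2plus1d_18(weights=None)
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 1)

    def forward(self, clips):
        # clips: (batch, channels=3, frames, height, width)
        return self.backbone(clips).squeeze(-1)

model = EFRegressor()
dummy_clips = torch.randn(2, 3, 32, 112, 112)   # two hypothetical 32-frame clips
predicted_ef = model(dummy_clips)               # shape: (2,), EF in percent
loss = nn.functional.mse_loss(predicted_ef, torch.tensor([55.0, 35.0]))
```

Averaging such per-clip predictions across multiple heart beats in a video is one way to obtain the beat-to-beat assessment described above.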
Stage of Research
Applications
- Algorithmic assessment of cardiac function (ejection fraction) across multiple heart beats from ultrasound videos
Advantages
- Automated and efficient
- Performance (see the illustrative metric sketch below):
  - Segments the left ventricle with a Dice Similarity Coefficient of 0.92
  - Predicts ejection fraction with a mean absolute error of 4.1%
  - Reliably classifies heart failure with reduced ejection fraction with an AUC of 0.97
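The sketch below shows how the three reported metrics (Dice Similarity Coefficient, mean absolute error, and AUC) are typically computed. The arrays, shapes, and the EF < 50% cutoff for reduced ejection fraction are assumptions for illustration.

```python
# Minimal sketch of the three reported evaluation metrics on hypothetical data.
import numpy as np
from sklearn.metrics import mean_absolute_error, roc_auc_score

def dice_coefficient(pred_mask, true_mask):
    """Dice Similarity Coefficient between two binary segmentation masks."""
    intersection = np.logical_and(pred_mask, true_mask).sum()
    return 2.0 * intersection / (pred_mask.sum() + true_mask.sum())

# Left-ventricle segmentation overlap (Dice)
pred_mask = np.random.rand(112, 112) > 0.5
true_mask = np.random.rand(112, 112) > 0.5
dice = dice_coefficient(pred_mask, true_mask)

# Ejection fraction error (mean absolute error, in EF percentage points)
true_ef = np.array([60.0, 45.0, 30.0])
pred_ef = np.array([58.0, 47.5, 33.0])
mae = mean_absolute_error(true_ef, pred_ef)

# Heart failure with reduced ejection fraction (classification AUC), using an
# assumed EF < 50% cutoff; lower predicted EF serves as a higher risk score.
labels = (true_ef < 50).astype(int)
auc = roc_auc_score(labels, -pred_ef)
```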
Publications
- D. Ouyang et al., "Video-based AI for beat-to-beat assessment of cardiac function," Nature, March 25, 2020.
Patents
- Published Application: US 2021/0304410
- Issued Patent: US 11,704,803
Similar Technologies
- Multimodal DAC Microendoscope Platforms (S10-278)
- Focused Ultrasound to enhance function and engraftment of pancreatic islets following transplantation (S19-089)
- 3D printed smartphone lens adapters for mobile anterior and posterior segment ophthalmoscopy (S13-195)