According to a new study, a form of artificial intelligence can analyze heart ultrasound tests more quickly, and in some cases more accurately, than board-certified echocardiographers. Researchers at the University of California, San Francisco trained a computer to classify the most common echocardiogram views and tested it against skilled human technicians. Using 180,294 real-world echo images to train the machine learning system, they found that the computer correctly classified echo videos 91.7 to 97.8 percent of the time, compared with 70.2 to 83.5 percent for the human reviewers.
The results suggest this approach could help echocardiographers improve their efficiency, accuracy, and workflow, and could provide a foundation for more sophisticated analysis of echocardiographic data. An echo comprises numerous video clips, still images, and heart recordings captured from more than a dozen different angles, or views, several of which may differ only subtly. Interpreting medical images, including echocardiograms, typically requires extensive training.
Although deep learning has been used to detect abnormalities in pathology, radiology, dermatology, and other fields, it has not been widely applied to echocardiograms because of their complex multi-view, multi-modality format. Compared with earlier machine learning methods applied to echocardiography, deep learning offers a significant advantage in the adaptability of its training.
The researchers used images from patients aged 20 to 96 at UCSF Medical Center. Eighty percent of the images were used for training, while the rest were used for validation and testing. Each board-certified echocardiographer participating in the study was given randomly selected images. The researchers also found that file sizes could be reduced without losing accuracy, requiring less storage space and easing transmission. They accomplished this by removing color and standardizing the sizes and shapes of the videos and still images.
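The size-reduction step described above, removing color and standardizing image dimensions, can be sketched in a few lines of NumPy. The study does not describe its exact pipeline, so the function name, the 64-pixel target size, the luminance weights, and the nearest-neighbor resampling here are all illustrative assumptions rather than the researchers' actual method:

```python
import numpy as np

def preprocess_frame(frame: np.ndarray, size: int = 64) -> np.ndarray:
    """Collapse an RGB frame to grayscale and downsample it to size x size.

    Hypothetical sketch of the kind of preprocessing the study describes;
    target size and resampling strategy are assumptions for illustration.
    """
    # Standard luminance weights collapse the three color channels into one
    # intensity plane, cutting storage by roughly a factor of three.
    gray = frame[..., :3].astype(np.float64) @ np.array([0.299, 0.587, 0.114])
    h, w = gray.shape
    # Nearest-neighbor sampling standardizes every frame to one square shape.
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    return gray[np.ix_(rows, cols)].astype(np.uint8)

# A 120x160 RGB frame (57,600 bytes) becomes a 64x64 grayscale frame (4,096 bytes).
frame = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
small = preprocess_frame(frame)
```

Standardizing every frame to one grayscale shape also simplifies model input: a network trained on uniform arrays needs no per-image resizing logic at inference time.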