To enable visual analysis of cardiac fluid motion, this paper proposes a method for cardiac Vector Flow Mapping (VFM) analysis and evaluation based on the You-Only-Look-Once (YOLO) deep learning model and an improved two-dimensional continuity equation, tailored to the characteristics of cardiac flow-field ultrasound images. First, the radial velocity values of the blood particles are obtained from the ultrasound Doppler data. Because real-time VFM places high demands on computing speed, the YOLO model is combined with an improved block-matching algorithm to localize and track the myocardial wall, yielding the azimuthal velocity of the myocardial wall speckles. A nonlinear weight function is then proposed to fuse the radial velocity of the blood particles with the azimuthal velocity of the myocardial wall speckles, from which the vortex streamline diagram of the cardiac flow field is obtained. Experiments evaluated on the apical long-axis ultrasound view show that the proposed method not only improves the accuracy of VFM but also provides a new basis for evaluating cardiac function impairment.
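To make the fusion step concrete, below is a minimal Python sketch of reconstructing the cross-beam velocity from the 2D continuity equation and blending the two wall-anchored integrations with a nonlinear weight. The Cartesian grid (as a stand-in for the ultrasound beam geometry), the power-law weight, and all function and argument names are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch, assuming a Cartesian grid stand-in for the beam geometry
# and a simple power-law weight as a stand-in for the paper's nonlinear
# weight function. Names are illustrative, not taken from the paper.
import numpy as np

def estimate_cross_beam_velocity(v_beam, v_wall_top, v_wall_bottom, dy, dx, p=2.0):
    """Reconstruct the cross-beam velocity field from the 2D continuity
    equation du/dx + dv/dy = 0, integrating dv/dy = -du/dx from both wall
    boundaries and blending the two estimates with a nonlinear weight.

    v_beam        : (ny, nx) along-beam (Doppler) velocity u
    v_wall_top    : (nx,) cross-beam wall-speckle velocity at the top wall (row 0)
    v_wall_bottom : (nx,) cross-beam wall-speckle velocity at the bottom wall (row ny-1)
    dy, dx        : grid spacing
    p             : exponent of the power-law weight (p = 1 would be a linear blend)
    """
    ny, nx = v_beam.shape
    dudx = np.gradient(v_beam, dx, axis=1)   # du/dx on the grid
    dvdy = -dudx                              # continuity: dv/dy = -du/dx

    # Integrate dv/dy downward from the top wall and upward from the bottom wall
    # (first-order cumulative sums; boundary handling kept simple for clarity).
    v_from_top = v_wall_top + np.cumsum(dvdy, axis=0) * dy
    v_from_bottom = v_wall_bottom - np.cumsum(dvdy[::-1], axis=0)[::-1] * dy

    # Nonlinear blend: the weight grows with normalized distance from the top
    # wall, so each estimate dominates near the wall it was integrated from.
    s = np.linspace(0.0, 1.0, ny)[:, None]    # 0 at top wall, 1 at bottom wall
    w = s ** p
    return (1.0 - w) * v_from_top + w * v_from_bottom
```

A call such as estimate_cross_beam_velocity(doppler_u, wall_top_v, wall_bottom_v, dy=5e-4, dx=5e-4) would then follow the YOLO-plus-block-matching wall tracking step that supplies the wall-speckle velocities; the vortex streamlines are drawn from the resulting two-component velocity field.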
DOI: http://dx.doi.org/10.1016/j.compmedimag.2020.101732
Sci Rep
January 2025
Alliance of Bioversity International and International Center for Tropical Agriculture (CIAT), Km 17 Recta Cali-Palmira, Cali, Colombia.
Bananas (Musa spp.) are a critical global food crop, providing a primary source of nutrition for millions of people. Traditional methods for disease monitoring and detection are often time-consuming, labor-intensive, and prone to inaccuracies.
Sci Rep
January 2025
School of Air Traffic Management, Civil Aviation Flight University of China, Guanghan, 618300, China.
To address the challenges of high computational complexity and poor real-time performance in binocular vision-based Unmanned Aerial Vehicle (UAV) formation flight, this paper introduces a UAV localization algorithm based on a lightweight object detection model. Firstly, we optimized the YOLOv5s model using lightweight design principles, resulting in Yolo-SGN. This model achieves a 65.
Comput Biol Med
January 2025
Department of Mathematics and Computer Science, University of Cagliari, Via Ospedale 72, 09124, Cagliari, Italy.
Background: Malaria is a critical and potentially fatal disease caused by the Plasmodium parasite and is responsible for more than 600,000 deaths globally. Early and accurate detection of malaria parasites is crucial for effective treatment, yet conventional microscopy faces limitations in variability and efficiency.
Methods: We propose a novel computer-aided detection framework based on deep learning and attention mechanisms, extending the YOLO-SPAM and YOLO-PAM models.
Sensors (Basel)
January 2025
Engineering Training Center, Nantong University, Nantong 226019, China.
The issue of obstacle avoidance and safety for visually impaired individuals has been a major topic of research. However, complex street environments still pose significant challenges for blind obstacle detection systems. Existing solutions often fail to provide real-time, accurate obstacle avoidance decisions.
Sensors (Basel)
January 2025
Dalian Naval Academy Cadet Brigade, Dalian 116000, China.
Mesoscale eddies are pivotal oceanographic phenomena affecting marine environments. Accurate and stable identification of these eddies is essential for advancing research on their dynamics and effects. Current methods primarily focus on identifying Cyclonic and Anticyclonic eddies (CE, AE), with anomalous eddy identification often requiring secondary analyses of sea surface height anomalies and eddy center properties, leading to segmented data interpretations.