Coherence analysis characterizes frequency-dependent covariance between signals and is useful for the multivariate oscillatory data often encountered in neuroscience. Global coherence summarizes coherent behavior in high-dimensional multivariate data by quantifying the concentration of variance in the first mode of an eigenvalue decomposition of the cross-spectral matrix. In practice, the method is sensitive to noise and can conflate coherent activity in disparate neural populations or spatial locations that share a similar frequency structure. In this paper we describe two methodological enhancements to the global coherence procedure that increase its robustness to noise and allow characterization of how power within specific coherent modes changes through time.
Full text: PMC (http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3293406) | DOI (http://dx.doi.org/10.1109/IEMBS.2011.6091170)
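The abstract above defines global coherence as the fraction of cross-spectral power captured by the leading eigenmode of the cross-spectral matrix at each frequency. Below is a minimal sketch of that baseline computation (not the paper's enhanced, noise-robust procedure), assuming multichannel data in a NumPy array and Welch-style spectral estimates; the function and variable names are illustrative only.

```python
# Minimal global-coherence sketch, assuming `x` holds multichannel data with
# shape (n_channels, n_samples). Segment length and windowing are illustrative
# choices, not the authors' exact estimation procedure.
import numpy as np
from scipy.signal import csd

def global_coherence(x, fs, nperseg=1024):
    """Return (freqs, c_glob), where c_glob[k] is the fraction of
    cross-spectral power in the leading eigenmode at freqs[k]."""
    n_ch = x.shape[0]
    # Frequency grid from a single Welch-style estimate.
    freqs, _ = csd(x[0], x[0], fs=fs, nperseg=nperseg)
    # Build the cross-spectral matrix S(f) pair by pair.
    S = np.zeros((len(freqs), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, S[:, i, j] = csd(x[i], x[j], fs=fs, nperseg=nperseg)
    # Global coherence: largest eigenvalue of S(f) divided by the trace,
    # i.e. the variance concentrated in the first eigenmode.
    eigvals = np.linalg.eigvalsh(S)        # real, ascending (S is Hermitian)
    c_glob = eigvals[:, -1] / eigvals.sum(axis=1)
    return freqs, c_glob
```

A value near 1 at a given frequency means most of the cross-spectral power is concentrated in a single coherent mode, while a value near 1/n_channels indicates no dominant mode.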
Entropy (Basel)
January 2025
School of Integrated Circuits and Electronics, Beijing Institute of Technology, Beijing 100081, China.
Optical Coherence Tomography (OCT) is a crucial imaging modality for diagnosing and monitoring retinal diseases. However, the accurate segmentation of fluid regions and lesions remains challenging due to noise, low contrast, and blurred edges in OCT images. Although feature modeling with wide or global receptive fields offers a feasible solution, it typically leads to significant computational overhead.
Transl Vis Sci Technol
January 2025
Department of Biomedical Engineering, Faculty of Engineering, Mahidol University, Nakhon Pathom, Thailand.
Purpose: To develop a deep learning approach that restores artifact-laden optical coherence tomography (OCT) scans and predicts functional loss on the 24-2 Humphrey Visual Field (HVF) test.
Methods: This cross-sectional, retrospective study used 1674 visual field (VF)-OCT pairs from 951 eyes for training and 429 pairs from 345 eyes for testing. Peripapillary retinal nerve fiber layer (RNFL) thickness map artifacts were corrected using a generative diffusion model.
Commun Eng
January 2025
Department of Electrical and Computer Engineering, University of North Carolina at Charlotte, Charlotte, NC, USA.
Vision impairment affects nearly 2.2 billion people globally, and nearly half of these cases could be prevented with early diagnosis and intervention, underscoring the urgent need for reliable and scalable detection methods for conditions like diabetic retinopathy and age-related macular degeneration. Here we propose a distributed deep learning framework that integrates self-supervised and domain-adaptive federated learning to enhance the detection of eye diseases from optical coherence tomography images.
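This summary names a federated framework without detailing its mechanics. As a rough, generic illustration of the federated-averaging round that such distributed training typically builds on (not the authors' self-supervised or domain-adaptive pipeline), here is a minimal PyTorch sketch with hypothetical client data loaders.

```python
# Minimal federated-averaging (FedAvg) sketch, assuming each client holds its
# own OCT DataLoader and a copy of the same classification model. This shows
# only the generic aggregation step, not the components described in the paper.
import copy
import torch

def federated_round(global_model, clients, local_epochs=1, lr=1e-3):
    """One communication round: local training on each client, then a
    dataset-size-weighted average of the resulting parameters on the server."""
    states, weights = [], []
    for loader in clients:                      # each `loader` is one client's DataLoader
        model = copy.deepcopy(global_model)
        opt = torch.optim.SGD(model.parameters(), lr=lr)
        model.train()
        for _ in range(local_epochs):
            for images, labels in loader:
                opt.zero_grad()
                loss = torch.nn.functional.cross_entropy(model(images), labels)
                loss.backward()
                opt.step()
        states.append(model.state_dict())
        weights.append(len(loader.dataset))
    # Server-side weighted average of client parameters (buffers are averaged
    # too, a simplification).
    total = sum(weights)
    avg = {k: sum(w * s[k].float() for w, s in zip(weights, states)) / total
           for k in states[0]}
    global_model.load_state_dict(avg)
    return global_model
```

Each client trains only on its local data and shares parameters rather than images, which is what lets such a framework scale across institutions without pooling raw scans.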
Quant Imaging Med Surg
January 2025
Department of Ophthalmology, the Fourth Affiliated Hospital of China Medical University, Shenyang, China.
Background: Deep learning has recently become a popular research area and has revolutionized the diagnosis and prediction of ocular diseases, especially fundus diseases. This study aimed to conduct a bibliometric analysis of deep learning in the field of ophthalmology to describe international research trends and examine the current research directions.
Methods: This cross-sectional bibliometric analysis examined the development of research on deep learning in the field of ophthalmology and its sub-topics from 2015 to 2024.
BMC Res Notes
January 2025
Department of Health Information Technology and Management, School of Allied Medical Sciences, Shahid Beheshti University of Medical Sciences, Tehran, Iran.
Objective: Glaucoma is a major cause of irreversible blindness globally. Optical coherence tomography (OCT) aids early glaucoma diagnosis. Interpreting OCT scans requires familiarity with the technology and image analysis.