Regional cerebral oxygen saturation (rSO2), a measure of cerebral tissue oxygenation, is recorded using non-invasive near-infrared spectroscopy (NIRS) devices. A major limitation is that recorded signals often contain artifacts, and removing them manually is both resource- and time-consuming. The objective was to evaluate wavelet analysis as an automated method for clearing simple signal-loss artifacts from rSO2 signals obtained from commercially available devices. A retrospective observational study was conducted using existing populations: healthy controls (HC), elective spinal surgery patients (SP), and traumatic brain injury (TBI) patients. Arterial blood pressure (ABP) and rSO2 data were collected in all patients. Wavelet analysis, using wavelet coefficients and coherence to detect signal-loss artifacts in rSO2 signals, successfully removed simple signal-loss artifacts. Removal success rates in the HC, SP, and TBI populations were 100%, 99.8%, and 99.7%, respectively, although the method had limited precision in localizing the exact point in time. Thus, wavelet analysis may prove useful in a layered NIRS signal-artifact tool utilizing higher-frequency data; however, future work is needed.
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11154537
DOI: http://dx.doi.org/10.3390/bioengineering11010033
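The detection idea the abstract describes, using wavelet coefficients to locate simple signal-loss artifacts, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Haar wavelet, the tolerance, the minimum run length, and the sample rSO2 trace are all assumptions. The intuition is that a dropout (the signal flat-lining at a sensor floor value) produces a sustained run of near-zero detail coefficients.

```python
# Hedged sketch: flag simple signal-loss artifacts in an rSO2 trace
# using level-1 Haar wavelet detail coefficients. A dropout (flat-line)
# yields near-zero detail coefficients over a sustained run.

def haar_details(signal):
    """Level-1 Haar detail coefficients: (x[2i] - x[2i+1]) / sqrt(2)."""
    return [(signal[i] - signal[i + 1]) / 2 ** 0.5
            for i in range(0, len(signal) - 1, 2)]

def flag_dropouts(signal, tol=1e-6, min_run=3):
    """Return (start, end) index pairs of suspected signal-loss runs,
    expressed in units of the original samples (end exclusive)."""
    d = haar_details(signal)
    runs, start = [], None
    for i, c in enumerate(d):
        if abs(c) < tol:
            start = i if start is None else start
        elif start is not None:
            if i - start >= min_run:
                runs.append((2 * start, 2 * i))
            start = None
    if start is not None and len(d) - start >= min_run:
        runs.append((2 * start, 2 * len(d)))
    return runs

# Example: a plausible rSO2 trace (%) with a mid-record dropout to 0.
trace = [68.0, 67.5, 68.2, 69.0] + [0.0] * 10 + [67.8, 68.4]
print(flag_dropouts(trace))  # → [(4, 14)]
```

Note the sketch reports boundaries only at even-sample resolution, which loosely mirrors the abstract's caveat that the method had limited precision in localizing the exact point in time.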
Sci Rep
December 2024
Shandong Agricultural University, Taian, 271018, China.
Acoustic emission information can describe the degree of damage in rock samples during failure. However, as a discrete non-stationary signal, acoustic emission data are difficult to process effectively with conventional methods, whereas wavelet analysis is well suited to non-stationary signals. Therefore, the acoustic emission signal is studied in depth using wavelet analysis.
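A common way wavelet analysis is applied to non-stationary signals like acoustic emission is to decompose the record into levels and compare the energy carried at each level. The sketch below is a generic illustration of that idea, not this paper's method: the Haar wavelet, the level count, and the toy "burst" record are assumptions.

```python
def haar_step(x):
    # One level of the orthonormal Haar transform: pairwise sums
    # (approximation) and differences (detail), scaled by 1/sqrt(2).
    a = [(x[i] + x[i + 1]) / 2 ** 0.5 for i in range(0, len(x) - 1, 2)]
    d = [(x[i] - x[i + 1]) / 2 ** 0.5 for i in range(0, len(x) - 1, 2)]
    return a, d

def wavelet_energy(x, levels=3):
    """Energy in the detail coefficients at each level, plus the
    final approximation coefficients."""
    energies, a = [], list(x)
    for _ in range(levels):
        a, d = haar_step(a)
        energies.append(sum(c * c for c in d))
    return energies, a

# Toy "burst" record: a short high-frequency transient on a quiet baseline.
burst = [0.0] * 6 + [4.0, -4.0] + [0.0] * 8
es, approx = wavelet_energy(burst, levels=3)
print(es)  # most energy sits in the finest-level details
```

Because the Haar transform is orthonormal, the level energies plus the residual approximation energy sum to the signal energy (Parseval), which makes the per-level distribution a meaningful descriptor of where in frequency a transient lives.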
Sci Rep
December 2024
Henan College of Transportation, Zhengzhou, 450000, Henan, China.
Novel Human Activity Recognition (HAR) methodologies, built on learning algorithms and ubiquitous sensors, have achieved remarkable precision in identifying sports activities. Such progress benefits all age groups, and in the future AI may be used to address difficult problems in scientific research. This article introduces a novel approach that uses motion-sensor data to categorize and distinguish various types of sports activities.
Iran Biomed J
December 2024
Student Research Committee, Department of Nursing, Kashan Branch, Islamic Azad University, Kashan, Iran.
J Otol
July 2024
College of Otolaryngology Head and Neck Surgery, Chinese PLA General Hospital, Chinese PLA Medical School, 28 Fuxing Road, Beijing, China.
Objective: This study aimed to develop and evaluate a novel software tool for robust analysis of the Visually Enhanced Vestibular-Ocular Reflex (VVOR) and video head impulse test (vHIT) saccades.
Methods: A retrospective study was conducted on 94 patients with Meniere's Disease (MD), unilateral vestibular hypofunction (UVH), and vestibular migraine (VM). The MATLAB-based VVOR Analysis System and Saccades All in One software were utilized for data processing.
J Environ Manage
December 2024
School of Design, Shanghai Jiao Tong University, Shanghai, 200240, China. Electronic address:
This study examines the multi-scale temporal and spatial variations of soil heat flux (G) within riparian zones and its correlation with net radiation (Rn) across six riparian woodlands in Shanghai, each characterized by distinct vegetation types. The objective is to assess the complex interrelations between G and Rn, and how these relationships are influenced by varying vegetation and seasons. Over the course of a year, data on G and Rn were collected to investigate their dynamics.