Artificial neural networks (ANNs) are constructed to simulate processes of the central nervous system of higher organisms. An ANN consists of a set of processing units (nodes) that simulate neurons and are interconnected via a set of "weights" (analogous to synaptic connections in the nervous system) in a way that allows signals to travel through the network in parallel. The nodes (neurons) are simple computing elements: they accumulate input from other neurons by means of a weighted sum, and if a certain threshold is reached, the neuron sends information to all connected neurons; otherwise it remains quiescent. One major difference from traditional statistical or rule-based systems is the learning aptitude of an ANN. At the very beginning of the training process an ANN contains no explicit information. A large number of cases with a known outcome are then presented to the system, and the weights of the inter-neuronal connections are changed by a training algorithm designed to minimise the total error of the system. A trained network has extracted rules that are represented by the matrix of weights between the neurons. This feature is called generalisation and allows the ANN to predict cases that have never been presented to the system before. Artificial neural networks have been shown to be useful for predicting various events; in particular, complex, non-linear, and time-dependent relationships can be modelled and forecast. Furthermore, an ANN can be used when the variables influencing an event are not precisely known, as is the case in financial or weather forecasting. This article aims to give a short overview of the function of ANNs and their previous use and possible future applications in anaesthesia, intensive care, and emergency medicine.
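The weighted-sum, threshold, and error-driven weight-update mechanism described in the abstract can be sketched in a few lines of Python. This is a minimal single-neuron illustration; the function names, threshold value, learning rate, and the AND-gate training cases are assumptions chosen for the example, not taken from the article.

```python
# Illustrative sketch (not from the article): a single neuron computes a
# weighted sum of its inputs and fires only if a threshold is reached.

def neuron_output(inputs, weights, threshold):
    """Weighted sum of inputs; fire (1) if the threshold is reached, else 0."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

def train_step(inputs, weights, target, threshold, learning_rate=0.1):
    """Perceptron-style update: nudge each weight to reduce the output error."""
    error = target - neuron_output(inputs, weights, threshold)
    return [w + learning_rate * error * x for x, w in zip(inputs, weights)]

# Present cases with a known outcome (here: logical AND) many times,
# adjusting the weights after each case, as the abstract describes.
cases = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights = [0.0, 0.0]
for _ in range(20):
    for inputs, target in cases:
        weights = train_step(inputs, weights, target, threshold=0.5)

print([neuron_output(x, weights, 0.5) for x, _ in cases])  # [0, 0, 0, 1]
```

After training, the learned behaviour resides entirely in the weight values, mirroring the abstract's point that a trained network's extracted rules are represented by its weight matrix.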

Source: http://dx.doi.org/10.1007/s00101-003-0576-x

Similar Publications

Speech Technology for Automatic Recognition and Assessment of Dysarthric Speech: An Overview.

J Speech Lang Hear Res

January 2025

Centre for Language Studies, Radboud University, Nijmegen, the Netherlands.

Purpose: In this review article, we present an extensive overview of recent developments in the area of dysarthric speech research. One of the key objectives of speech technology research is to improve the quality of life of its users, as evidenced by the focus of current research trends on creating inclusive conversational interfaces that cater to pathological speech, of which dysarthric speech is an important example. Applications of speech technology research for dysarthric speech demand a clear understanding of the acoustics of dysarthric speech as well as of speech technologies, including machine learning and deep neural networks for speech processing.

Intracranial atherosclerotic stenosis (ICAS) and intracranial aneurysms are prevalent conditions in the cerebrovascular system. ICAS causes a narrowing of the arterial lumen, thereby restricting blood flow, while aneurysms involve the ballooning of blood vessels. Both conditions can lead to severe outcomes, such as stroke or vessel rupture, which can be fatal.

Background: Hemorrhagic transformation (HT) is a complication of reperfusion therapy following acute ischemic stroke (AIS). We aimed to develop and validate a model for predicting HT and its subtypes with poor prognosis-parenchymal hemorrhage (PH), including PH-1 (hematoma within infarcted tissue, occupying < 30%) and PH-2 (hematoma occupying ≥ 30% of the infarcted tissue)-in AIS patients following intravenous thrombolysis (IVT) based on noncontrast computed tomography (NCCT) and clinical data.

Methods: In this six-center retrospective study, clinical and imaging data from 445 consecutive IVT-treated AIS patients were collected (01/2018-06/2023).

Unlabelled: This study utilized deep learning for bone mineral density (BMD) prediction and classification using biplanar X-ray radiography (BPX) images from Huashan Hospital Medical Checkup Center. Results showed high accuracy and strong correlation with quantitative computed tomography (QCT) results. The proposed models offer potential for screening patients at a high risk of osteoporosis and reducing unnecessary radiation and costs.

A Serial MRI-based Deep Learning Model to Predict Survival in Patients with Locoregionally Advanced Nasopharyngeal Carcinoma.

Radiol Artif Intell

January 2025

From the Department of Radiation Oncology, State Key Laboratory of Oncology in South China, Guangdong Key Laboratory of Nasopharyngeal Carcinoma Diagnosis and Therapy, Guangdong Provincial Clinical Research Center for Cancer, Sun Yat-sen University Cancer Center, 651 Dongfeng Road East, Guangzhou 510060, P. R. China (J.K., C.F.W., Z.H.C., G.Q.Z., Y.Q.W., L.L., Y.S.); Department of Radiation Therapy, Nanhai People's Hospital, The Sixth Affiliated Hospital, South China University of Technology, Foshan, China (J.Y.P., L.J.L.); and Department of Electronic Engineering, Information School, Yunnan University, Kunming, China (W.B.L.).

Purpose To develop and evaluate a deep learning-based prognostic model for predicting survival in locoregionally advanced nasopharyngeal carcinoma (LA-NPC) using serial MRI before and after induction chemotherapy (IC). Materials and Methods This multicenter retrospective study included 1039 LA-NPC patients (779 male, 260 female, mean age 44 [standard deviation: 11]) diagnosed between April 2009 and December 2015. A radiomics-clinical prognostic model (Model RC) was developed using pre- and post-IC MRI and other clinical factors using graph convolutional neural networks (GCN).
