This study presents a data-driven battery emulator that uses long short-term memory (LSTM) deep learning models to predict the charge-discharge behaviour of lithium-ion batteries (LIBs). The aim was to reduce the economic costs and time associated with fabricating large-scale automotive prototype batteries by emulating their performance with smaller laboratory-produced batteries. Two types of datasets were targeted: simulation data from the Dualfoil model and experimental data from liquid-based LIBs. These datasets were used to accurately predict voltage profiles from arbitrary inputs of various galvanostatic charge-discharge schedules. The results demonstrated high prediction accuracy, with coefficient of determination scores reaching 0.98 and 0.97 for test datasets obtained from the simulation and experiments, respectively. The study also confirmed the significance of state-of-charge descriptors and inferred that robust model performance could be achieved with as few as five charge-discharge training datasets. It concludes that data-driven emulation using machine learning can significantly accelerate the battery development process, providing a powerful tool for reducing the time and economic costs associated with producing large-scale prototype batteries.
| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11582594 | PMC |
| http://dx.doi.org/10.1038/s41598-024-80371-9 | DOI Listing |
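As a rough illustration of the kind of model the abstract describes, the sketch below wires up a small LSTM that maps a galvanostatic current schedule, together with a state-of-charge descriptor, to a voltage profile. This is not the authors' implementation; the feature set, layer sizes, and sequence length are assumptions made for the example.

```python
# Minimal sketch (not the authors' code): an LSTM that maps a galvanostatic
# current schedule plus a state-of-charge descriptor to a voltage profile.
# All layer sizes and per-step features here are illustrative assumptions.
import torch
import torch.nn as nn

class VoltageEmulator(nn.Module):
    def __init__(self, n_features: int = 2, hidden: int = 64, layers: int = 2):
        super().__init__()
        # Inputs per time step: applied current and estimated state of charge.
        self.lstm = nn.LSTM(n_features, hidden, num_layers=layers, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # one voltage value per time step

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.lstm(x)               # (batch, time, hidden)
        return self.head(out).squeeze(-1)   # (batch, time) predicted voltage

# Toy usage: a batch of 4 schedules, 500 time steps, 2 features per step.
model = VoltageEmulator()
schedule = torch.randn(4, 500, 2)           # [current, state-of-charge] per step
voltage = model(schedule)                   # predicted voltage profile per schedule
print(voltage.shape)                        # torch.Size([4, 500])
```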
Comput Methods Programs Biomed
January 2025
Shanghai Maritime University, Shanghai 201306, China.
Background And Objective: Inferring large-scale brain networks from functional magnetic resonance imaging (fMRI) provides more detailed and richer connectivity information, which is critical for gaining insight into brain structure and function and for predicting clinical phenotypes. However, as the number of network nodes increases, most existing methods suffer from the following limitations: (1) Traditional shallow models often struggle to estimate large-scale brain networks. (2) Existing deep graph structure learning models rely on downstream tasks and labels.
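The abstract does not detail the authors' model, so the sketch below only illustrates the kind of shallow baseline it contrasts against: a Pearson-correlation functional connectivity network estimated from fMRI region time series. The node count, time-series length, and threshold are assumptions.

```python
# Illustrative baseline only (not the paper's method): the shallow,
# correlation-based brain-network estimate that the abstract says struggles
# at large scale. Shapes and the edge threshold are assumptions.
import numpy as np

def functional_connectivity(ts: np.ndarray, threshold: float = 0.3) -> np.ndarray:
    """ts: (time_points, n_nodes) ROI time series; returns a binary adjacency matrix."""
    corr = np.corrcoef(ts, rowvar=False)        # (n_nodes, n_nodes) Pearson matrix
    np.fill_diagonal(corr, 0.0)                 # drop self-connections
    return (np.abs(corr) > threshold).astype(int)

# Toy example: 200 fMRI volumes over 400 nodes (a "large-scale" network).
rng = np.random.default_rng(0)
ts = rng.standard_normal((200, 400))
adjacency = functional_connectivity(ts)
print(adjacency.shape, adjacency.sum() // 2, "edges")
```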
Brief Bioinform
November 2024
Hubei Provincial Key Laboratory of Artificial Intelligence and Smart Learning, Central China Normal University, Wuhan 430079, China.
Identifying phage-host interactions (PHIs) is a crucial step in developing phage therapy, a promising solution to antibiotic resistance in superbugs. However, phages depend strongly on their hosts for their life activities, which limits their cultivability and makes predicting PHIs through traditional wet-lab experiments time-consuming and labor-intensive. Although many deep learning (DL) approaches have been applied to PHI prediction, most are based predominantly on sequence information and fail to comprehensively model the intricate relationships within PHIs.
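To make the "sequence-based" framing concrete, the hypothetical sketch below builds k-mer frequency features for a phage and a candidate host and feeds them to a simple classifier. It is not the authors' model; the k-mer size, classifier choice, and toy sequences are assumptions.

```python
# Hypothetical sketch of a sequence-only PHI predictor (not the paper's model):
# k-mer frequency features for a phage and a candidate host, concatenated and
# passed to a plain classifier. k and the classifier are illustrative choices.
from collections import Counter
from itertools import product
import numpy as np
from sklearn.linear_model import LogisticRegression

K = 3
KMERS = ["".join(p) for p in product("ACGT", repeat=K)]

def kmer_features(seq: str) -> np.ndarray:
    counts = Counter(seq[i:i + K] for i in range(len(seq) - K + 1))
    vec = np.array([counts.get(k, 0) for k in KMERS], dtype=float)
    return vec / max(vec.sum(), 1.0)            # normalise to frequencies

def pair_features(phage_seq: str, host_seq: str) -> np.ndarray:
    return np.concatenate([kmer_features(phage_seq), kmer_features(host_seq)])

# Toy usage with made-up sequences and labels (1 = interacting, 0 = not).
X = np.stack([pair_features("ACGTACGTGG", "TTGACGTACG"),
              pair_features("GGGGCCCCAA", "ACACACACGT")])
y = np.array([1, 0])
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict(X))
```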
Adv Sci (Weinh)
January 2025
DP Technology, Beijing, 100080, China.
Powder X-ray diffraction (PXRD) is a prevalent technique in materials characterization. However, PXRD analysis often requires extensive manual intervention, and most automated methods operate only at a coarse-grained level. The more difficult and important task of fine-grained crystal structure prediction from PXRD remains unaddressed.
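The sketch below is not the paper's method; it only shows the elementary relation that any PXRD analysis starts from, converting observed peak positions to lattice d-spacings with Bragg's law. The wavelength and peak list are illustrative.

```python
# Not the paper's method: just Bragg's law, the starting point of PXRD analysis.
# It converts observed peak positions (2-theta, degrees) into d-spacings.
import math

CU_K_ALPHA = 1.5406  # X-ray wavelength in angstroms (Cu K-alpha, a common source)

def d_spacing(two_theta_deg: float, wavelength: float = CU_K_ALPHA) -> float:
    """Bragg's law: n * lambda = 2 * d * sin(theta), with n = 1."""
    theta = math.radians(two_theta_deg / 2.0)
    return wavelength / (2.0 * math.sin(theta))

peaks_2theta = [28.4, 47.3, 56.1]  # toy peak positions in degrees
for p in peaks_2theta:
    print(f"2theta = {p:5.1f} deg  ->  d = {d_spacing(p):.3f} angstrom")
```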
Adv Sci (Weinh)
January 2025
School of Pharmacy, Sungkyunkwan University, Suwon, 16419, Republic of Korea.
β-secretase (BACE1) is instrumental in amyloid-β (Aβ) production, with overexpression noted in Alzheimer's disease (AD) neuropathology. The interaction of Aβ with the receptor for advanced glycation endproducts (RAGE) facilitates cerebral uptake of Aβ and exacerbates its neurotoxicity and neuroinflammation, further augmenting BACE1 expression. Given the limitations of previous BACE1 inhibition efforts, the study explores reducing BACE1 expression to mitigate AD pathology.
Sci Rep
January 2025
North Carolina School of Science and Mathematics, Durham, NC, 27705, USA.
Mobile Ad Hoc Networks (MANETs) are increasingly replacing conventional communication systems due to their decentralized and dynamic nature. However, their wireless architecture makes them highly vulnerable to flooding attacks, which can disrupt communication, deplete energy resources, and degrade network performance. This study presents a novel hybrid deep learning approach integrating Convolutional Neural Networks (CNN) with Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) architectures to effectively detect and mitigate flooding attacks in MANETs.
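As a hedged sketch of what such a CNN-LSTM-GRU hybrid might look like (the abstract names the building blocks but not the exact layout), the example below chains the three components over windows of traffic features and emits an attack score. The feature count, layer sizes, and window length are assumptions.

```python
# Minimal sketch (layout assumed, not taken from the paper): a hybrid model
# chaining a 1-D CNN, an LSTM, and a GRU over windows of traffic features to
# flag flooding attacks. Feature count, layer sizes, and window length are
# illustrative assumptions.
import torch
import torch.nn as nn

class FloodingDetector(nn.Module):
    def __init__(self, n_features: int = 8, hidden: int = 32):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(n_features, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # attack score for the whole window

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, features); Conv1d expects (batch, channels, time)
        z = self.cnn(x.transpose(1, 2)).transpose(1, 2)
        z, _ = self.lstm(z)
        z, _ = self.gru(z)
        return torch.sigmoid(self.head(z[:, -1]))  # score from the last time step

# Toy usage: 16 traffic windows of 50 time steps with 8 features each.
model = FloodingDetector()
scores = model(torch.randn(16, 50, 8))
print(scores.shape)  # torch.Size([16, 1])
```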