The accurate prediction of drug-target binding affinity (DTA) is an essential step in drug discovery and drug repositioning. Although deep learning methods have been widely adopted for DTA prediction, the complexity of extracting drug and target protein features hampers the accuracy of these predictions. In this study, we propose a novel model for DTA prediction named MSGNN-DTA, which leverages a fused multi-scale topological feature approach based on graph neural networks (GNNs). To address the challenge of accurately extracting drug and target protein features, we introduce a gated skip-connection mechanism during the feature learning process to fuse multi-scale topological features, resulting in information-rich representations of drugs and proteins. Our approach constructs drug atom graphs, motif graphs, and weighted protein graphs to fully extract topological information and provide a comprehensive understanding of underlying molecular interactions from multiple perspectives. Experimental results on two benchmark datasets demonstrate that MSGNN-DTA outperforms state-of-the-art models on all evaluation metrics, showcasing the effectiveness of the proposed approach. Moreover, a case study on FDA-approved drugs from the DrugBank database highlights the potential of the MSGNN-DTA framework to identify drug candidates for specific targets, which could accelerate virtual screening and drug repositioning.
Download full-text PDF | Source
---|---
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10179712 | PMC
http://dx.doi.org/10.3390/ijms24098326 | DOI Listing
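As a rough illustration of the fusion idea described in the abstract above, the sketch below shows how a gated skip-connection might combine node embeddings from successive GNN layers (scales) into a single graph-level representation. The choice of GCNConv layers, sigmoid gates, mean pooling, and all dimensions are assumptions for illustration only, not the published MSGNN-DTA implementation.

```python
# Hypothetical sketch of gated skip-connections fusing multi-scale GNN features.
# Layer types, gate form, and sizes are illustrative assumptions.
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv, global_mean_pool


class GatedMultiScaleGNN(nn.Module):
    def __init__(self, in_dim: int, hidden_dim: int, num_layers: int = 3):
        super().__init__()
        dims = [in_dim] + [hidden_dim] * num_layers
        self.convs = nn.ModuleList(
            [GCNConv(dims[i], dims[i + 1]) for i in range(num_layers)]
        )
        # One gate per scale decides how much of that layer's output to keep.
        self.gates = nn.ModuleList(
            [nn.Linear(dims[i + 1], dims[i + 1]) for i in range(num_layers)]
        )
        self.proj = nn.Linear(in_dim, hidden_dim)  # align raw node features

    def forward(self, x, edge_index, batch):
        fused = self.proj(x)              # scale 0: raw node features
        h = x
        for conv, gate in zip(self.convs, self.gates):
            h = torch.relu(conv(h, edge_index))
            g = torch.sigmoid(gate(h))    # gated skip-connection
            fused = g * h + (1.0 - g) * fused
        # Pool node embeddings into one graph-level vector.
        return global_mean_pool(fused, batch)


# Usage on a single graph (batch vector of zeros):
# model = GatedMultiScaleGNN(in_dim=34, hidden_dim=128)
# g_vec = model(x, edge_index, torch.zeros(x.size(0), dtype=torch.long))
```

In the full model, separate encoders of this kind would presumably be applied to the drug atom graph, the motif graph, and the weighted protein graph, with the resulting vectors combined before an affinity-regression head.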
Mol Inform
January 2025
Faculty of Information Technology, HUTECH University, 700000, Ho Chi Minh City, Vietnam.
In recent years, graph representation learning has become a prominent research topic that has attracted considerable attention. Graph embeddings have diverse applications across fields such as information and social network analysis, bioinformatics and cheminformatics, natural language processing (NLP), and recommendation systems. Among the advanced deep learning (DL) architectures used for graph representation learning, graph neural networks (GNNs) have emerged as the dominant and most effective framework.
Front Neurosci
November 2024
Global R&D Center, China FAW Corporation Limited, Changchun, China.
Brain-computer interfaces (BCIs) establish a direct communication pathway between the brain and external devices and have been widely applied in upper limb rehabilitation for hemiplegic patients. However, significant individual variability in motor imagery electroencephalogram (MI-EEG) signals leads to poor generalization performance of MI-based BCI decoding methods to new patients. This paper proposes a Multi-scale Frequency domain Feature-based Dynamic graph Attention Network (MFF-DANet) for upper limb MI decoding in hemiplegic patients.
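To make the two ingredients named in this abstract concrete, here is a minimal sketch that extracts band-limited (multi-scale frequency-domain) features per EEG channel and passes them through a dynamic graph attention layer over a channel graph. The band edges, the fully connected channel graph, and the use of GATv2Conv are illustrative assumptions, not the published MFF-DANet architecture.

```python
# Hypothetical sketch: band-wise frequency features per EEG channel fed to a
# dynamic graph attention layer over the channel graph. All choices below
# (bands, graph construction, layer type) are assumptions for illustration.
import numpy as np
import torch
from torch_geometric.nn import GATv2Conv

# Assumed frequency bands (Hz); not taken from the paper.
BANDS = {"mu": (8.0, 13.0), "beta": (13.0, 30.0), "low_gamma": (30.0, 45.0)}


def band_power_features(eeg: np.ndarray, fs: float) -> torch.Tensor:
    """eeg: (n_channels, n_samples) -> (n_channels, n_bands) log band power."""
    freqs = np.fft.rfftfreq(eeg.shape[1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg, axis=1)) ** 2
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1)
             for lo, hi in BANDS.values()]
    return torch.log(torch.tensor(np.stack(feats, axis=1), dtype=torch.float32))


n_channels, fs = 22, 250.0                       # toy montage and sampling rate
x = band_power_features(np.random.randn(n_channels, 1000), fs)

# Fully connected channel graph; the learned attention weights are what make it
# "dynamic", adapting per trial rather than using a fixed adjacency.
idx = torch.arange(n_channels)
edge_index = torch.stack(torch.meshgrid(idx, idx, indexing="ij")).reshape(2, -1)

gat = GATv2Conv(in_channels=len(BANDS), out_channels=16, heads=4)
channel_embeddings = gat(x, edge_index)          # shape: (n_channels, 64)
```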
Comput Biol Med
January 2025
School of Biological Science and Medical Engineering, Southeast University, Nanjing 210096, China.
Comput Biol Med
January 2025
Institute of Medical Information, Chinese Academy of Medical Sciences and Peking Union Medical College (CAMS&PUMC), Beijing, 100020, China.
Macromol Rapid Commun
December 2024
Department of Physics, University of Trento, via Sommarive, 14, Trento, I-38123, Italy.
The cowpea chlorotic mottle virus (CCMV) has emerged as a model system to assess the balance between electrostatic and topological features of single-stranded RNA viruses, specifically in the context of the viral self-assembly. Yet, despite its biophysical significance, little structural data on the RNA content of the CCMV virion is available. Here, the conformational dynamics of the RNA2 fragment of CCMV was assessed via coarse-grained molecular dynamics simulations, employing the oxRNA2 force field.
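As a hedged illustration of how such a coarse-grained trajectory might be summarized, the sketch below computes one simple observable of conformational dynamics, the per-frame radius of gyration, from an array of bead positions. The (frames, nucleotides, 3) layout is an assumed pre-parsed form of an oxRNA2 trajectory, all beads are treated as equal-mass, and this is not the authors' analysis pipeline.

```python
# Hypothetical sketch: per-frame radius of gyration of a coarse-grained RNA chain.
# The array layout is an assumed pre-parsed trajectory, not oxRNA2's native format.
import numpy as np


def radius_of_gyration(traj: np.ndarray) -> np.ndarray:
    """traj: (n_frames, n_nucleotides, 3) bead positions -> (n_frames,) Rg."""
    centered = traj - traj.mean(axis=1, keepdims=True)   # remove per-frame centre
    return np.sqrt((centered ** 2).sum(axis=2).mean(axis=1))


# Toy trajectory: 100 frames of a 500-bead chain with random coordinates,
# standing in for parsed simulation output.
rg = radius_of_gyration(np.random.rand(100, 500, 3))
print(rg.mean(), rg.std())   # average compaction and its fluctuation
```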