Publications by authors named "Fratello M"

Objective: Non-invasive neuromodulation techniques, particularly transcranial direct current stimulation (tDCS), are promising for drug-resistant epilepsy (DRE), though the mechanisms underlying their efficacy remain unclear. This study aims to (i) investigate the neurophysiological mechanisms of tDCS using a personalized multichannel protocol with magnetoencephalography (MEG) and (ii) assess post-tDCS changes in brain connectivity, correlating them with clinical outcomes.

Methods: Seventeen patients with focal DRE underwent three cycles of tDCS over five days, each consisting of 40-minute stimulations targeting the epileptogenic zone (EZ) identified via stereo-EEG.

Hazard assessment is the first step in evaluating the potential adverse effects of chemicals. Traditionally, toxicological assessment has focused on the exposure, overlooking the impact of the exposed system on the observed toxicity. However, systems toxicology emphasizes how system properties significantly contribute to the observed response.

The categorization of human diseases is mainly based on the affected organ system and phenotypic characteristics. This limits the view to pathological manifestations and neglects mechanistic relationships that are crucial for developing therapeutic strategies. This work aims to advance the understanding of diseases and their relatedness beyond traditional phenotypic views.

Motivation: De novo drug development is a long and expensive process that poses significant challenges from design to preclinical testing, making the introduction of new drugs into the market slow and difficult. This limitation paved the way for drug repurposing, which consists of reusing already approved drugs, originally developed for other therapeutic indications. Although several efforts have been made over the last decade to achieve clinically relevant drug repurposing predictions, the number of repurposed drugs employed in actual pharmacological therapies is still limited.

Adverse outcome pathways (AOPs) are emerging as a central framework in modern toxicology and other fields in biomedicine. They serve as an extension of pathway-based concepts by depicting biological mechanisms as causally linked sequences of key events (KEs) from a molecular initiating event (MIE) to an adverse outcome. AOPs guide the use and development of new approach methodologies (NAMs) aimed at reducing animal experimentation.
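
As an illustration of the structure an AOP encodes, the sketch below models a toy pathway as a directed graph, with one MIE, two intermediate KEs, and an adverse outcome. The event names are invented for illustration and are not taken from the article.

```python
import networkx as nx

# Toy AOP: causally linked key events from a molecular initiating
# event (MIE) to an adverse outcome (AO). All event names are hypothetical.
aop = nx.DiGraph()
aop.add_edges_from([
    ("MIE: receptor binding", "KE1: oxidative stress"),
    ("KE1: oxidative stress", "KE2: inflammation"),
    ("KE2: inflammation", "AO: tissue fibrosis"),
])

# The causal chain from MIE to AO is a directed path in the graph.
path = nx.shortest_path(aop, "MIE: receptor binding", "AO: tissue fibrosis")
print(" -> ".join(path))
```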

Summary: Biological data repositories are an invaluable source of publicly available research evidence. Unfortunately, the lack of convergence of the scientific community on a common metadata annotation strategy has resulted in large amounts of data with low FAIRness (Findable, Accessible, Interoperable and Reusable). The possibility of generating high-quality insights from their integration relies on data curation, which is typically error-prone as well as expensive in terms of time and human labour.

Motivation: Transcriptomic data can be used to describe the mechanism of action (MOA) of a chemical compound. However, omics data tend to be complex and prone to noise, making the comparison of different datasets challenging. Often, transcriptomic profiles are compared at the level of individual gene expression values, or sets of differentially expressed genes.
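
A common concrete instance of the gene-set-level comparison mentioned above is a set-overlap score, such as the Jaccard index between the sets of differentially expressed genes (DEGs) of two exposures. This is a generic sketch with hypothetical gene sets, not the method proposed in the article.

```python
def jaccard(deg_a: set, deg_b: set) -> float:
    """Jaccard similarity between two sets of differentially expressed genes."""
    if not deg_a and not deg_b:
        return 1.0
    return len(deg_a & deg_b) / len(deg_a | deg_b)

# Hypothetical DEG sets for two compounds.
compound_1 = {"TNF", "IL6", "CXCL8", "HMOX1"}
compound_2 = {"IL6", "CXCL8", "NFE2L2"}
print(jaccard(compound_1, compound_2))  # 2 shared / 5 total = 0.4
```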

The varied transcriptomic response to nanoparticles has hampered the understanding of their mechanism of action. Here, by performing a meta-analysis of a large collection of transcriptomics data from various engineered nanoparticle exposure studies, we identify common patterns of gene regulation that impact the transcriptomic response. The analysis identifies deregulation of immune functions as a prominent response across different exposure studies.

Mechanistic toxicology provides a powerful approach to inform on the safety of chemicals and the development of safe-by-design compounds. Although toxicogenomics supports mechanistic evaluation of chemical exposures, its implementation into the regulatory framework is hindered by uncertainties in the analysis and interpretation of such data. The use of mechanistic evidence through the adverse outcome pathway (AOP) concept is promoted for the development of new approach methodologies (NAMs) that can reduce animal experimentation.

There is an urgent need to apply effective, data-driven approaches to reliably predict engineered nanomaterial (ENM) toxicity. Here we introduce a predictive computational framework based on the molecular and phenotypic effects of a large panel of ENMs across multiple in vitro and in vivo models. Our methodology allows for the grouping of ENMs based on multi-omics approaches combined with robust toxicity tests.
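
The abstract does not spell out the grouping step, but grouping materials from multi-omics features generally reduces to clustering a samples-by-features matrix. The sketch below shows one generic way to do that with hierarchical clustering on made-up data; it is not the framework's actual implementation.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)
# Hypothetical matrix: 10 nanomaterials x 50 standardized omics features.
X = rng.normal(size=(10, 50))

# Ward hierarchical clustering, cut into three candidate groups.
Z = linkage(X, method="ward")
groups = fcluster(Z, t=3, criterion="maxclust")
print(groups)
```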

Despite remarkable efforts of computational and predictive pharmacology to improve therapeutic strategies for complex diseases, in only a few cases have the predictions eventually been employed in the clinic. One of the reasons behind this shortfall is that current predictive approaches are based only on the integration of the molecular perturbation of a certain disease with drug sensitivity signatures, neglecting intrinsic properties of the drugs. Here we integrate mechanistic and chemocentric approaches to drug repositioning by developing an innovative network pharmacology strategy.

The recent advancements in toxicogenomics have led to the availability of large omics data sets, representing the starting point for studying the exposure mechanism of action and identifying candidate biomarkers for toxicity prediction. The current lack of standard methods in data generation and analysis hampers the full exploitation of toxicogenomics-based evidence in regulatory risk assessment. Moreover, the pipelines for the preprocessing and downstream analyses of toxicogenomic data sets can be quite challenging to implement.

The pharmacological arsenal against the COVID-19 pandemic is largely based on generic anti-inflammatory strategies or poorly scalable solutions. Moreover, as the ongoing vaccination campaign is rolling out more slowly than hoped, affordable and effective therapeutics are needed. To this end, there is increasing attention toward computational methods for drug repositioning and de novo drug design.

The amount of data made available by microarrays gives researchers the opportunity to delve into the complexity of biological systems. However, the noisy and extremely high-dimensional nature of this kind of data poses significant challenges. Microarrays allow for the parallel measurement of thousands of molecular objects spanning different layers of interactions.

Biomarkers are valuable indicators of the state of a biological system. Microarray technology has been extensively used to identify biomarkers and build computational predictive models for disease prognosis, drug sensitivity and toxicity evaluations. Activation biomarkers can be used to understand the underlying signaling cascades, mechanisms of action and biological cross talk.
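
As a generic illustration (not the article's pipeline) of how expression-based biomarkers feed a predictive model, a sparse linear classifier both predicts the endpoint and exposes candidate marker genes through its non-zero coefficients:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# Hypothetical data: 40 samples x 200 genes, binary toxicity label.
X = rng.normal(size=(40, 200))
y = rng.integers(0, 2, size=40)

# The L1 penalty drives most coefficients to zero; the surviving
# features are candidate biomarkers.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
candidates = np.flatnonzero(model.coef_[0])
print(f"{candidates.size} candidate marker genes")
```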

Article Synopsis
  • Artificial intelligence and machine learning, particularly deep reinforcement learning, are revolutionizing this process by utilizing advanced neural networks to generate innovative drug candidates.
  • The article reviews the evolution of de novo drug design techniques and suggests future areas for research and improvement in the field.
Background: Omics technologies have been widely applied in toxicology studies to investigate the effects of different substances on exposed biological systems. A classical toxicogenomic study consists in testing the effects of a compound at different dose levels and different time points. The main challenge consists in identifying the gene alteration patterns that are correlated with doses and time points.
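
One deliberately simplified way to flag such patterns (a stand-in for the more elaborate methods such studies use) is a rank correlation between each gene's expression and the dose at a fixed time point:

```python
import numpy as np
from scipy.stats import spearmanr

doses = np.array([0.0, 0.1, 1.0, 10.0, 100.0])
rng = np.random.default_rng(2)
# Hypothetical expression matrix: 1000 genes x 5 dose levels.
expr = rng.normal(size=(1000, doses.size))

# Spearman correlation of each gene's expression with dose; strongly
# monotonic genes are candidates for dose-dependent alteration.
rho = np.array([spearmanr(doses, gene)[0] for gene in expr])
dose_responsive = np.flatnonzero(np.abs(rho) > 0.9)
print(f"{dose_responsive.size} genes with |rho| > 0.9")
```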

Preprocessing of transcriptomics data plays a pivotal role in the development of toxicogenomics-driven tools for chemical toxicity assessment. The generation and exploitation of large volumes of molecular profiles, following an appropriate experimental design, allows the employment of toxicogenomics (TGx) approaches for a thorough characterisation of the mechanism of action (MOA) of different compounds. To date, a plethora of data preprocessing methodologies have been suggested.

The starting point of successful hazard assessment is the generation of unbiased and trustworthy data. Conventional toxicity testing deals with extensive observations of phenotypic endpoints in vivo and complementing in vitro models. The increasing development of novel materials and chemical compounds dictates the need for a better understanding of the molecular changes occurring in exposed biological systems.

Transcriptomics data are relevant to address a number of challenges in Toxicogenomics (TGx). After careful planning of exposure conditions and data preprocessing, the TGx data can be used in predictive toxicology, where more advanced modelling techniques are applied. The large volume of molecular profiles produced by omics-based technologies allows the development and application of artificial intelligence (AI) methods in TGx.

Motivation: The analysis of dose-dependent effects on the gene expression is gaining attention in the field of toxicogenomics. Currently available computational methods are usually limited to specific omics platforms or biological annotations and are able to analyse only one experiment at a time.

Results: We developed the software BMDx with a graphical user interface for the Benchmark Dose (BMD) analysis of transcriptomics data.
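
BMDx itself offers a GUI and multiple dose-response model families; purely to illustrate the underlying idea, the sketch below fits a Hill curve to hypothetical data and solves for the dose at which the fitted response exceeds the control level by a chosen benchmark response (BMR).

```python
import numpy as np
from scipy.optimize import brentq, curve_fit

def hill(d, base, vmax, k, n):
    """Hill dose-response model."""
    return base + vmax * d**n / (k**n + d**n)

doses = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
response = np.array([1.0, 1.1, 1.4, 1.9, 2.4, 2.6])  # hypothetical values

params, _ = curve_fit(hill, doses, response, p0=[1.0, 2.0, 2.0, 1.0], maxfev=10000)
control = hill(0.0, *params)
bmr = 0.5  # benchmark response: absolute change over the control level

# The BMD is the dose at which the fitted curve first exceeds control + BMR.
bmd = brentq(lambda d: hill(d, *params) - control - bmr, 1e-9, doses.max())
print(f"BMD ~ {bmd:.2f}")
```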

Magnetic resonance imaging allows acquiring functional and structural connectivity data from which high-density whole-brain networks can be derived to carry out connectome-wide analyses in normal and clinical populations. Graph theory has been widely applied to investigate the modular structure of brain connections by using centrality measures to identify the "hubs" of human connectomes, and community detection methods to delineate subnetworks associated with diverse cognitive and sensorimotor functions. These analyses typically rely on a preprocessing step (pruning) to reduce computational complexity and remove the weakest edges, which are most likely affected by experimental noise.
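
A generic sketch of that pipeline (thresholding the weakest edges, then computing centrality and communities) on a random weighted network; the matrix, threshold, and parcel count are invented for illustration.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(3)
# Hypothetical weighted connectivity matrix for 90 regions.
W = rng.random((90, 90))
W = (W + W.T) / 2
np.fill_diagonal(W, 0)

# Pruning: drop edges below the 75th percentile of positive weights.
W[W < np.percentile(W[W > 0], 75)] = 0

G = nx.from_numpy_array(W)
hubs = sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1])[:5]
communities = greedy_modularity_communities(G, weight="weight")
print("top-5 hub candidates:", [node for node, _ in hubs])
print("number of communities:", len(communities))
```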

The main challenge in analysing functional magnetic resonance imaging (fMRI) data from extended samples of subjects (N > 100) is to extract as much relevant information as possible from large amounts of noisy data. When studying neurodegenerative diseases with resting-state fMRI, one of the objectives is to determine regions with abnormal background activity with respect to a healthy brain, and this is often attained with comparative statistical models applied to single voxels or brain parcels within one or several functional networks. In this work, we propose a novel approach based on clustering and stochastic rank aggregation to identify parcels that exhibit a coherent behaviour in groups of subjects affected by the same disorder, and apply it to default-mode network independent component maps from resting-state fMRI data sets.
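
The paper relies on stochastic rank aggregation; as a deliberately simplified deterministic stand-in, the sketch below ranks parcels within each subject by a hypothetical activity score and aggregates with a Borda-style mean rank to find parcels that behave coherently across the group.

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical scores: 30 subjects x 100 parcels (e.g., IC map weights).
scores = rng.normal(size=(30, 100))

# Rank parcels within each subject (0 = highest score), then average
# the ranks across subjects; a low mean rank marks coherently high parcels.
ranks = np.argsort(np.argsort(-scores, axis=1), axis=1)
consensus = np.argsort(ranks.mean(axis=0))[:10]
print("top consensus parcels:", consensus)
```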

Background: Tasting is a complex process involving chemosensory perception and cognitive evaluation. Different experimental designs and solution delivery approaches may in part explain the variability reported in the literature. These technical aspects certainly limit the development of taste-related brain-computer interface devices.

Purpose: Advances in computational network analysis have enabled the characterization of topological properties of human brain networks (connectomics) from high angular resolution diffusion imaging (HARDI) MRI structural measurements. In this study, the effect of changing the diffusion weighting (b value) and sampling (number of gradient directions) was investigated in ten healthy volunteers, with specific focus on graph theoretical network metrics used to characterize the human connectome.

Methods: Probabilistic tractography based on the Q-ball reconstruction of HARDI MRI measurements was performed, and structural connections between all pairs of regions from the automated anatomical labeling (AAL) atlas were estimated to compare two HARDI schemes: a low b value (b = 1000) with a low number of directions (n = 32) (LBLD), and a high b value (b = 3000) with a high number of directions (n = 54) (HBHD).
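
Whatever the acquisition scheme, once the region-by-region connection matrix is estimated the graph metrics are computed the same way. A minimal sketch with invented streamline counts over a 90-region AAL-style parcellation (not the study's actual data or code):

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(5)
# Hypothetical streamline-count matrix between 90 AAL regions.
C = rng.integers(0, 500, size=(90, 90)).astype(float)
C = np.triu(C, 1) + np.triu(C, 1).T  # symmetric, zero diagonal

G = nx.from_numpy_array(C)
# Typical connectome metrics compared between the LBLD and HBHD schemes.
print("global efficiency:", nx.global_efficiency(G))
print("mean weighted clustering:", nx.average_clustering(G, weight="weight"))
print("density:", nx.density(G))
```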
