Several groups of neurons in the nucleus tractus solitarius (NTS) suppress food intake, including Prlh-expressing neurons (NTS^Prlh cells). Not only does artificial activation of NTS^Prlh cells decrease feeding, but expression of Prlh (which encodes the neuropeptide PrRP) and neurotransmission by NTS^Prlh neurons also contribute to the restraint of food intake and body weight, especially in animals fed a high-fat diet (HFD). We used animals lacking the PrRP receptors GPR10 and/or GPR74 (encoded by Prlhr and Npffr2, respectively) to determine the role of each receptor in the restraint of food intake and body weight by increased expression of Prlh in NTS neurons (NTS^Prlh mice) and in response to the anorectic PrRP analog p52.
Bariatric surgery is effective for the treatment and remission of obesity and type 2 diabetes, but pharmacological approaches that produce similar metabolic adaptations are needed to avoid post-surgical complications. Here we show how G49, an oxyntomodulin (OXM) analog and dual glucagon/glucagon-like peptide-1 receptor (GCGR/GLP-1R) agonist, triggers an inter-organ crosstalk among adipose tissue, pancreas, and liver that is initiated by a rapid, GCGR-dependent release of free fatty acids (FFAs) from white adipose tissue (WAT). This interactome leads to elevations in adiponectin and fibroblast growth factor 21 (FGF21), causing WAT beiging, brown adipose tissue (BAT) activation, increased energy expenditure (EE), and weight loss.
Several peptide dual agonists of the human glucagon receptor (GCGR) and the glucagon-like peptide-1 receptor (GLP-1R) are in development for the treatment of type 2 diabetes, obesity, and their associated complications. Candidates must have high potency at both receptors, but it is unclear whether the limited experimental data available can be used to train models that accurately predict the activity of new peptide variants at both receptors. Here we use peptide sequence data labelled with in vitro potency at the human GCGR and GLP-1R to train several models, including a deep multi-task neural network trained with multiple-loss optimization.
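The multi-task setup described in the last abstract, where one model predicts potency at both GCGR and GLP-1R, can be sketched as a shared encoder with one regression head per receptor, trained by minimizing the sum of the two per-task losses. The sketch below is purely illustrative, not the authors' actual model: the sequence featurization (amino-acid composition), network sizes, learning rate, and all synthetic data are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def featurize(seq):
    # Amino-acid composition vector (20-dim): a deliberately simple
    # stand-in for a real peptide-sequence encoding.
    v = np.zeros(len(AMINO_ACIDS))
    for aa in seq:
        v[AMINO_ACIDS.index(aa)] += 1.0
    return v / max(len(seq), 1)

class MultiTaskNet:
    """Shared tanh hidden layer feeding two linear heads (GCGR, GLP-1R)."""
    def __init__(self, d_in, d_hidden=16, lr=0.1):
        self.W1 = rng.normal(0, 0.5, (d_in, d_hidden))  # shared encoder
        self.w_gcgr = rng.normal(0, 0.5, d_hidden)      # GCGR potency head
        self.w_glp1r = rng.normal(0, 0.5, d_hidden)     # GLP-1R potency head
        self.lr = lr

    def forward(self, X):
        H = np.tanh(X @ self.W1)
        return H, H @ self.w_gcgr, H @ self.w_glp1r

    def step(self, X, y_gcgr, y_glp1r):
        # Multiple-loss optimization: sum of per-receptor MSE losses.
        H, p1, p2 = self.forward(X)
        e1, e2 = p1 - y_gcgr, p2 - y_glp1r
        loss = np.mean(e1 ** 2) + np.mean(e2 ** 2)
        n = len(X)
        # Gradients through each head, then through the shared layer.
        g_gcgr = H.T @ e1 * (2 / n)
        g_glp1r = H.T @ e2 * (2 / n)
        dH = (np.outer(e1, self.w_gcgr) + np.outer(e2, self.w_glp1r)) * (2 / n)
        gW1 = X.T @ (dH * (1 - H ** 2))
        self.w_gcgr -= self.lr * g_gcgr
        self.w_glp1r -= self.lr * g_glp1r
        self.W1 -= self.lr * gW1
        return loss

# Toy training set: random 15-mer peptides with synthetic potency labels
# (anticorrelated between the two receptors, to make the tasks distinct).
seqs = ["".join(rng.choice(list(AMINO_ACIDS), 15)) for _ in range(64)]
X = np.stack([featurize(s) for s in seqs])
true_w = rng.normal(0, 1, X.shape[1])
y_gcgr = X @ true_w + 0.01 * rng.normal(size=len(X))
y_glp1r = -X @ true_w + 0.01 * rng.normal(size=len(X))

net = MultiTaskNet(X.shape[1])
losses = [net.step(X, y_gcgr, y_glp1r) for _ in range(500)]
print(f"summed MSE loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The shared encoder is what lets scarce data for one receptor inform predictions at the other; in practice the abstract's setting would use a deep network and a learned sequence representation rather than composition features.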