Biorelevant dissolution testing of oral solid dosage forms paves the way for valid in vitro-in vivo predictions (IVIVP). A recently developed apparatus, the PhysioCell, mimics the fluid flow and pressure waves occurring in the fasted human stomach. In this work, we used the PhysioCell to perform IVIVP for vortioxetine immediate-release (IR) tablets: the originator (Brintellix) and generic product candidates (VORTIO).
Comput Methods Programs Biomed, December 2019
Background and Objective: People suffer from sleep disorders caused by work-related stress, an irregular lifestyle, or mental health problems. The development of effective tools to diagnose sleep disorders is therefore important. Recently, Information Theory has been exploited to analyze biomedical signals.
To understand how anatomy and physiology allow an organism to perform its function, it is important to know how information transmitted by spikes in the brain is received and encoded. A natural question is whether the spike rate alone encodes the information about a stimulus (rate code), or whether additional information is contained in the temporal pattern of the spikes (temporal code). Here we address this question using data from the cat Lateral Geniculate Nucleus (LGN), the visual portion of the thalamus through which visual information from the retina is communicated to the visual cortex.
The nature of neural codes is central to neuroscience. Do neurons encode information through relatively slow changes in their firing rates (rate code) or through the precise timing of every spike (temporal code)? Here we compare the loss of information due to correlations for these two possible neural codes. The essence of Shannon's definition of information is to tie information to uncertainty: the higher the uncertainty of a given event, the more information is conveyed by that event.
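As a worked illustration of that definition (not code from the article), the Python sketch below computes the self-information of single events and the Shannon entropy of a discrete distribution; all probability values are made up for the example.

```python
import numpy as np

def self_information(p):
    """Information conveyed by an event of probability p, in bits."""
    return -np.log2(p)

def entropy(probs):
    """Shannon entropy: the average self-information of the distribution."""
    probs = np.asarray(probs, dtype=float)
    probs = probs[probs > 0]              # impossible events contribute nothing
    return float(-(probs * np.log2(probs)).sum())

# A rare (more uncertain) event conveys more information than a common one ...
print(self_information(0.5))    # 1.0 bit
print(self_information(0.01))   # ~6.64 bits
# ... and a more uncertain distribution has higher entropy.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits
print(entropy([0.97, 0.01, 0.01, 0.01]))   # ~0.24 bits
```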
Background: Explaining why processing in the brain is so fast remains an open problem (van Hemmen JL, Sejnowski T., 2004). Thus, the analysis of neural transmission (Shannon CE, Weaver W.
Organisms often evolve as compromises, and many of these compromises can be expressed in terms of energy efficiency. Thus, many authors analyze the energetic costs of the processes involved in information transmission in the brain. In this paper we study the information transmission rate per unit of energy used in a class of ring, brain-inspired neural networks, which we assume to involve components such as excitatory and inhibitory neurons and long-range connections.
Even in the absence of external stimuli, there is ongoing activity in the cerebral cortex as a result of recurrent connectivity. This paper attempts to characterize one aspect of this ongoing activity by examining how the information content carried by specific neurons varies as a function of brain state. We recorded from rats chronically implanted with tetrodes in the primary visual cortex during awake and sleep periods.
There has been growing interest in estimating the information carried by a single neuron, by multiple single units, or by populations of neurons in response to specific stimuli. In this paper, inspired by the article of Levy and Baxter (2002), we analyze the efficiency of neuronal communication by treating dendrosomatic summation as a Shannon-type channel (1948) and by treating the uncertain synaptic transmission as part of the dendrosomatic computation. Specifically, we study the Mutual Information between input and output signals for different types of neuronal network architectures by applying efficient entropy estimators.
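The abstract does not specify the estimators; the sketch below illustrates the simplest option, a plug-in (histogram) estimator of H(X), H(Y), and H(X,Y) combined into I(X;Y) = H(X) + H(Y) - H(X,Y), applied to a hypothetical binary input and a noisy output. The estimators used in the paper are presumably more refined.

```python
import numpy as np

def plug_in_entropy(symbols):
    """Plug-in (maximum-likelihood) entropy estimate of a symbol sequence, in bits."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def mutual_information(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for two discrete sequences of equal length."""
    joint = [f"{a},{b}" for a, b in zip(x, y)]
    return plug_in_entropy(x) + plug_in_entropy(y) - plug_in_entropy(joint)

# Hypothetical binary input and an output that flips 10% of the input bits.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=100_000)
y = np.where(rng.random(x.size) < 0.1, 1 - x, x)
print(mutual_information(x, y))   # close to 1 - H(0.1) ≈ 0.53 bits per symbol
```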
We propose a definition of a finite-space Lyapunov exponent. For discrete-time dynamical systems, it measures the local (between neighboring points) average spreading of the system. We justify our definition by showing that, for large classes of chaotic maps, the corresponding finite-space Lyapunov exponent approaches the Lyapunov exponent of the chaotic map as M → ∞, where M is the cardinality of the discrete phase space.
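The abstract does not reproduce the definition, so the sketch below illustrates only one plausible reading of "local average spreading between neighboring points": along an orbit of the logistic map, it averages the logarithm of how far two points a distance 1/M apart are separated by one application of the map. The choice of map, grid spacing, and averaging scheme are assumptions for illustration, not the paper's construction.

```python
import numpy as np

def finite_space_lyapunov(f, m, n_steps=100_000, x0=0.1234):
    """Average, along an orbit of f, the log of the spreading of two
    neighboring points separated by 1/m (one possible finite-space
    analogue of the Lyapunov exponent)."""
    delta = 1.0 / m
    x, total = x0, 0.0
    for _ in range(n_steps):
        spread = abs(f(x + delta) - f(x)) / delta
        total += np.log(max(spread, 1e-12))   # guard against zero spreading
        x = f(x)
    return total / n_steps

logistic = lambda x: 4.0 * x * (1.0 - x)
for m in (10, 100, 10_000, 1_000_000):
    print(m, finite_space_lyapunov(logistic, m))
# As m grows, the estimate approaches ln 2 ≈ 0.693, the Lyapunov exponent
# of the logistic map at r = 4, in line with the M → ∞ limit above.
```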
Normalized Lempel-Ziv complexity, which measures the generation rate of new patterns along a digital sequence, is closely related to such important source properties as entropy and compression ratio, but, in contrast to these, it is a property of individual sequences. In this article, we propose to exploit this concept to estimate (or, at least, to bound from below) the entropy of neural discharges (spike trains). The main advantages of this method include fast convergence of the estimator (as supported by numerical simulation) and the fact that there is no need to know the probability law of the process generating the signal.
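A minimal sketch of the idea, assuming the classical Lempel-Ziv (1976) parsing and the normalization c(n)·log2(n)/n; the binarized spike trains here are simulated Bernoulli sequences, not data from the article, and the article's exact estimator may differ in its details.

```python
import numpy as np

def lz76_complexity(s):
    """Count the phrases in the Lempel-Ziv (1976) parsing of s: each phrase is
    the shortest continuation not contained in the text preceding its last symbol."""
    n, i, phrases = len(s), 0, 0
    while i < n:
        j = i + 1
        while j <= n and s[i:j] in s[:j - 1]:
            j += 1
        phrases += 1
        i = j
    return phrases

def normalized_lz(s):
    """c(n) * log2(n) / n -- an estimate of the entropy rate in bits per symbol."""
    n = len(s)
    return lz76_complexity(s) * np.log2(n) / n

# Two hypothetical binarized spike trains (1 = spike in a time bin).
rng = np.random.default_rng(0)
n = 20_000
dense  = "".join(map(str, rng.integers(0, 2, n)))              # p(spike) = 0.5
sparse = "".join(map(str, (rng.random(n) < 0.1).astype(int)))  # p(spike) = 0.1
print(normalized_lz(dense))   # near 1 bit/symbol, the entropy of a fair coin
print(normalized_lz(sparse))  # markedly lower, approaching H(0.1) ≈ 0.47 as n grows
```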
We apply random field theory to the study of DNA chains, which we assume to be trajectories of a stochastic process. We construct a statistical potential between nucleotides corresponding to the probabilities of those trajectories that can be obtained from the DNA database containing millions of sequences. It turns out that this potential has an interpretation in terms of quantities naturally arrived at during the study of the evolution of species i.