Publications by authors named "Abdullah Makkeh"

Describing statistical dependencies is foundational to empirical scientific research. The theory of partial information decomposition (PID) offers a principled and versatile framework for uncovering intricate, possibly nonlinear dependencies between a single target variable and several source variables within a system. However, the majority of existing PID measures are restricted to categorical variables, while many systems of interest in science are continuous.
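For orientation, the bivariate decomposition introduced by Williams and Beer splits the mutual information that two sources carry about a target into unique, redundant, and synergistic parts. A minimal sketch of the bookkeeping, using generic symbols (target T, sources S1, S2, components U1, U2, R, C; not notation taken from the article):

\[
I(T; S_1, S_2) = U_1 + U_2 + R + C, \qquad
I(T; S_1) = U_1 + R, \qquad
I(T; S_2) = U_2 + R .
\]

Classical information theory fixes only these three sums for the four components, which is why an additional principle is needed to pin the parts down; the challenge addressed here is doing so for continuous rather than categorical variables.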

Article Synopsis
  • The integration of information from various sources to multiple targets is fundamental to neural systems, with partial information decomposition (PID) quantifying how individual sources contribute to the mutual information.
  • While PID has mostly been studied for Gaussian and discrete distributions, this work introduces a method to estimate unique information in general continuous distributions involving one and two variables.
  • By employing copula decompositions and variational autoencoder techniques, the method achieves strong results on Gaussian distributions and demonstrates effectiveness in neural models, revealing intricate relationships between redundancy, synergy, and unique information in networked neurons (see the toy sketch below).
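As a concrete toy illustration of these three kinds of contribution (not an example from the article, and with names invented for this sketch), consider the XOR relationship, the textbook case in which all information about the target is synergistic:

```python
# Toy illustration: for T = S1 XOR S2 with independent uniform binary sources,
# neither source alone tells us anything about T, yet together they determine
# it completely -- purely synergistic information.
from itertools import product
from math import log2

# Joint pmf p(s1, s2, t) for the XOR gate with uniform inputs.
pmf = {(s1, s2, s1 ^ s2): 0.25 for s1, s2 in product((0, 1), repeat=2)}

def mutual_information(pmf, source_idx, target_idx=2):
    """Plug-in mutual information I(source; target) in bits from a joint pmf."""
    def marginal(idx):
        m = {}
        for outcome, p in pmf.items():
            key = tuple(outcome[i] for i in idx)
            m[key] = m.get(key, 0.0) + p
        return m

    p_s = marginal(source_idx)
    p_t = marginal((target_idx,))
    p_st = marginal(tuple(source_idx) + (target_idx,))
    return sum(p * log2(p / (p_s[k[:-1]] * p_t[k[-1:]]))
               for k, p in p_st.items() if p > 0)

print("I(T; S1)     =", mutual_information(pmf, (0,)))    # 0.0 bits
print("I(T; S2)     =", mutual_information(pmf, (1,)))    # 0.0 bits
print("I(T; S1, S2) =", mutual_information(pmf, (0, 1)))  # 1.0 bit -> all synergy
```

Neither source alone reduces uncertainty about T, yet together they determine it exactly; PID measures assign this one bit entirely to the synergy term.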

Intuitively, the level of autonomy of an agent is related to the degree to which the agent's goals and behaviour are decoupled from immediate control by the environment. Here, we capitalise on a recent information-theoretic formulation of autonomy and introduce an algorithm for calculating it in the limit as the number of time steps approaches infinity. We then ask how an agent's level of autonomy changes during training.
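The article's measure and algorithm are not reproduced here. As a rough, hypothetical illustration of the kind of quantity involved, the sketch below (all names invented for this example) uses a plug-in estimate of a history-conditioned mutual information I(S_{t+1}; S_hist | E_hist) between an agent's next state and its own recent states, conditioned on the environment's recent states, and evaluates it for growing history length k as a crude proxy for the long-horizon limit:

```python
# Illustrative sketch only (hypothetical names; not the article's algorithm):
# estimate I(S_{t+1}; S_hist | E_hist) from sampled agent/environment
# trajectories, for growing history length k.
from collections import Counter
from math import log2

def conditional_mi(triples):
    """Plug-in estimate of I(X; Y | Z) in bits from (x, y, z) samples."""
    n = len(triples)
    p_xyz = Counter(triples)
    p_yz = Counter((y, z) for _, y, z in triples)
    p_xz = Counter((x, z) for x, _, z in triples)
    p_z = Counter(z for _, _, z in triples)
    return sum((c / n) * log2(c * p_z[z] / (p_xz[(x, z)] * p_yz[(y, z)]))
               for (x, y, z), c in p_xyz.items())

def autonomy_estimate(agent_states, env_states, k):
    """I(S_{t+1}; S_{t-k+1..t} | E_{t-k+1..t}) from one trajectory (plug-in)."""
    triples = []
    for t in range(k, len(agent_states) - 1):
        s_hist = tuple(agent_states[t - k + 1:t + 1])
        e_hist = tuple(env_states[t - k + 1:t + 1])
        triples.append((agent_states[t + 1], s_hist, e_hist))
    return conditional_mi(triples)

# Toy usage: an agent that mostly copies its own previous state.
import random
random.seed(0)
env = [random.randint(0, 1) for _ in range(5000)]
agent = [0]
for t in range(1, 5000):
    agent.append(agent[t - 1] if random.random() < 0.9 else env[t])
for k in (1, 2, 3):
    print(k, round(autonomy_estimate(agent, env, k), 3))
```

Whether such plug-in estimates stabilise as k grows, and how to take the limit rigorously, is exactly the kind of question the algorithm in the article addresses.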


Partial information decomposition of the multivariate mutual information describes the distinct ways in which a set of source variables contains information about a target variable. The groundbreaking work of Williams and Beer has shown that this decomposition cannot be determined from classic information theory without making additional assumptions, and several candidate measures have been proposed, often drawing on principles from related fields such as decision theory. None of these measures is differentiable with respect to the underlying probability mass function.
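To make concrete what differentiability with respect to the probability mass function buys, here is a generic illustration (not the measure proposed in the article, and with function names invented for this sketch): mutual information written as a smooth function of a parameterised joint pmf, with its gradient obtained numerically. A PID measure that is likewise differentiable in the pmf could be dropped into the same kind of gradient-based pipeline, for example as a training objective.

```python
# Illustrative sketch only: mutual information as a smooth function of a
# parameterised joint pmf, differentiated numerically. This is NOT the
# measure introduced in the article.
import numpy as np

def joint_pmf(theta):
    """Joint pmf over a 2x2 (source, target) alphabet via a softmax of logits."""
    z = np.exp(theta - theta.max())
    return (z / z.sum()).reshape(2, 2)

def mutual_information(theta):
    """I(S; T) in bits for the pmf parameterised by theta."""
    p = joint_pmf(theta)
    ps, pt = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    return float(np.sum(p * np.log2(p / (ps * pt))))

def numerical_gradient(f, theta, eps=1e-6):
    """Central-difference gradient -- well defined because f is smooth in theta."""
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        grad[i] = (f(theta + e) - f(theta - e)) / (2 * eps)
    return grad

theta = np.array([0.3, -0.1, 0.2, 0.5])
print("I(S;T)   =", mutual_information(theta))
print("dI/dtheta =", numerical_gradient(mutual_information, theta))
```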


Makkeh, Theis, and Vicente found that a cone programming model is the most robust way to compute the Bertschinger et al. partial information decomposition (BROJA PID) measure. We developed production-quality, robust software that computes the BROJA PID measure based on this cone programming model.
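For context, the optimisation behind the BROJA measure can be sketched as follows (generic symbols, target T and sources X, Y; this summarises the Bertschinger et al. definition, not the software's interface). The unique information of X about T is obtained by minimising a conditional mutual information over all joint distributions that preserve the two target–source marginals:

\[
\widetilde{UI}(T; X \setminus Y) \;=\; \min_{Q \in \Delta_P} I_Q(T; X \mid Y),
\qquad
\Delta_P \;=\; \{\, Q : Q_{T,X} = P_{T,X},\ Q_{T,Y} = P_{T,Y} \,\}.
\]

Because \(\Delta_P\) is a polytope and, on it, the minimisation reduces to maximising the concave conditional entropy \(H_Q(T \mid X, Y)\), the problem is a convex program that can be posed with exponential cones, which is what makes robust off-the-shelf cone solvers applicable.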
