Publications by authors named "Daniel Kifer"

Various machine learning (ML) and deep learning (DL) techniques have recently been applied to forecasting laboratory earthquakes from friction experiments. The magnitude and timing of shear failures in stick-slip cycles are predicted using features extracted from recorded ultrasonic or acoustic emission (AE) signals. In addition, the Rate and State Friction (RSF) constitutive laws are extensively used to model the frictional behavior of faults.
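
The generic pipeline this line of work describes can be sketched in a few lines. Everything below is a hedged stand-in: the synthetic signal, the windowed variance/amplitude features, and the least-squares regressor are illustrative choices, not the data, features, or models used in these papers.

```python
import numpy as np

rng = np.random.default_rng(0)

def window_features(signal, win=100):
    """Split a 1-D AE signal into windows and return simple statistics."""
    n = len(signal) // win
    wins = signal[: n * win].reshape(n, win)
    return np.column_stack([wins.var(axis=1), np.abs(wins).mean(axis=1)])

# Synthetic stick-slip proxy: noise whose amplitude grows toward failure.
t = np.linspace(0.0, 1.0, 10_000)
signal = rng.normal(scale=1.0 + 4.0 * t)          # amplitude ramps up
X = window_features(signal)                        # (100, 2) feature matrix
y = 1.0 - np.linspace(0.0, 1.0, len(X))           # time remaining to failure

# Ordinary least squares as a stand-in for the ML/DL regressors.
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
print("R^2:", 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2))
```

The point of the sketch is only the structure: windowed signal statistics on one side, a regression target (time to failure) on the other.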

The use of formal privacy to protect the confidentiality of responses in the 2020 Decennial Census of Population and Housing has triggered renewed interest and debate over how to measure the disclosure risks and societal benefits of the published data products. We argue that any proposal for quantifying disclosure risk should be based on prespecified, objective criteria. We illustrate this approach to evaluate the absolute disclosure risk framework, the counterfactual framework underlying differential privacy, and prior-to-posterior comparisons.
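
As a concrete, purely illustrative instance of the counterfactual framework mentioned above: under differential privacy, the Laplace mechanism bounds how much the distribution of a released count can differ between two hypothetical datasets that differ in one person. The numbers and parameter choices below are assumptions for the sketch, not anything from the paper.

```python
from math import exp

eps = 1.0
sensitivity = 1.0            # one person changes a counting query by at most 1
b = sensitivity / eps        # scale of the Laplace noise

def lap_pdf(v, mu):
    """Density at v of the Laplace mechanism's output centered at mu."""
    return exp(-abs(v - mu) / b) / (2 * b)

count_with = 100             # count on the dataset including the individual
count_without = 99           # count on the counterfactual dataset without them

v = 100.4                    # an arbitrary released value
ratio = lap_pdf(v, count_with) / lap_pdf(v, count_without)
print("density ratio:", ratio, "bound:", exp(eps))
```

For every possible output, the density ratio stays below e^eps, which is exactly the counterfactual guarantee being contrasted with absolute-risk and prior-to-posterior measures.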

Predicting failure in solids has broad applications, including earthquake prediction, which remains an unattainable goal. However, recent machine learning work shows that laboratory earthquakes can be predicted using micro-failure events and the temporal evolution of fault zone elastic properties. Remarkably, these results come from purely data-driven models trained on large datasets.

Neural generative models can be used to learn complex probability distributions from data, to sample from them, and to produce probability density estimates. We propose a computational framework for developing neural generative models inspired by the theory of predictive processing in the brain. According to predictive processing theory, the neurons in the brain form a hierarchy in which neurons in one level form expectations about sensory inputs from another level.
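
The hierarchy-of-expectations idea can be made concrete with a toy two-level sketch. This is a minimal illustration under our own simplifying assumptions (linear generation, gradient-based state updates), not the paper's architecture: the upper level holds a latent state whose prediction of the lower level is repeatedly corrected by the prediction error.

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=(8, 3))        # generative weights: latent -> observed
z_true = rng.normal(size=3)
x = W @ z_true                      # "sensory input" from the level below

z = np.zeros(3)                     # the upper level's initial expectation
for _ in range(2000):
    err = x - W @ z                 # prediction error at the lower level
    z += 0.02 * W.T @ err           # expectation update driven by the error

print("remaining prediction error:", np.linalg.norm(x - W @ z))
```

The update rule is just gradient descent on the squared prediction error, which is the simplest way to realize "expectations corrected by errors from the level below."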

Accurate prediction of CO2 plume migration and pressure is imperative for the safe operation and economic management of carbon storage projects. Numerical reservoir simulations of CO2 flow can be used for this purpose, allowing operators and stakeholders to calculate the site response under different operational scenarios and uncertainties in geological characterization. However, the computational cost of these high-fidelity simulations has motivated the recent development of data-driven models.
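
The surrogate-modeling idea behind such data-driven models can be sketched as follows. Everything here is a hypothetical stand-in: the "simulator" is a made-up closed-form function, and the polynomial fit substitutes for whatever learned model a real project would use.

```python
import numpy as np

def expensive_simulator(injection_rate):
    # Stand-in for a high-fidelity reservoir simulation (hypothetical physics).
    return 10.0 * np.sqrt(injection_rate) + 2.0

rates = np.linspace(0.5, 4.0, 8)               # a few training scenarios
pressures = np.array([expensive_simulator(r) for r in rates])

# Cheap quadratic surrogate fit to the simulator outputs.
coeffs = np.polyfit(rates, pressures, deg=2)
surrogate = np.poly1d(coeffs)

# Query the surrogate for a new operational scenario instead of re-simulating.
r_new = 2.2
print("surrogate error:", abs(surrogate(r_new) - expensive_simulator(r_new)))
```

Once fit, the surrogate can be evaluated thousands of times (e.g., for uncertainty sweeps) at negligible cost compared with rerunning the simulator.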

Research on communities and crime has predominantly focused on social conditions within an area or in its immediate proximity. However, a growing body of research shows that people often travel to areas away from home, contributing to connections between places. A few studies highlight the criminological implications of such connections, focusing on important but rare ties like co-offending or gang conflicts.

Temporal models based on recurrent neural networks have proven to be quite powerful in a wide variety of applications, including language modeling and speech processing. However, training these models often relies on backpropagation through time (BPTT), which entails unfolding the network over many time steps, making credit assignment considerably more challenging. Furthermore, the nature of backpropagation itself does not permit the use of nondifferentiable activation functions and is inherently sequential, making parallelization of the underlying training process difficult.
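
To make the "unfolding" concrete, here is a one-parameter sketch of BPTT (illustrative only, and not the paper's proposed alternative): a scalar linear RNN is unrolled over T steps, the loss gradient is accumulated backward through every stored state, and the result is checked against finite differences.

```python
import numpy as np

T = 5
x = np.array([1.0, 0.5, -0.3, 0.8, 0.2])
w = 0.9

# Forward pass: store every hidden state (the "unfolded" network).
h = np.zeros(T + 1)
for t in range(T):
    h[t + 1] = w * h[t] + x[t]

# Backward pass through time: credit assignment across all T steps.
dL_dh = h[T]                        # dL/dh_T for the loss L = 0.5 * h_T**2
dL_dw = 0.0
for t in reversed(range(T)):
    dL_dw += dL_dh * h[t]           # local contribution of step t to dL/dw
    dL_dh *= w                      # propagate the error one step further back

# Check against a finite-difference estimate of the same gradient.
def loss(wv):
    hv = 0.0
    for t in range(T):
        hv = wv * hv + x[t]
    return 0.5 * hv ** 2

eps = 1e-6
fd = (loss(w + eps) - loss(w - eps)) / (2 * eps)
print(dL_dw, fd)
```

Note how the backward loop must touch every stored state in order, one reason BPTT is both memory-hungry and hard to parallelize.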

Crime is one of the most important social problems in the country, affecting public safety, child development, and adult socioeconomic status. Understanding which factors drive higher crime rates is critical for policy makers in their efforts to reduce crime and improve citizens' quality of life. We tackle a fundamental problem in our paper: crime rate inference at the neighborhood level.

Many previous proposals for adversarial training of deep neural nets have included directly modifying the gradient, training on a mix of original and adversarial examples, using contractive penalties, and approximately optimizing constrained adversarial objective functions. In this article, we show that these proposals are actually all instances of optimizing a general, regularized objective we call DataGrad. Our proposed DataGrad framework, which can be viewed as a deep extension of the layerwise contractive autoencoder penalty, cleanly simplifies prior work and easily allows extensions such as adversarial training with multitask cues.
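
The general shape of such a regularized objective can be illustrated with a small stand-in. This is our own toy formulation, not the article's DataGrad objective: logistic regression whose loss is augmented with a penalty on the gradient of the loss with respect to the inputs, discouraging sensitivity to small input perturbations.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))
y = (X @ np.array([1.0, -2.0, 0.5, 0.0, 1.5]) > 0).astype(float)

w = np.zeros(5)
lam = 0.1                                   # weight of the data-gradient penalty

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(300):
    p = sigmoid(X @ w)
    r = p - y
    # Usual cross-entropy gradient w.r.t. the weights.
    grad_ce = X.T @ r / len(y)
    # Data-gradient penalty: dL/dx_i = (p_i - y_i) * w, so the penalty is
    # (lam/2) * mean_i (p_i - y_i)^2 * ||w||^2; below is its gradient in w.
    s = p * (1 - p)
    grad_pen = lam * (np.mean(r ** 2) * w + (w @ w) * X.T @ (r * s) / len(y))
    w -= 0.5 * (grad_ce + grad_pen)

acc = np.mean((sigmoid(X @ w) > 0.5) == (y > 0.5))
print("train accuracy:", acc)
```

Penalizing the input-gradient norm is the common thread the article identifies across gradient modification, adversarial-example training, and contractive penalties; the linear model here just makes that penalty easy to write in closed form.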
