Publications by authors named "Benjamin Nachman"

Detector simulation and reconstruction are a significant computational bottleneck in particle physics. We develop particle-flow neural-assisted simulations (Parnassus) to address this challenge. Our deep learning model takes as input a point cloud (particles impinging on a detector) and produces a point cloud (reconstructed particles).
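
As a rough illustration of the point-cloud-to-point-cloud idea, the sketch below maps a variable-length set of input particles to a fixed set of reconstructed candidates with a permutation-invariant PyTorch model. This is a minimal stand-in, not the actual Parnassus architecture; all layer sizes, slot counts, and names are illustrative assumptions.

```python
# Minimal sketch of a point-cloud-to-point-cloud model (illustrative only).
# Each particle is a feature vector (e.g., pT, eta, phi); the network maps a
# variable-length set of truth particles to a set of reco-particle candidates.
import torch
import torch.nn as nn

class SetToSet(nn.Module):
    def __init__(self, n_features=3, hidden=64, n_out=32):
        super().__init__()
        self.encode = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU(),
                                    nn.Linear(hidden, hidden))
        self.decode = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                    nn.Linear(hidden, n_features))
        self.queries = nn.Parameter(torch.randn(n_out, hidden))  # output slots

    def forward(self, particles):              # (batch, n_in, n_features)
        h = self.encode(particles)             # per-particle embedding
        pooled = h.mean(dim=1, keepdim=True)   # permutation-invariant summary
        slots = self.queries.unsqueeze(0) + pooled  # condition output slots
        return self.decode(slots)              # (batch, n_out, n_features)

model = SetToSet()
truth = torch.randn(8, 50, 3)   # 8 events with 50 truth particles each
reco = model(truth)             # 8 events with 32 reconstructed candidates
```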

Machine learning-based anomaly detection (AD) methods are promising tools for extending the coverage of searches for physics beyond the Standard Model (BSM). One class of AD methods that has received significant attention is resonant anomaly detection, where the BSM physics is assumed to be localized in at least one known variable. While there have been many methods proposed to identify such a BSM signal that make use of simulated or detected data in different ways, there has not yet been a study of the methods' complementarity.
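
One way to see why localizing the signal in a known variable helps: the signal region and the sidebands are mixtures of the same signal and background densities in the auxiliary features, so a classifier trained to separate the two regions approaches the optimal signal-versus-background test statistic. A standard derivation, assuming only that the signal fractions satisfy f_1 > f_2:

```latex
% Mixed-sample (weak supervision) identity underlying resonant anomaly detection.
\[
  p_{\mathrm{SR}}(x) = f_1\, p_S(x) + (1-f_1)\, p_B(x), \qquad
  p_{\mathrm{SB}}(x) = f_2\, p_S(x) + (1-f_2)\, p_B(x),
\]
\[
  \frac{p_{\mathrm{SR}}(x)}{p_{\mathrm{SB}}(x)}
  = \frac{f_1\,\ell(x) + (1-f_1)}{f_2\,\ell(x) + (1-f_2)},
  \qquad \ell(x) \equiv \frac{p_S(x)}{p_B(x)},
\]
% This ratio is monotonically increasing in the likelihood ratio \(\ell(x)\)
% whenever \(f_1 > f_2\), so the region classifier is an optimal test statistic.
```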

We explore the use of Quantum Machine Learning (QML) for anomaly detection at the Large Hadron Collider (LHC). In particular, we explore a semi-supervised approach in the four-lepton final state, where simulations are reliable enough for a direct background prediction. This is a representative task where classification must be performed with small training datasets, a regime in which a quantum advantage has been suggested.
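
For concreteness, a minimal variational quantum classifier of the kind explored in such studies might look like the following PennyLane sketch. The embedding, ansatz, and qubit count are illustrative assumptions, not the circuit from this work.

```python
# Sketch of a variational quantum classifier for small training sets
# (illustrative assumptions throughout). Requires: pennylane.
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def circuit(weights, features):
    qml.AngleEmbedding(features, wires=range(n_qubits))       # encode the event
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))                          # score in [-1, 1]

shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)
weights = np.random.random(size=shape, requires_grad=True)

x = np.array([0.1, -0.4, 0.7, 0.2])   # toy four-lepton kinematic features
print(circuit(weights, x))            # untrained score for one event
```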

Calibration is a common experimental physics problem, whose goal is to infer the value and uncertainty of an unobservable quantity Z given a measured quantity X. Additionally, one would like to quantify the extent to which X and Z are correlated. In this Letter, we present a machine learning framework for performing frequentist maximum likelihood inference with Gaussian uncertainty estimation, which also quantifies the mutual information between the unobservable and measured quantities.
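
A generic version of this recipe trains a network to output a mean and a variance for Z given X and minimizes the Gaussian negative log-likelihood; the sketch below shows that baseline. The framework in the Letter differs in detail, for instance in how the mutual information between X and Z is extracted, which is omitted here.

```python
# Sketch of ML calibration with Gaussian uncertainty: predict (mu, log var) of
# Z given X and minimize the Gaussian negative log-likelihood. Generic recipe,
# not the Letter's exact framework.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 2))

def gaussian_nll(x, z):
    mu, log_var = net(x).chunk(2, dim=-1)
    return 0.5 * (log_var + (z - mu) ** 2 / log_var.exp()).mean()

# Toy calibration data: Z is the true quantity, X its smeared measurement.
z = torch.rand(10_000, 1) * 10
x = z + 0.5 * torch.randn_like(z)

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(2_000):
    opt.zero_grad()
    loss = gaussian_nll(x, z)
    loss.backward()
    opt.step()

# After training, net(x) yields a maximum-likelihood estimate of Z with a
# per-event Gaussian uncertainty sigma = exp(log_var / 2).
```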

A significant problem for current quantum computers is noise. While there are many distinct noise channels, the depolarizing noise model often appropriately describes average noise for large circuits involving many qubits and gates. We present a method to mitigate the depolarizing noise by first estimating its rate with a noise-estimation circuit and then correcting the output of the target circuit using the estimated rate.
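
The arithmetic behind the estimate-then-correct step is simple: under a global depolarizing channel, the measured expectation value of a traceless observable is damped by a factor (1 - p). A minimal sketch, assuming the reference circuit's ideal output is known exactly:

```python
# Estimate-then-rescale for depolarizing noise. For a traceless observable,
# <O>_noisy = (1 - p) * <O>_ideal; a noise-estimation circuit with known ideal
# output gives p, and the target result is rescaled. Illustrative arithmetic
# only; see the paper for the full protocol.

def estimate_depolarizing_rate(measured_ref: float, ideal_ref: float) -> float:
    """Infer p from a reference circuit whose ideal expectation is known."""
    return 1.0 - measured_ref / ideal_ref

def mitigate(measured_target: float, p: float) -> float:
    """Invert the depolarizing damping on the target circuit's output."""
    return measured_target / (1.0 - p)

# Example: the reference circuit should give +1.0 but the device returns 0.8,
# so p = 0.2; a target measurement of 0.56 is corrected to 0.70.
p = estimate_depolarizing_rate(measured_ref=0.8, ideal_ref=1.0)
print(mitigate(0.56, p))  # 0.7
```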

Simulating the full dynamics of a quantum field theory over a wide range of energies requires exceptionally large quantum computing resources. Yet for many observables in particle physics, perturbative techniques are sufficient to accurately model all but a constrained range of energies within the validity of the theory. We demonstrate that effective field theories (EFTs) provide an efficient mechanism to separate the high-energy dynamics, which is easily calculated with traditional perturbation theory, from the low-energy dynamics, and we show how quantum algorithms can be used to simulate the dynamics of the low-energy EFT from first principles.

A new paradigm for data-driven, model-agnostic new-physics searches at colliders is emerging that aims to leverage recent breakthroughs in anomaly detection and machine learning. In order to develop and benchmark new anomaly detection methods within this framework, it is essential to have standard datasets. To this end, we have created the LHC Olympics 2020, a community challenge accompanied by a set of simulated collider events.

Memory enables access to past experiences to guide future behavior. Humans can determine which memories to trust (high confidence) and which to doubt (low confidence). How memory retrieval, memory confidence, and memory-guided decisions are related, however, is not understood.

Simulating quantum field theories is a flagship application of quantum computing. However, calculating experimentally relevant high-energy scattering amplitudes entirely on a quantum computer is prohibitively difficult. It is well known that such high-energy scattering processes can be factored into pieces that can be computed using well-established perturbative techniques, and pieces that currently have to be simulated using classical Markov chain algorithms.

Collider data must be corrected for detector effects ("unfolded") to be compared with many theoretical calculations and measurements from other experiments. Unfolding is traditionally done for individual, binned observables without including all information relevant for characterizing the detector response. We introduce OmniFold, an unfolding method that iteratively reweights a simulated dataset, using machine learning to capitalize on all available information.
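
A single OmniFold-style iteration can be sketched as two classifier-based reweightings, shown below for one observable with scikit-learn. This is a simplified illustration; the published method iterates to convergence and uses the full event.

```python
# One simplified OmniFold-style iteration: (1) reweight the simulation to match
# data at detector level; (2) pull those weights back so they become a function
# of the particle-level (generated) event only.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def ratio_weights(source, target, source_w, target_w):
    """Classifier-based estimate of p_target/p_source, evaluated on source."""
    X = np.concatenate([source, target]).reshape(-1, 1)
    y = np.concatenate([np.zeros(len(source)), np.ones(len(target))])
    w = np.concatenate([source_w, target_w])
    clf = GradientBoostingClassifier().fit(X, y, sample_weight=w)
    p = clf.predict_proba(source.reshape(-1, 1))[:, 1]
    return p / (1.0 - p)

rng = np.random.default_rng(0)
gen = rng.normal(0.0, 1.0, 20_000)           # particle-level simulation
sim = gen + rng.normal(0.0, 0.5, 20_000)     # same events after detector smearing
data = rng.normal(0.2, 1.0, 20_000) + rng.normal(0.0, 0.5, 20_000)

ones = np.ones_like(gen)
w_push = ratio_weights(sim, data, ones, ones)      # step 1: detector level
w_unfold = ratio_weights(gen, gen, ones, w_push)   # step 2: pull back to particle level
# w_unfold reweights the particle-level simulation toward the unfolded data.
```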

In the context of the standard model of particle physics, the relationship between the top-quark mass and width (Γ_t) has been precisely calculated. However, the uncertainty from current direct measurements of the width is nearly 50%. A new approach for directly measuring the top-quark width using events away from the resonance peak is presented.
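
The underlying intuition can be illustrated with a toy lineshape: for a Breit-Wigner resonance, the fraction of events far from the peak grows with the width, so off-peak yields constrain Γ_t. The window, grid, and widths below are illustrative only and are not from the analysis.

```python
# Toy illustration: the off-peak fraction of a Breit-Wigner lineshape is
# sensitive to the width Gamma_t (illustrative numbers, not the paper's).
import numpy as np

def breit_wigner(m, m0, gamma):
    return 1.0 / ((m - m0) ** 2 + gamma ** 2 / 4.0)

m = np.linspace(100.0, 250.0, 100_001)   # GeV, uniform grid
m_top = 172.5                            # GeV

for gamma_t in (1.0, 1.3, 2.0):          # GeV, illustrative widths
    rho = breit_wigner(m, m_top, gamma_t)
    on_peak = np.abs(m - m_top) <= 30.0  # "resonance peak" window
    off_frac = 1.0 - rho[on_peak].sum() / rho.sum()  # uniform grid: sums ~ integrals
    print(f"Gamma_t = {gamma_t:.1f} GeV -> off-peak fraction = {off_frac:.2%}")
```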

Despite extensive theoretical motivation for physics beyond the standard model (BSM) of particle physics, searches at the Large Hadron Collider have found no significant evidence for BSM physics. Therefore, it is essential to broaden the sensitivity of the search program to include unexpected scenarios. We present a new model-agnostic anomaly detection technique that naturally benefits from modern machine learning algorithms.
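
As a concrete, though generic, example of such a technique: train a classifier to separate signal-region events from sideband events using auxiliary features; if the background is region-independent, any learnable difference can indicate an anomaly. The sketch below uses toy data and scikit-learn and is not the exact method of the paper.

```python
# Weakly supervised anomaly search sketch: classify signal region vs sideband
# in auxiliary features (toy data; generic illustration).
import numpy as np
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 20_000
sb = rng.normal(0.0, 1.0, size=(n, 2))        # sideband events (background only)
sr = rng.normal(0.0, 1.0, size=(n, 2))        # signal-region background
sig = rng.normal(1.5, 0.3, size=(300, 2))     # small injected signal
sr = np.vstack([sr, sig])

X = np.vstack([sb, sr])
y = np.concatenate([np.zeros(len(sb)), np.ones(len(sr))])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = HistGradientBoostingClassifier().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"SR-vs-SB AUC = {auc:.3f}  (0.5 = no detectable anomaly)")
```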

Physicists at the Large Hadron Collider (LHC) rely on detailed simulations of particle collisions to build expectations of what experimental data may look like under different theoretical modeling assumptions. Petabytes of simulated data are needed to develop analysis techniques, though they are expensive to generate using existing algorithms and computing resources. The modeling of detectors and the precise description of particle cascades as they interact with the material in the calorimeter are the most computationally demanding steps in the simulation pipeline.
