Proc Natl Acad Sci U S A
September 2024
The paper is concerned with inference for a parameter of interest in models that share a common interpretation for that parameter but that may differ appreciably in other respects. We study the general structure of models under which the maximum likelihood estimator of the parameter of interest is consistent under arbitrary misspecification of the nuisance part of the model. A specialization of the general results to matched-comparison and two-groups problems gives a more explicit and easily checkable condition in terms of a notion of symmetric parameterization, leading to a broadening and unification of existing results in those problems.
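A minimal numerical sketch of the two-groups case, under assumed illustrative choices: the parameter of interest is a mean difference, the working model is Gaussian, and the true errors are heavy-tailed, so the nuisance (error-distribution) part of the model is misspecified; the Gaussian maximum likelihood estimator of the difference nevertheless remains consistent. All numbers below are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# two-groups problem: the parameter of interest is the mean difference delta
delta = 2.0
n = 200_000

# true errors are heavy-tailed (Student t, 3 df), so a Gaussian working
# model is misspecified in its nuisance (error-distribution) part
x = rng.standard_t(3, size=n)
y = delta + rng.standard_t(3, size=n)

# under the Gaussian working model, the MLE of delta is the difference of
# sample means, which stays consistent despite the misspecification
delta_hat = y.mean() - x.mean()
print(delta_hat)
```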
Objective: In this study, we investigate the seizure outcomes of temporo-parieto-occipital (TPO) and frontal disconnections or resections in children with drug-resistant epilepsy (DRE) in order to determine factors that may predict surgical results.
Methods: Children with DRE, who underwent either TPO or frontal disconnection or resection at Great Ormond Street Hospital for Children between 2000 and 2017, were identified from a prospectively collated operative database. Demographic data, age at surgery, type of surgery, scalp EEGs and operative histopathology were collected.
If an artificial intelligence aims to maximize risk-adjusted return, then under mild conditions it is disproportionately likely to pick an unethical strategy unless the objective function allows sufficiently for this risk. Even if the proportion η of available unethical strategies is small, the probability p_U of picking an unethical strategy can become large; indeed, unless returns are fat-tailed, p_U tends to unity as the strategy space becomes large. We define an unethical odds ratio Υ (capital upsilon) that allows us to calculate p_U from η, and we derive a simple formula for the limit of Υ as the strategy space becomes large.
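A small simulation can illustrate the tendency described: with thin-tailed (Gaussian) returns, the probability of picking an unethical strategy grows with the size of the strategy space. The return model, the mean advantage assigned to unethical strategies, and all parameter values are illustrative assumptions rather than the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(1)

def prob_unethical_pick(n_strategies, eta=0.01, edge=1.5, trials=2000):
    """Monte Carlo estimate of the probability that the return-maximizing
    strategy is unethical. Assumes Gaussian (thin-tailed) returns and a
    mean 'edge' for unethical strategies; eta is the unethical proportion.
    All parameter values here are illustrative, not taken from the paper."""
    n_bad = max(1, int(eta * n_strategies))
    hits = 0
    for _ in range(trials):
        good = rng.normal(0.0, 1.0, n_strategies - n_bad)
        bad = rng.normal(edge, 1.0, n_bad)
        if bad.max() > good.max():
            hits += 1
    return hits / trials

# the estimated probability rises as the strategy space grows
for n in (100, 1_000, 10_000):
    print(n, prob_unethical_pick(n))
```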
High-dimensional data are often most plausibly generated from distributions with complex structure and leptokurtosis in some or all components. Covariance and precision matrices provide a useful summary of such structure, yet the performance of popular matrix estimators typically hinges upon a sub-Gaussianity assumption. This paper presents robust matrix estimators whose performance is guaranteed for a much richer class of distributions.
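One common route to such robustness in this literature is elementwise truncation of the centred cross-products before averaging, so that a few extreme observations cannot dominate any entry of the matrix. The sketch below is a simplified illustration under heuristic tuning choices, not the estimator proposed in the paper.

```python
import numpy as np

def robust_cov(X, tau=None):
    """Elementwise truncated (Huber-type) covariance estimate.
    Each centred cross-product is capped at +/- tau before averaging.
    The robust centring by medians and the default tau are heuristic
    choices for illustration only."""
    n, p = X.shape
    Xc = X - np.median(X, axis=0)              # robust centring
    if tau is None:
        tau = np.sqrt(n / np.log(max(p, 2)))   # heuristic truncation level
    S = np.empty((p, p))
    for j in range(p):
        for k in range(p):
            S[j, k] = np.clip(Xc[:, j] * Xc[:, k], -tau, tau).mean()
    return S

# leptokurtic data: Student t with 3 degrees of freedom
rng = np.random.default_rng(0)
X = rng.standard_t(3, size=(500, 4))
S = robust_cov(X)
```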
This paper studies hypothesis testing and parameter estimation in the context of the divide-and-conquer algorithm. In a unified likelihood based framework, we propose new test statistics and point estimators obtained by aggregating various statistics from k subsamples of size n/k, where n is the sample size. In both low dimensional and sparse high dimensional settings, we address the important question of how large k can be, as n grows large, such that the loss of efficiency due to the divide-and-conquer algorithm is negligible.
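The aggregation step can be sketched for ordinary least squares: split the n observations into k subsamples, estimate on each, and average the k estimates. The data-generating model and parameter values below are illustrative; the paper's test statistics and sparse high-dimensional refinements are not reproduced here.

```python
import numpy as np

def dac_ols(X, y, k):
    """Divide-and-conquer least squares: split the n samples into k
    subsamples of size roughly n/k, fit OLS on each, and average the
    k estimates. A minimal sketch of the aggregation idea only."""
    betas = [np.linalg.lstsq(Xs, ys, rcond=None)[0]
             for Xs, ys in zip(np.array_split(X, k), np.array_split(y, k))]
    return np.mean(betas, axis=0)

rng = np.random.default_rng(0)
n, p = 10_000, 3
beta = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(n, p))
y = X @ beta + rng.normal(size=n)

full = np.linalg.lstsq(X, y, rcond=None)[0]   # full-sample OLS fit
split = dac_ols(X, y, k=20)                   # aggregated estimate
# with k small relative to n, the two estimates are close, so the
# efficiency loss from splitting is negligible
```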
The global financial crisis of 2007-2009 exposed critical weaknesses in the financial system. Many proposals for financial reform address the need for systemic regulation, that is, regulation focused on the soundness of the whole financial system and not just that of individual institutions. In this paper, we study one particular problem faced by a systemic regulator: the tension between the distribution of assets that individual banks would like to hold and the distribution across banks that best supports system stability if greater weight is given to avoiding multiple bank failures.