This talk will describe computational problems motivated by some approaches to statistical inference in which the statistician tries to represent uncertainty about model parameters using probability distributions. In a classical Bayesian approach, a prior and a likelihood are specified, resulting in a posterior distribution on the parameters. I will consider three variants from the literature. The first is an approach originally developed by Arthur Dempster in the 1960s, which leads to a type of probabilistic inference on the parameters without specifying a prior distribution. The approach and new computational methods will be presented for the classical problem of inference for count data. The second is related to "modular inference", where the parameters are not inferred jointly given all of the data, but in a sequential fashion, possibly to account for misspecification in parts of the model. The third consists of a combination of "bagging", where the data are repeatedly resampled with replacement, with Bayesian inference, possibly to account for sampling variability. For each of these approaches I will highlight the usefulness of recently proposed "unbiased" Monte Carlo methods based on couplings of Markov chains. This is joint work with various people including R. Gong, P. Edlefsen, A. Dempster, J. O'Leary and Y. Atchadé.
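To give a flavor of the coupling idea mentioned above, here is a minimal toy sketch (not the speaker's actual method) of an unbiased estimator built from two coupled Markov chains, in the spirit of the Glynn–Rhee / Jacob–O'Leary–Atchadé construction. The two-state transition matrix, the common-random-numbers coupling, and the test function `h` are all illustrative assumptions; the real applications involve MCMC on continuous parameter spaces. Two chains, offset by one step, are driven by a shared uniform until they meet; a telescoping correction then removes the burn-in bias, so averaging independent replicates gives an unbiased estimate of the stationary expectation.

```python
import random

# Toy two-state Markov chain; its stationary distribution solves pi P = pi,
# giving pi = (4/7, 3/7) for this choice of P.
P = [[0.7, 0.3],
     [0.4, 0.6]]

def step(x, u):
    # Inverse-CDF move using the uniform u; feeding the SAME u to both
    # chains is a common-random-numbers coupling, so once the chains
    # meet they stay together forever.
    return 0 if u < P[x][0] else 1

def h(x):
    return float(x)  # estimating pi(1) = 3/7

def unbiased_estimate(rng):
    # X chain runs one step ahead of the Y chain (lag 1).
    x = rng.randint(0, 1)       # X_0 from the initial distribution
    y = rng.randint(0, 1)       # Y_0, independently
    est = h(x)
    x = step(x, rng.random())   # X_1
    while x != y:               # until the meeting time tau
        est += h(x) - h(y)      # bias-correction term h(X_t) - h(Y_{t-1})
        u = rng.random()        # shared uniform couples the two moves
        x, y = step(x, u), step(y, u)
    return est

rng = random.Random(1)
n = 100_000
avg = sum(unbiased_estimate(rng) for _ in range(n)) / n
print(avg)  # close to 3/7 ~ 0.4286
```

Each replicate is unbiased, so the estimator parallelizes perfectly: the Monte Carlo error shrinks with the number of independent replicates, without any burn-in diagnostics.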
I have been a tenure-track faculty member in statistics at Harvard University since 2015. Previously I was a postdoc at the University of Oxford and at the National University of Singapore. My Ph.D. was from Université Paris-Dauphine, on computational methods for Bayesian statistics. My research has lately focused on Monte Carlo methods with couplings, and on model misspecification.