Abstract: From data collection to model building to computation, statistical inference at every stage must reconcile with imperfections. I discuss a serendipitous result in which two apparently imperfect components combine to produce "perfect" inference. Differentially private data protect individuals' confidential information by being released through carefully designed noise mechanisms, trading off statistical efficiency for privacy. Approximate Bayesian computation (ABC) allows for sampling from approximate posteriors of complex models with intractable likelihoods, trading off exactness for computational efficiency. Finding the right alignment between the two tradeoffs liberates one from the other, and salvages the exactness of inference in the process. A parallel result for maximum likelihood inference on private data using Monte Carlo Expectation-Maximization is also discussed. The paper on which this talk is based can be found at this page.
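The alignment the abstract describes can be illustrated with a minimal sketch. Assume (for illustration only; the model, noise scale, and sample size below are hypothetical choices, not taken from the talk) a normal location model with a Laplace noise mechanism on the sample mean: when rejection ABC uses the privacy mechanism's own noise density as its acceptance kernel, the accepted draws follow the exact posterior given the privatized summary, not an approximation to it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: x_i ~ N(theta, 1), prior theta ~ N(0, 1),
# released summary = sample mean + Laplace(0, b) noise.
n, b = 100, 0.2              # sample size and noise scale (assumed values)
theta_true = 1.0
x = rng.normal(theta_true, 1.0, n)
s_priv = x.mean() + rng.laplace(0.0, b)   # the privatized summary released

# Rejection ABC with the privacy mechanism itself as the acceptance kernel:
# accept theta with probability exp(-|s_priv - s_sim| / b), i.e. the Laplace
# noise density rescaled to peak at 1. Because this kernel is the exact noise
# density, accepted draws target the exact posterior p(theta | s_priv).
draws = []
for _ in range(20000):
    theta = rng.normal(0.0, 1.0)              # propose from the prior
    s_sim = rng.normal(theta, 1.0, n).mean()  # simulate data, summarize
    if rng.random() < np.exp(-abs(s_priv - s_sim) / b):
        draws.append(theta)

draws = np.array(draws)
print(f"accepted {len(draws)} draws, posterior mean ~ {draws.mean():.3f}")
```

The key design point this sketch illustrates: the noise added for privacy is not discarded as a nuisance but reused as the ABC kernel, so the "approximate" sampler becomes exact for the privatized data.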
Ruobin Gong is Assistant Professor of Statistics at Rutgers University. Ruobin received her Ph.D. in Statistics from Harvard University, advised by Xiao-Li Meng and Arthur P. Dempster. Prior to that, she graduated from the University of Toronto with an Honours B.Sc. in cognitive psychology. Ruobin's research interests lie at the foundations of uncertainty reasoning: Bayesian and generalized Bayesian methodology and computation, random sets, imprecise probability, and the Dempster-Shafer theory of belief functions, as well as applications to robustness and privacy in statistics.