Contextual Point Matching for Video Stabilization

Published in SIURO, Volume 6, 2013

This work explores the use of contextual shape descriptors for video stabilization. We apply the method to a particularly pathological video, consisting of a sketch of a seashell that contains a high degree of self-similarity at various scales.

Recommended citation: Meng, Ling Han, Joseph Geumlek, Holly Chu, Justin Hoogenstyrd, and Ernie Esser (2013). "Contextual point matching for video stabilization." SIAM Undergraduate Research Online. Volume 6.

On the Theory and Practice of Privacy-preserving Bayesian Data Analysis

Published in Conference on Uncertainty in Artificial Intelligence, 2016

This paper explores how posterior sampling, a popular means of Bayesian data analysis, can be performed privately. In particular, it explores the theoretical benefits achieved by adding noise to the sufficient statistics of exponential family models, compared to other approaches that do not leverage these distributional traits. The theory centers on the asymptotic relative efficiency of these samplers, a useful characterization of a mechanism's behavior as data sizes grow.
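As a rough illustration of the sufficient-statistics approach (a minimal sketch, not the paper's calibrated mechanism), consider a Beta-Bernoulli model: the count of ones is a sufficient statistic with sensitivity 1, so Laplace noise on that single number privatizes the whole posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

def private_beta_posterior(data, epsilon, alpha=1.0, beta=1.0):
    """Perturb the sufficient statistic (count of ones) with Laplace noise,
    then form a Beta posterior from the noised count.

    The L1 sensitivity of the count to one record is 1, so Laplace(1/epsilon)
    noise gives epsilon-differential privacy for the released statistic.
    """
    n = len(data)
    count = float(np.sum(data))                    # sufficient statistic
    noisy_count = count + rng.laplace(scale=1.0 / epsilon)
    noisy_count = min(max(noisy_count, 0.0), float(n))  # clamp to valid range
    return alpha + noisy_count, beta + (n - noisy_count)

data = rng.integers(0, 2, size=1000)
a, b = private_beta_posterior(data, epsilon=0.5)
sample = rng.beta(a, b)   # draw a posterior sample as usual
```

Because only the one-dimensional statistic is perturbed, any downstream posterior sampling inherits the privacy guarantee by post-processing.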

Recommended citation: Foulds, James, Joseph Geumlek, Max Welling, and Kamalika Chaudhuri (2016). "On the theory and practice of privacy-preserving Bayesian data analysis." Conference on Uncertainty in Artificial Intelligence.

Rényi Differential Privacy Mechanisms for Posterior Sampling

Published in Advances in Neural Information Processing Systems, 2017

Building on the previous work, here we further examine privatized methods for releasing samples from exponential family distributions, now using Rényi Differential Privacy as the privacy framework. Among other things, this framework permits us to use the stabilizing effect of a prior distribution to increase the privacy guarantees. This work explores how privacy can be achieved either by weakening the Bayesian update applied to the observations or by strengthening the prior distribution.
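The two knobs described above can be sketched on a Beta-Bernoulli model (the parameterization here is illustrative, not the paper's calibrated mechanism): tempering the likelihood with a factor r in (0, 1] weakens the update, while pseudo-counts m strengthen the prior.

```python
import numpy as np

rng = np.random.default_rng(1)

def tempered_beta_posterior(data, r=1.0, m=0.0, alpha=1.0, beta=1.0):
    """Two knobs for privatizing a Beta-Bernoulli posterior sample:
      r in (0, 1]: scale down (temper) the likelihood contribution,
      m >= 0:      add pseudo-counts that strengthen the prior.
    Both shrink the posterior toward the prior, which limits how much
    any single observation can shift the released sample.
    """
    n = len(data)
    k = float(np.sum(data))
    a = alpha + m + r * k
    b = beta + m + r * (n - k)
    return rng.beta(a, b)

data = rng.integers(0, 2, size=200)
s_weak = tempered_beta_posterior(data, r=0.1)    # weakened Bayesian update
s_prior = tempered_beta_posterior(data, m=50.0)  # strengthened prior
```

Either adjustment concentrates the sampling distribution, trading statistical efficiency for a tighter Rényi divergence between neighboring datasets.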

Recommended citation: Geumlek, Joseph, Shuang Song, and Kamalika Chaudhuri (2017). "Rényi differential privacy mechanisms for posterior sampling." Advances in Neural Information Processing Systems.

Profile-based Privacy for Locally Private Computations

Published in IEEE International Symposium on Information Theory, 2019

We present an alternative privacy framework, focusing on obscuring the identity of specific generating distributions rather than specific observed values. This change brings both benefits and drawbacks, which this work explores. We present some basic mechanisms that achieve these guarantees while also exploiting the utility gains offered by the change of framework. More complicated mechanisms are left as future work.

Recommended citation: Geumlek, Joseph, and Kamalika Chaudhuri (2019). "Profile-based privacy for locally private computations." IEEE International Symposium on Information Theory.

Privacy Amplification by Mixing and Diffusion Mechanisms

Published in Advances in Neural Information Processing Systems, 2019

This work studies iterative data analysis using the mathematical theory of diffusion and the mixing of Markov chains. In doing so, the privacy guarantees of the uncertainty added in each iteration can be compounded. It provides a cleaner theoretical framework for the analysis of iterated Noisy Stochastic Gradient Descent, recovering the privacy guarantees already known for that setting, while also showing the flexibility of this new analysis.
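The iterated mechanism whose guarantees this analysis recovers can be sketched as noisy SGD: each step clips a per-example gradient and adds Gaussian noise, and viewing each noisy update as one step of a Markov chain is what lets mixing and diffusion arguments compound the per-step privacy. The loss and constants below are illustrative assumptions, not the paper's exact setting.

```python
import numpy as np

rng = np.random.default_rng(2)

def noisy_sgd(data, steps=100, lr=0.1, clip=1.0, sigma=1.0):
    """Noisy projected SGD on a 1-D least-squares loss.
    Each iteration: sample an example, clip its gradient, add Gaussian
    noise, take a step, and project back onto a bounded domain.
    """
    theta = 0.0
    n = len(data)
    for _ in range(steps):
        x = data[rng.integers(n)]              # sample one example
        grad = theta - x                       # gradient of 0.5 * (theta - x)^2
        grad = max(-clip, min(clip, grad))     # clip the gradient
        theta -= lr * (grad + sigma * rng.normal())
        theta = max(-5.0, min(5.0, theta))     # project onto [-5, 5]
    return theta

data = rng.normal(1.0, 0.5, size=500)
theta_hat = noisy_sgd(data)
```

The projection step keeps the chain on a compact set, which is the kind of structural assumption that makes mixing-based privacy amplification arguments go through.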

Recommended citation: Balle, Borja, Gilles Barthe, Marco Gaboardi, and Joseph Geumlek (2019). "Privacy amplification by mixing and diffusion mechanisms." Advances in Neural Information Processing Systems.

Sampling from Distributions under Differential Privacy Notions

Published in UC San Diego (eScholarship), 2020

This is my doctoral thesis, synthesizing the work completed during my PhD program at UCSD. It explores privacy notions for releasing samples and various methods for achieving them, ranging from privatizing sufficient statistics and changing the posterior update rule to combining sources of uncertainty through the framework of diffusions and hiding generating distributions through a convex optimization task.

Recommended citation: Geumlek, J. D. (2020). Sampling from Distributions under Differential Privacy Notions. UC San Diego. ProQuest ID: Geumlek_ucsd_0033D_19410. Merritt ID: ark:/13030/m5p03hfr. Retrieved from