

Morgan Austern (Harvard University)

Friday, November 21, 2025, 13:30 to 14:30
Burnside Hall Room 1104, 805 rue Sherbrooke Ouest, Montreal, QC, H3A 0B9, CA

Title: "Efficient finite sample bounds via optimal transport and debiased ML without sample splitting"

Abstract:

Finite sample bounds are ubiquitous in statistics and machine learning, underpinning applications ranging from multi-armed bandit problems to early stopping rules. However, classical bounds are often overly conservative, leading to suboptimal algorithms. In the first part of this talk, I will propose a method for deriving sharper bounds by bridging the gap between asymptotic limit theorems and finite-sample concentration. We achieve this by exploiting recent advances in optimal transport, Stein's method, and information theory. The resulting bounds are efficient, strictly valid in the finite-sample regime, and significantly tighter than the state of the art. I will demonstrate how these bounds lead to direct algorithmic improvements.
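To illustrate the conservativeness the abstract refers to (this is a generic textbook comparison, not the speaker's method): for the mean of n i.i.d. samples bounded in [0, 1], the classical Hoeffding confidence width is substantially wider than the width suggested by the asymptotic normal approximation, and the gap is exactly what finite-sample refinements of limit theorems aim to close.

```python
import math

def hoeffding_width(n, delta=0.05):
    # Hoeffding's inequality: P(|mean - mu| >= t) <= 2 exp(-2 n t^2),
    # solved for t at confidence level 1 - delta.
    return math.sqrt(math.log(2 / delta) / (2 * n))

def clt_width(n, sigma=0.25, z=1.96):
    # Asymptotic (CLT-based) 95% width, using the worst-case standard
    # deviation sigma = 1/4 for a [0, 1]-valued variable.
    return z * sigma / math.sqrt(n)

for n in (100, 1000, 10000):
    # The finite-sample bound is valid but noticeably wider.
    print(n, round(hoeffding_width(n), 4), round(clt_width(n), 4))
```

Even in this worst-case-variance setting, the Hoeffding width exceeds the asymptotic width by a constant factor at every sample size, which translates directly into extra exploration in bandit algorithms or later stopping in sequential tests.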

In the second part of this talk, we will study the generalized method of moments, a key tool for causal inference. A recent line of work has shown how sample splitting in debiased machine learning enables the use of generic machine learning estimators for the nuisance parameters while maintaining the asymptotic normality and root-n consistency of the target parameter. We show that when these auxiliary estimation algorithms satisfy natural leave-one-out stability properties, sample splitting is not required. This allows for sample re-use, which can be beneficial in moderately sized sample regimes.
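For readers unfamiliar with the sample-splitting step being discussed, here is a minimal sketch of standard two-fold cross-fitting in a partially linear model Y = θD + g(X) + ε (a stock debiased-ML example, not the speaker's construction; the talk's result concerns conditions under which the splitting below can be dropped). The nuisance regressions E[D|X] and E[Y|X] are fit on one fold and residualized on the other; here OLS stands in for a generic ML estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
theta_true = 2.0
X = rng.normal(size=(n, 3))
D = X @ np.array([1.0, -0.5, 0.25]) + rng.normal(size=n)
Y = theta_true * D + X @ np.array([0.5, 0.5, 0.5]) + rng.normal(size=n)

def fit_linear(A, b):
    # Ordinary least squares as a stand-in for an arbitrary ML nuisance fit.
    return np.linalg.lstsq(A, b, rcond=None)[0]

# Two-fold cross-fitting: train nuisances on one half, residualize the other.
idx = rng.permutation(n)
folds = (idx[: n // 2], idx[n // 2 :])
num = den = 0.0
for train, test in (folds, folds[::-1]):
    m_hat = fit_linear(X[train], D[train])   # estimate of E[D | X]
    l_hat = fit_linear(X[train], Y[train])   # estimate of E[Y | X]
    d_res = D[test] - X[test] @ m_hat
    y_res = Y[test] - X[test] @ l_hat
    num += d_res @ y_res
    den += d_res @ d_res

theta_hat = num / den  # debiased estimate of theta
print(round(theta_hat, 3))
```

The splitting exists to decouple the nuisance fits from the data they are evaluated on; the result described in the abstract is that a leave-one-out stability property of the nuisance estimator can play the same role, letting the full sample be used for both steps.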

🔗 Zoom:
Meeting ID: 817 3405 8047

