Jie Wang: Entropic Regularization for Wasserstein Distributionally Robust Optimization
Time: Tue 2026-05-05 13.15 - 14.15
Location: 3721 (Lindstedtsvägen 25)
Participating: Jie Wang (Chinese University of Hong Kong, Shenzhen)
Abstract: Wasserstein distributionally robust optimization (WDRO) can be computationally intractable. To address this challenge, we develop a novel approach that incorporates entropic regularization into the distributionally robust risk function. This regularization yields a notable computational improvement over the original formulation. We develop efficient stochastic gradient methods with biased oracles to optimize the regularized objective, proving that our approach achieves near-optimal sample complexity. Furthermore, by leveraging state-of-the-art diffusion models, we develop a method to sample from the worst-case distribution. This technique achieves global convergence under mild assumptions by employing tools from bilevel optimization in the space of continuous probability densities. We numerically validate the proposed method on supervised learning, reinforcement learning, and contextual learning tasks.
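
To give a flavor of the objective discussed in the talk, the sketch below estimates an entropically smoothed worst-case loss around a single data point via Monte Carlo. This is an illustrative reconstruction, not the speaker's implementation: the function name, the Gaussian perturbation kernel, and all parameter choices (`lam`, `eps`, `sigma`, `m`) are assumptions. It also shows why "biased oracles" arise: the plug-in log-of-mean estimator of a log-expectation is biased for finite sample size.

```python
import numpy as np

def entropic_robust_loss(theta, x, loss, lam=1.0, eps=0.1, sigma=0.1, m=64, rng=None):
    """Monte Carlo sketch (assumed form) of an entropically smoothed
    worst-case loss around a data point x:

        R(theta; x) = eps * log E_{z ~ N(x, sigma^2 I)}[
                          exp((loss(theta, z) - lam * ||z - x||^2) / eps) ]

    The log of an empirical mean is a biased estimator of the
    log-expectation; the bias vanishes as m grows, which is why
    stochastic gradient methods with biased oracles are relevant here.
    """
    rng = np.random.default_rng(rng)
    z = x + sigma * rng.standard_normal((m, x.size))    # perturbed samples z ~ N(x, sigma^2 I)
    vals = np.array([loss(theta, zi) for zi in z])      # per-sample losses
    cost = lam * np.sum((z - x) ** 2, axis=1)           # transport-cost penalty
    a = (vals - cost) / eps
    amax = a.max()                                      # numerically stable log-sum-exp
    return eps * (amax + np.log(np.mean(np.exp(a - amax))))
```

As `eps` shrinks, the smoothed value approaches the hard worst case over perturbations, `max_z [loss - lam * cost]`; as `eps` grows, it approaches the average loss under the perturbation kernel, so `eps` interpolates between robust and nominal objectives.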
