Generative Copula Modeling with Neural Networks

  • Speaker: Marius Hofert (University of Hong Kong)

  • Time: May 7, 2026, 15:00-16:00

  • Location: Room M714, College of Science Building

Abstract
Generative moment matching networks (GMMNs) are introduced as dependence models with two particularly important applications in mind. First, they can generate approximate quasi-random samples from multivariate models with any underlying copula in order to compute estimates under variance reduction. Once trained on pseudo-random samples from a parametric model or on real data, GMMNs only require a multivariate standard uniform randomized quasi-Monte Carlo (QMC) point set as input and are thus fast at estimating expectations of interest under dependence with variance reduction. Second, GMMNs can learn maps from d-dimensional samples with any underlying dependence structure to multivariate uniformity in d' dimensions, which can be used for dependence model assessment and selection. Besides a numerical assessment, the case d' = 2 is of particular interest, as it allows for a graphical assessment and selection approach. A distinct feature of this approach is that it allows one to identify regions of the domain in which a candidate model does not provide an adequate fit. Both applications work well in dimensions up to around 20 with a simple underlying GMMN architecture, but deteriorate thereafter. An enhanced training procedure, based on adaptively selecting the bandwidths of the mixture kernel of the underlying maximum mean discrepancy (MMD) loss, allows copulas in dimensions as large as 100 to be learned for the first time, with the moderate increase in run time offset by an early stopping criterion.
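The maximum mean discrepancy loss mentioned in the abstract compares two samples through a kernel averaged over several bandwidths. As a minimal illustrative sketch (not the speaker's implementation; the Gaussian kernel form and the bandwidth values here are arbitrary assumptions), a biased MMD² estimate with a mixture kernel can be computed as follows:

```python
import numpy as np

def mixture_rbf_kernel(X, Y, bandwidths):
    """Gaussian (RBF) kernel averaged over a set of bandwidths."""
    # Pairwise squared Euclidean distances between rows of X and Y
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return sum(np.exp(-d2 / (2 * h**2)) for h in bandwidths) / len(bandwidths)

def mmd2(X, Y, bandwidths=(0.5, 1.0, 2.0)):
    """Biased estimate of the squared maximum mean discrepancy between
    the samples X and Y (rows are observations)."""
    Kxx = mixture_rbf_kernel(X, X, bandwidths)
    Kyy = mixture_rbf_kernel(Y, Y, bandwidths)
    Kxy = mixture_rbf_kernel(X, Y, bandwidths)
    return Kxx.mean() + Kyy.mean() - 2 * Kxy.mean()
```

Minimizing such a loss drives the network's output sample toward the target sample in distribution; the adaptive bandwidth selection described in the abstract would replace the fixed tuple above.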