Speaker: Junxiong Jia (Xi'an Jiaotong University)
Time: 2024-06-17, 14:00-15:00
Location: Tencent Meeting, ID: 993 769 354 (Password: 0617)
Abstract
Statistical inverse problems of partial differential equations (PDEs) can be viewed as PDE-constrained regression problems. From this perspective, we propose general generalization bounds for learning infinite-dimensionally defined prior measures, in the style of probably approximately correct (PAC) Bayesian learning theory. The theoretical framework is rigorously defined on infinite-dimensional separable function spaces, which connects the theory intimately to the usual infinite-dimensional Bayesian inverse approach. Inspired by the concept of differential privacy, we propose a generalized condition that allows the learned prior measures to depend on the measured data. After illustrating the general theory, we present specific settings for linear and nonlinear problems, which can easily be cast into the general framework to obtain concrete generalization bounds. Based on the obtained generalization bounds, we formulate practical algorithms that are well defined in infinite dimensions.
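For readers unfamiliar with PAC-Bayesian generalization bounds, a classical finite-dimensional statement (the McAllester-Maurer bound for a loss bounded in [0,1]) is sketched below; the talk concerns bounds of this general type, but formulated for prior measures on infinite-dimensional separable function spaces. The notation here (data distribution \(\mathcal{D}\), sample \(S\) of size \(n\), reference prior \(\pi\), posterior \(\rho\), population and empirical losses \(L_{\mathcal{D}}\) and \(\widehat{L}_S\)) is standard background and is not taken from the talk itself:
\[
\mathbb{E}_{h\sim\rho}\big[L_{\mathcal{D}}(h)\big]
\;\le\;
\mathbb{E}_{h\sim\rho}\big[\widehat{L}_S(h)\big]
\;+\;
\sqrt{\frac{\mathrm{KL}(\rho\,\|\,\pi)+\ln\!\big(2\sqrt{n}/\delta\big)}{2n}},
\]
which holds with probability at least \(1-\delta\) over the draw of \(S\), simultaneously for all posteriors \(\rho\).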