
[Math Department Invited Talk] Continuous Optimization for Directed Acyclic Graph (DAG) Learning

Abstract

Learning a DAG from observed samples of an unknown joint distribution is generally a challenging combinatorial problem, owing to a search space that grows superexponentially in the number of graph nodes. A recent breakthrough formulates the problem as continuous optimization with a smooth structural constraint that ensures acyclicity (NOTEARS; Zheng et al., 2018); this opens the problem to a suite of continuous optimization techniques, with the constraint enforced by an augmented Lagrangian method.
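
For readers unfamiliar with the formulation, the sketch below illustrates the smooth acyclicity function h(W) = tr(exp(W ∘ W)) − d from Zheng et al. (2018), which is zero exactly when the weighted adjacency matrix W encodes a DAG. The function name and the example matrices are illustrative, not code from the talk.

```python
# Minimal sketch of the NOTEARS acyclicity function and its gradient.
import numpy as np
from scipy.linalg import expm

def notears_acyclicity(W: np.ndarray):
    """Return h(W) = tr(exp(W ∘ W)) - d and its gradient (exp(W ∘ W))^T ∘ 2W."""
    d = W.shape[0]
    E = expm(W * W)           # matrix exponential of the elementwise square
    h = np.trace(E) - d       # h(W) = 0 exactly when W has no directed cycles
    grad = E.T * 2 * W        # gradient via the trace-of-exponential identity
    return h, grad

# A 2-cycle gives h > 0, while an acyclic W gives h = 0.
W_cyclic = np.array([[0.0, 1.0], [1.0, 0.0]])
W_dag    = np.array([[0.0, 1.0], [0.0, 0.0]])
print(notears_acyclicity(W_cyclic)[0] > 0)             # True
print(np.isclose(notears_acyclicity(W_dag)[0], 0.0))   # True
```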

However, for large graphs, the augmented Lagrangian method in NOTEARS can take many iterations to converge. In this talk, we propose new continuous optimization algorithms that aim to improve on NOTEARS in both efficiency and accuracy. We first show that the Karush-Kuhn-Tucker (KKT) optimality conditions for the NOTEARS formulation cannot be satisfied except in a trivial case, which explains the observed slow convergence. We then derive the KKT conditions for an equivalent reformulation, show that they are indeed necessary, and relate them to explicit constraints that certain edges be absent from the graph. Informed by the KKT conditions, we propose a local search post-processing algorithm and show that it substantially and universally improves learning accuracy, typically by a factor of 2 or more. Lastly, we consider a reformulation of the DAG space and propose a new framework for DAG structure learning that searches over this equivalent set of DAGs; a fast projection method is developed, yielding a continuous optimization approach free of the acyclicity constraint. Experimental studies on benchmark datasets demonstrate that our method achieves comparable accuracy with better efficiency, often by more than an order of magnitude.
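
To make the convergence issue concrete, here is a minimal, self-contained sketch of the augmented Lagrangian outer loop that NOTEARS-style methods use, assuming a simple linear least-squares loss. The update schedule (multiplying rho by 10, thresholding at 0.3) and the function names are illustrative assumptions, not the talk's algorithm; the point is that driving h(W) to zero requires repeated outer passes with an ever-larger penalty, which is where the many iterations come from.

```python
# Illustrative augmented Lagrangian loop for min loss(W) s.t. h(W) = 0.
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

def fit_dag_least_squares(X, rho=1.0, alpha=0.0, rho_max=1e16, h_tol=1e-8):
    """Estimate a weighted adjacency matrix W from data X (n samples x d nodes)."""
    n, d = X.shape
    w = np.zeros(d * d)

    def h_and_grad(W):
        E = expm(W * W)                     # same acyclicity function as above
        return np.trace(E) - d, E.T * 2 * W

    def objective(w_flat):
        W = w_flat.reshape(d, d)
        R = X - X @ W                       # linear-model residual
        loss = 0.5 / n * (R ** 2).sum()
        g_loss = -(X.T @ R) / n
        h, gh = h_and_grad(W)
        # Augmented Lagrangian: loss + alpha * h + (rho / 2) * h^2
        f = loss + alpha * h + 0.5 * rho * h ** 2
        g = g_loss + (alpha + rho * h) * gh
        return f, g.ravel()

    h = np.inf
    while h > h_tol and rho < rho_max:      # outer augmented Lagrangian loop
        w = minimize(objective, w, jac=True, method="L-BFGS-B").x
        h, _ = h_and_grad(w.reshape(d, d))
        alpha += rho * h                    # dual (multiplier) update
        rho *= 10.0                         # tighten the penalty each pass
    W = w.reshape(d, d)
    W[np.abs(W) < 0.3] = 0.0                # prune small weights to read off edges
    return W
```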

Biography

Dr. Yue Yu received her B.S. from Peking University in 2008 and her Ph.D. from Brown University in 2014. She was a postdoctoral fellow at Harvard University before joining Lehigh University as an assistant professor of applied mathematics, and she was promoted to associate professor in 2019. Dr. Yu's research lies in the area of applied and computational mathematics, with recent projects focusing on nonlocal problems and machine learning. She received the NSF-DMS early career award in 2018.