
Unifying Non-Convex Low-Rank Matrix Recovery Algorithms by Riemannian Gradient Descent

  • Speaker: Jian-Feng Cai (蔡剑锋), Hong Kong University of Science and Technology

  • Time: 2022-08-24, 15:00-16:00

  • Venue: Tencent Meeting, ID 729-760-405, password 220824

Abstract 

The problem of low-rank matrix recovery from linear samples arises in numerous practical applications in machine learning, imaging, signal processing, computer vision, etc. Non-convex algorithms are usually very efficient and effective for low-rank matrix recovery, and they often come with theoretical guarantees despite the possible presence of local minima. In this talk, non-convex low-rank matrix recovery algorithms are unified under the framework of Riemannian gradient descent. We show that many popular non-convex low-rank matrix recovery algorithms are special cases of Riemannian gradient descent with different Riemannian metrics and retraction operators. Moreover, by considering the properties of the sampling operators for different tasks such as matrix completion and phase retrieval, we identify the best choice of metrics and construct the most efficient non-convex algorithms for low-rank matrix recovery.
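To make the framework concrete, the sketch below shows a plain Riemannian gradient descent iteration on the manifold of rank-r matrices, applied to matrix completion. It is an illustrative assumption of what such an iteration looks like, not the speaker's specific algorithm; the function name rgd_matrix_completion, the constant step size, the spectral initialization, and the iteration count are all placeholders.

```python
import numpy as np

def rgd_matrix_completion(M_obs, mask, r, step=1.0, iters=200):
    """Illustrative Riemannian gradient descent for rank-r matrix completion.

    M_obs : (m, n) array holding the observed entries (zeros elsewhere).
    mask  : (m, n) 0/1 array marking which entries are observed.
    """
    # Spectral initialization: truncated SVD of the zero-filled observations.
    U, s, Vt = np.linalg.svd(mask * M_obs, full_matrices=False)
    X = (U[:, :r] * s[:r]) @ Vt[:r, :]

    for _ in range(iters):
        # Euclidean gradient of 0.5 * ||P_Omega(X - M)||_F^2.
        G = mask * (X - M_obs)

        # Project the gradient onto the tangent space of the rank-r manifold at X.
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        U, Vt = U[:, :r], Vt[:r, :]
        PG = U @ (U.T @ G) + (G @ Vt.T) @ Vt - U @ (U.T @ G @ Vt.T) @ Vt

        # Gradient step, then retract back to rank r via truncated SVD.
        Y = X - step * PG
        U2, s2, V2t = np.linalg.svd(Y, full_matrices=False)
        X = (U2[:, :r] * s2[:r]) @ V2t[:r, :]

    return X
```

Each iteration has exactly two manifold-specific ingredients, the metric used to form the (projected) gradient and the retraction back to rank r; the point of the talk is that different choices for these two ingredients recover different well-known non-convex recovery algorithms.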


Short bio 
Jian-Feng Cai (蔡剑锋) is a Professor in the Department of Mathematics at the Hong Kong University of Science and Technology. He received his bachelor's degree from Fudan University in 2000 and his PhD from the Chinese University of Hong Kong in 2007. He previously held positions at the National University of Singapore, the University of California, Los Angeles, and the University of Iowa. His research interests are the design and analysis of algorithms in data science and imaging. He was named a Highly Cited Researcher in 2017 and 2018.