Posted: 2016-11-03 15:34:00
Title: Scalable Gaussian Process Analysis and Nonparametric Learning
Speaker: Jie Chen (陈捷), Research Staff Member, IBM Thomas J. Watson Research Center
Time: 2016-11-03, 10:30--11:30
Venue: Lecture Hall 401 of the School
Jie Chen is a Research Staff Member at the IBM Thomas J. Watson Research Center. Centered on matrices, his interests span a variety of areas, including numerical linear algebra, scientific computing, parallel processing, applied statistics, machine learning, and artificial intelligence. His work has been published in leading journals in applied mathematics and computer science, including a paper awarded the Student Paper Prize by the Society for Industrial and Applied Mathematics. Jie received his undergraduate degree in Mathematics from Zhejiang University and his Ph.D. in Computer Science from the University of Minnesota. He worked at Argonne National Laboratory before joining IBM. More information can be found on his personal homepage: http://jie-chen-ibm.appspot.com/
Gaussian processes are a cornerstone of statistical analysis, broadly used in disciplines such as scientific computing and machine learning. Example applications include quantifying the simulation uncertainty produced by stochastic inputs, designing effective computer experiments with a large number of parameters, and recognizing patterns in speech, image, and text data. While theoretically well grounded, Gaussian processes pose a significant computational challenge, because the arithmetic costs are generally cubic in the data size. In this talk, I present several lines of work that address this challenge for large-scale data, for tasks such as sampling, prediction, and parameter estimation. These efforts hint at a quest for a unifying treatment of kernel matrices that are fully dense but structured. I will conclude the talk by presenting an approach that establishes a linear-complexity framework for handling these matrices, particularly in the high-dimensional setting that is currently a major hurdle.
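To make the cubic-cost bottleneck concrete, the sketch below (not from the talk; an illustrative NumPy example with an assumed squared-exponential kernel and hypothetical parameter choices) performs standard dense Gaussian process regression. The Cholesky factorization of the n-by-n kernel matrix is the O(n^3) step that motivates the scalable methods described in the abstract.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    # Squared-exponential (RBF) kernel matrix between two 1-D point sets.
    # Parameter choices here are illustrative, not from the talk.
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior_mean(X_train, y_train, X_test, noise=1e-6):
    # Dense GP prediction: factorizing the n x n kernel matrix costs
    # O(n^3) time and O(n^2) memory -- the scalability bottleneck that
    # structured-kernel-matrix methods aim to remove.
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    L = np.linalg.cholesky(K)                                  # O(n^3)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))  # O(n^2)
    K_star = rbf_kernel(X_test, X_train)
    return K_star @ alpha

# Tiny example: fit noisy samples of sin(x) and predict at two new points.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 2.0 * np.pi, 50)
y = np.sin(X) + 0.01 * rng.standard_normal(50)
X_new = np.array([np.pi / 2.0, np.pi])
mean = gp_posterior_mean(X, y, X_new, noise=1e-4)
```

Doubling the number of training points roughly multiplies the Cholesky cost by eight, which is why dense GP inference becomes impractical beyond tens of thousands of observations and why linear-complexity treatments of the kernel matrix matter.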