Pengqi Liu
Title: Regularization in Finite Mixture of Sparse GLMs with Ultra-High Dimensionality and Convergence of EM Algorithm
Abstract: Finite mixtures of generalized linear regression models (FM-GLM) are used to analyze data arising from populations with unobserved heterogeneity. In recent applications of FM-GLM, data are often collected on a large number of features, and fitting an FM-GLM to such high-dimensional data is numerically challenging. To cope with the high dimensionality in estimation, the model is often assumed to be sparse, so that only a handful of features are relevant to the analysis. Most existing work on sparse estimation is in the context of homogeneous regression or supervised learning problems. In this talk, I will discuss some of the challenges and recent computational and theoretical developments for sparse estimation in FM-GLM when the number of features can grow exponentially in the sample size. I will also present a modified EM algorithm for computing the FM-GLM estimates numerically, and study its convergence theory for finite mixtures of Gaussian regressions with a Lasso penalty.
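As background for the Lasso-penalized EM setting the abstract describes, the following is a minimal sketch of a penalized EM iteration for a two-component finite mixture of Gaussian regressions, with the E-step computing posterior responsibilities and the M-step solving a responsibility-weighted Lasso problem per component by coordinate descent. This is an illustrative textbook-style implementation, not the speaker's modified algorithm; all function names (`soft_threshold`, `lasso_cd`, `em_mixlasso`) and the specific penalty scaling are assumptions made here for concreteness.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the L1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, w, lam, beta, n_iter=50):
    """Weighted Lasso via coordinate descent:
    minimize (1/2) sum_i w_i (y_i - x_i'beta)^2 + lam * ||beta||_1.
    (Sigma scaling in the penalty is omitted here for simplicity.)"""
    r = y - X @ beta                      # current residual
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            xj = X[:, j]
            r += xj * beta[j]             # remove coordinate j's contribution
            beta[j] = soft_threshold(np.sum(w * xj * r), lam) / np.sum(w * xj**2)
            r -= xj * beta[j]             # restore with updated coefficient
    return beta

def em_mixlasso(X, y, lam, K=2, n_iter=100, seed=0):
    """Penalized EM for a K-component mixture of sparse Gaussian regressions."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    pi = np.full(K, 1.0 / K)              # mixing proportions
    beta = rng.normal(scale=0.1, size=(K, p))
    sigma2 = np.full(K, np.var(y))        # component noise variances
    for _ in range(n_iter):
        # E-step: log-densities, stabilized before exponentiating
        logdens = np.stack([
            -0.5 * np.log(2 * np.pi * sigma2[k])
            - 0.5 * (y - X @ beta[k])**2 / sigma2[k]
            + np.log(pi[k]) for k in range(K)])
        logdens -= logdens.max(axis=0)
        gamma = np.exp(logdens)
        gamma /= gamma.sum(axis=0)        # posterior responsibilities
        # M-step: responsibility-weighted Lasso fit per component
        for k in range(K):
            w = gamma[k]
            beta[k] = lasso_cd(X, y, w, lam, beta[k].copy())
            sigma2[k] = np.sum(w * (y - X @ beta[k])**2) / np.sum(w)
            pi[k] = w.mean()
    return pi, beta, sigma2
```

The E- and M-steps here are exact for the stated objective; the theory alluded to in the abstract concerns when such iterates converge, and at what statistical rate, in the ultra-high-dimensional regime.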