Abstract
In practice, multivariate skew normal mixture (MSNM) models provide a more flexible framework than multivariate normal mixture models, especially for heterogeneous and asymmetric data. For MSNM models, however, the maximum likelihood estimator often exhibits undesirable inferential behavior, sometimes referred to as "badness," because the likelihood function is unbounded and the shape parameters can diverge. We consider two penalties on the log-likelihood function to address both issues simultaneously in MSNM models. We show that the penalized maximum likelihood estimator is strongly consistent when the putative order of the mixture is equal to or larger than the true order. We also provide penalized expectation-maximization (EM)-type algorithms to compute the penalized estimates. Finite-sample performance is examined through simulations, real data applications, and comparisons with existing methods.
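To make the penalization idea concrete, the sketch below shows a generic penalized log-likelihood for a K-component MSNM model; the particular penalty functions $\tilde p_1$, $\tilde p_2$ and the symbols used are illustrative assumptions, not the exact forms adopted in the paper.

```latex
% Hedged sketch (illustrative notation): a K-component MSNM log-likelihood with
% two additive penalties, one on the scale matrices (to bound the likelihood)
% and one on the shape parameters (to prevent their divergence).
\ell_n(\theta) = \sum_{i=1}^{n} \log \Bigg( \sum_{k=1}^{K} \pi_k \,
  f_{\mathrm{SN}}\!\left(x_i \mid \mu_k, \Sigma_k, \lambda_k\right) \Bigg),
\qquad
p\ell_n(\theta) = \ell_n(\theta)
  + \sum_{k=1}^{K} \tilde p_1(\Sigma_k)
  + \sum_{k=1}^{K} \tilde p_2(\lambda_k),
```

where $f_{\mathrm{SN}}$ denotes the multivariate skew normal density, $\pi_k$ are the mixing proportions, and the penalized maximum likelihood estimator is a maximizer of $p\ell_n(\theta)$.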
| Original language | English |
|---|---|
| Pages (from-to) | 8280-8305 |
| Number of pages | 26 |
| Journal | Communications in Statistics - Theory and Methods |
| Volume | 52 |
| Issue number | 23 |
| DOIs | |
| State | Published - 2023 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2022 Taylor & Francis Group, LLC.
Keywords
- Consistency
- EM-algorithm
- finite mixture
- penalized likelihood
- skewness
ASJC Scopus subject areas
- Statistics and Probability