In:
PLOS Computational Biology, Public Library of Science (PLoS), Vol. 17, No. 6 (2021-06-30), p. e1009086
Abstract:
Clustering high-dimensional data, such as images or biological measurements, is a long-standing problem and has been studied extensively. Recently, deep clustering has gained popularity due to its flexibility in fitting the specific peculiarities of complex data. Here we introduce the Mixture-of-Experts Similarity Variational Autoencoder (MoE-Sim-VAE), a novel generative clustering model. The model can learn multi-modal distributions of high-dimensional data and use these to generate realistic data with high efficacy and efficiency. MoE-Sim-VAE is based on a Variational Autoencoder (VAE) whose decoder consists of a Mixture-of-Experts (MoE) architecture. This architecture allows the various modes of the data to be learned automatically by the individual experts. Additionally, we encourage the lower-dimensional latent representation of our model to follow a Gaussian mixture distribution and to accurately represent the similarities between the data points. We assess the performance of our model on the MNIST benchmark dataset and on challenging real-world tasks: clustering mouse organs from single-cell RNA-sequencing measurements and defining cell subpopulations from mass cytometry (CyTOF) measurements on hundreds of different datasets. MoE-Sim-VAE exhibits superior clustering performance on all these tasks compared to baselines and competing methods.
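The core architectural idea in the abstract, a decoder built as a mixture of experts whose outputs are blended by a gating distribution over the latent code, can be illustrated with a minimal numpy sketch. This is a hypothetical toy with random linear "experts" and invented dimensions, not the authors' implementation (which uses trained neural networks for the experts and the gate).

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Illustrative sizes only (the paper's dimensions differ per dataset).
latent_dim, data_dim, n_experts = 2, 8, 3

# One linear "expert" decoder per mixture component. In MoE-Sim-VAE each
# expert is a neural network specialized to one mode of the data; here
# random weight matrices stand in for trained experts.
expert_W = rng.normal(size=(n_experts, latent_dim, data_dim))

# Gating network: maps a latent point to a distribution over experts,
# so each mode of the latent space is routed to its own decoder.
gate_W = rng.normal(size=(latent_dim, n_experts))

def moe_decode(z):
    """Decode a batch of latent codes as a gate-weighted expert mixture."""
    gates = softmax(z @ gate_W)                   # (batch, n_experts), rows sum to 1
    outs = np.einsum('bl,eld->bed', z, expert_W)  # every expert decodes every z
    return np.einsum('be,bed->bd', gates, outs)   # blend expert outputs per sample

z = rng.normal(size=(4, latent_dim))  # stand-in for samples from the GMM latent
x = moe_decode(z)
print(x.shape)  # (4, 8): one reconstructed data point per latent code
```

When the gating distribution becomes nearly one-hot during training, each expert effectively generates one cluster of the data, which is what lets the mixture structure of the latent space carry over to the generated samples.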
Type of Medium:
Online Resource
ISSN:
1553-7358
DOI:
10.1371/journal.pcbi.1009086
Component DOIs:
10.1371/journal.pcbi.1009086.g001 – .g004 (figures)
10.1371/journal.pcbi.1009086.t001 – .t003 (tables)
10.1371/journal.pcbi.1009086.s001 – .s012 (supporting information)
10.1371/journal.pcbi.1009086.r001 – .r004 (peer review history)
Language:
English
Publisher:
Public Library of Science (PLoS)
Publication Date:
2021
ZDB-ID:
2193340-6