
IEEE Trans Neural Netw Learn Syst. 2018 Apr;29(4):1187-1199. doi: 10.1109/TNNLS.2016.2619061. Epub 2017 Feb 27.

Simultaneous Bayesian Clustering and Feature Selection Through Student's $t$ Mixtures Model.


Jianyong Sun, Aimin Zhou, Simeon Keates, Shengbin Liao

PMID: 28362615 DOI: 10.1109/TNNLS.2016.2619061

Abstract

In this paper, we propose a generative model for feature selection in the unsupervised learning setting. The model assumes that the data are independently and identically sampled from a finite mixture of Student's t-distributions, which reduces sensitivity to outliers. Latent random variables representing feature saliency are included in the model to indicate the relevance of each feature. As a result, the model is expected to perform clustering, feature selection, and outlier detection simultaneously. Inference is carried out by a tree-structured variational Bayes algorithm, and a full Bayesian treatment is adopted to realize automatic model selection. Controlled experimental studies showed that the developed model can accurately model data sets containing outliers. Furthermore, experimental results showed that the developed algorithm compares favorably against existing unsupervised probabilistic model-based Bayesian feature selection algorithms on artificial and real data sets. Moreover, applying the developed algorithm to real leukemia gene expression data indicated that it successfully identifies the discriminating genes.
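To make the model described in the abstract concrete, a minimal sketch of a feature-saliency mixture with Student's t components can be written as follows. This is an illustrative formulation in the spirit of standard feature-saliency mixture models, not necessarily the paper's exact parameterization; the symbols $\phi_d$ (saliency of feature $d$), $\mathcal{S}(\cdot \mid \mu, \sigma^2, \nu)$ (a univariate Student's t density), and the shared background parameters $(\mu_{0d}, \sigma_{0d}^2, \nu_{0d})$ are assumptions introduced here for illustration:

$$
p(\mathbf{x}) \;=\; \sum_{k=1}^{K} \pi_k \prod_{d=1}^{D} \Big[\, \phi_d\, \mathcal{S}\big(x_d \mid \mu_{kd}, \sigma_{kd}^2, \nu_{kd}\big) \;+\; (1-\phi_d)\, \mathcal{S}\big(x_d \mid \mu_{0d}, \sigma_{0d}^2, \nu_{0d}\big) \Big],
$$

where $\pi_k$ are the mixing proportions and the degrees of freedom $\nu$ give the heavy tails that make the model less sensitive to outliers (the Student's t density approaches a Gaussian as $\nu \to \infty$). A feature with saliency $\phi_d \approx 0$ is explained by the common background density shared across all components and is therefore treated as irrelevant for clustering, which is how clustering and feature selection are coupled in a single generative model.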
