## Abstract

A nonparametric Bayesian extension of Independent Components Analysis (ICA) is proposed where observed data Y is modelled as a linear superposition, G, of a potentially infinite number of hidden sources, X. Whether a given source is active for a specific data point is specified by an infinite binary matrix, Z. The resulting sparse representation allows increased data reduction compared to standard ICA. We define a prior on Z using the Indian Buffet Process (IBP). We describe four variants of the model, with Gaussian or Laplacian priors on X and the one- or two-parameter IBP. We demonstrate Bayesian inference under these models using a Markov Chain Monte Carlo (MCMC) algorithm on synthetic and gene expression data and compare to standard ICA algorithms.
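The generative process the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the standard formulation in which each observation is a noisy linear mixture Y = G(Z ∘ X) + E, with Z drawn from the one-parameter IBP (customers are data points, dishes are sources); the function name, hyperparameter values, and noise level are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_ibp(num_points, alpha, rng):
    """Sample a binary matrix Z (num_points x K) from the one-parameter IBP.

    K, the number of active sources, is not fixed in advance; it grows
    as new 'dishes' (sources) are introduced.
    """
    rows = []
    counts = []  # counts[k] = how many earlier points use source k
    for n in range(num_points):
        row = [0] * len(counts)
        # existing sources: point n uses source k with prob counts[k] / (n + 1)
        for k, m_k in enumerate(counts):
            if rng.random() < m_k / (n + 1):
                row[k] = 1
                counts[k] += 1
        # brand-new sources: Poisson(alpha / (n + 1)) of them
        for _ in range(rng.poisson(alpha / (n + 1))):
            row.append(1)
            counts.append(1)
        rows.append(row)
    K = len(counts)
    return np.array([r + [0] * (K - len(r)) for r in rows])

# Generative sketch (Gaussian-source variant):
N, D, alpha = 50, 5, 2.0                     # points, observed dims, IBP rate
Z = sample_ibp(N, alpha, rng)                # N x K binary activity matrix
K = Z.shape[1]                               # inferred number of sources
X = rng.normal(size=(N, K))                  # hidden source values
G = rng.normal(size=(D, K))                  # mixing matrix
Y = (Z * X) @ G.T + 0.1 * rng.normal(size=(N, D))  # observed data
print(Y.shape)  # (50, 5)
```

The sparsity enters through Z: a source contributes to a data point only where the corresponding entry of Z is 1, which is what allows greater data reduction than standard ICA, where every source is active for every point.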