PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Learning Deep Neural Networks for High Dimensional Output Problems
Benjamin Labbe, Romain Herault and Clement Chatelain
In: ICMLA 2009, 13-15 Dec. 2009, Miami Beach Resort & Spa, 4833 Collins Ave, Miami Beach, Florida, USA.


State-of-the-art pattern recognition methods have difficulty dealing with problems where the dimension of the output space is large. In this article, we propose a new framework based on deep architectures (e.g. Deep Neural Networks) to deal with this issue. Deep architectures have proven efficient for high dimensional input problems such as image classification, thanks to their ability to embed the input space. The main contribution of this article is the extension of the embedding procedure to both the input and output spaces, so that high dimensional output problems can be handled easily. With this extension, inter-output dependencies can be modelled efficiently, providing an interesting alternative to probabilistic models such as HMM and CRF. Preliminary experiments on toy datasets and USPS character reconstruction show promising results.
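The architecture described above can be sketched as a stack of three parts: an encoder that embeds the high dimensional input, a link mapping between the two embedded spaces, and a decoder that maps the output embedding back to the high dimensional target. The following is a minimal illustrative sketch with NumPy; the layer sizes, names, and activations are assumptions for illustration, not details taken from the paper, and training (e.g. layer-wise pretraining of the two embeddings) is omitted.

```python
import numpy as np

# Hypothetical sketch of an input-output deep architecture:
# - an encoder embeds the high dimensional input x into a low
#   dimensional code,
# - a link layer maps input codes to output codes,
# - a decoder (the inverted output embedding) reconstructs the
#   high dimensional target y from its code.
# All sizes below are illustrative assumptions.

rng = np.random.default_rng(0)

def layer(n_in, n_out):
    # One dense layer: small random weights, zero bias.
    return rng.normal(0.0, 0.1, (n_in, n_out)), np.zeros(n_out)

def forward(x, params):
    # Apply a stack of dense layers with tanh activations.
    h = x
    for W, b in params:
        h = np.tanh(h @ W + b)
    return h

d_in, d_out, d_code = 256, 256, 30     # high dimensional input/output

encoder = [layer(d_in, 100), layer(100, d_code)]    # input embedding
link    = [layer(d_code, d_code)]                   # code-to-code map
decoder = [layer(d_code, 100), layer(100, d_out)]   # output embedding, inverted

x = rng.normal(size=(5, d_in))          # a batch of 5 inputs
y_hat = forward(x, encoder + link + decoder)
print(y_hat.shape)                      # (5, 256)
```

Because the decoder is trained as the inverse of an output embedding, dependencies between output dimensions are captured in the low dimensional code, which is what makes this an alternative to explicitly structured models such as HMM or CRF.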

EPrint Type: Conference or Workshop Item (Oral)
Project Keyword: UNSPECIFIED
Subjects: Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 6648
Deposited By: Romain Herault
Deposited On: 08 March 2010