PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Gaussian Processes and its Application to the design of Digital Communication Receivers
Pablo Olmos, Juan Jose Murillo-Fuentes and Fernando Perez-Cruz
In: Application of Machine Learning (2010) InTech, pp. 181-206. ISBN 978-953-307-035-3

Abstract

In this chapter, we introduce Gaussian processes for machine learning and their application to designing digital communication receivers. Gaussian processes for machine learning are Bayesian nonlinear tools for solving regression and classification problems. Gaussian processes for regression (GPR) were introduced in the mid-nineties to solve nonparametric estimation problems from a Bayesian perspective. They place a Gaussian process (GP) prior over the possible regressors and use the available data to obtain a posterior regressor, which is able to explain the observations without overfitting. The covariance matrix of the GP prior describes the different solutions that can be achieved, e.g. linear, polynomial, or universal regressors. The solution of GPR is analytical given its covariance function and, besides providing point estimates, it also assigns confidence intervals to the predictions. Furthermore, the covariance function can be optimized by maximum likelihood to better represent the data, which adds flexibility to our regression approximation. GPR can be generalized to solve classification problems, namely Gaussian processes for classification (GPC). GPC extends the idea of GPR to a classification likelihood model. For this likelihood, the GPC posterior is no longer analytically tractable and we need to approximate it. Expectation Propagation (EP), which matches the mean and covariance of the GP posterior to a Gaussian distribution, is the most widely used approximation. Unlike most state-of-the-art classifiers, GPC does not return point-wise decisions, but provides an accurate posterior probability for each classification decision. This is a major advantage that subsequent processing stages can exploit to reduce the base error produced by our nonlinear classifiers.
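The analytic GPR posterior described above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the chapter: it assumes a squared-exponential (RBF) covariance with fixed hyperparameters (`length_scale`, `noise_var` are illustrative values, not ones the authors report) and returns the posterior mean together with the per-point variance that yields the confidence intervals mentioned above.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, signal_var=1.0):
    """Squared-exponential (RBF) covariance between two sets of inputs."""
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2.0 * X1 @ X2.T
    return signal_var * np.exp(-0.5 * d2 / length_scale**2)

def gpr_predict(X_train, y_train, X_test, noise_var=0.01, **kern_args):
    """Analytic GPR posterior mean and variance at the test inputs."""
    K = rbf_kernel(X_train, X_train, **kern_args) + noise_var * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test, **kern_args)      # train/test covariance
    K_ss = rbf_kernel(X_test, X_test, **kern_args)      # test covariance
    L = np.linalg.cholesky(K)                           # stable solve via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)          # predictive variance
    return mean, var

# Toy 1-D regression: noisy observations of a sine wave.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(20, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(20)
X_star = np.linspace(-3, 3, 50)[:, None]
mu, var = gpr_predict(X, y, X_star)
```

The variance term shrinks near training inputs and grows away from them, which is exactly the confidence information that point-estimate regressors lack.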
Nonlinear regression and classification techniques have been widely used for designing digital communication receivers for nonlinear channels, or whenever there is little information about the channel model. These nonlinear tools must use short training sequences to learn the channel and to adapt to a wide range of scenarios, from linear minimum phase to nonlinear and non-minimum phase, and from single-user to multi-user scenarios. In this framework, Gaussian processes for machine learning can be used instead of other nonlinear tools, such as neural networks or support vector machines, providing several advantages over these widespread techniques. First, their structure can be learnt by maximum likelihood. Hence, we avoid cross-validation techniques, which reduces the number of training samples needed to provide accurate predictions. At the same time, we may learn more parameters than other state-of-the-art techniques allow, i.e., we have more flexible models that can easily range from linear to intricate nonlinear solutions. Second, they provide accurate posterior probability estimates that can be exploited by the channel decoder to reduce the overall error rate of our communication system. We analytically study how Gaussian processes for machine learning can replace other nonlinear techniques for designing the all-important digital communication receiver. We also present some covariance matrices suitable for general digital communication channels. We illustrate our theoretical results by showing how GPR provides accurate solutions to the channel equalization and multi-user detection problems with very short training sequences, and how a low-density parity-check (LDPC) channel decoder might benefit from a GPC equalizer that provides accurate posterior probability estimates.
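To make the equalization setting concrete, the following sketch trains a GPR equalizer on a short training sequence. Everything here is an illustrative assumption rather than the chapter's actual setup: a made-up 2-tap channel `h`, a `tanh` nonlinearity standing in for amplifier saturation, BPSK symbols, a fixed RBF covariance, and a hard decision taken as the sign of the posterior mean.

```python
import numpy as np

def rbf(X1, X2, ell=1.0):
    """RBF covariance between received-sample windows."""
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2.0 * X1 @ X2.T
    return np.exp(-0.5 * d2 / ell**2)

rng = np.random.default_rng(1)
h = np.array([1.0, 0.5])            # illustrative 2-tap channel (not from the chapter)
n_train, n_test, win = 100, 500, 2  # short training sequence; 2-sample input window

def simulate(n):
    s = rng.choice([-1.0, 1.0], size=n + win)              # BPSK symbols
    r = np.convolve(s, h, mode="full")[:len(s)]            # linear ISI channel
    r = np.tanh(r) + 0.05 * rng.standard_normal(len(r))    # saturation + noise
    X = np.stack([r[i:i + win] for i in range(n)])         # received windows
    return X, s[:n]                                        # windows, target symbols

Xtr, ytr = simulate(n_train)
Xte, yte = simulate(n_test)

# GPR posterior mean at the test windows; sign gives the symbol decision.
K = rbf(Xtr, Xtr) + 0.05 * np.eye(n_train)
mean = rbf(Xtr, Xte).T @ np.linalg.solve(K, ytr)
ber = np.mean(np.sign(mean) != yte)
```

With only 100 training symbols the nonlinear GPR equalizer recovers the transmitted sequence with a low bit error rate on this toy channel; in a coded system, the posterior mean (or GPC posterior probabilities) could be passed to an LDPC decoder as soft information instead of being thresholded.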

EPrint Type: Book Section
Project Keyword: UNSPECIFIED
Subjects: Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 7548
Deposited By: Fernando Perez-Cruz
Deposited On: 17 March 2011