Taking advantage of sparsity in multi-task learning.
K. Lounici, M. Pontil, A.B. Tsybakov and S. van de Geer
In: COLT 2009 (2009).
We study the problem of estimating multiple linear regression equations for the purpose of both prediction and variable selection. Following recent work on multi-task learning, we assume that the sparsity patterns of the regression vectors are included in the same set of small cardinality. This assumption leads us to consider the Group Lasso as a candidate estimation method. We show that this estimator satisfies sparsity oracle inequalities and variable selection properties. The results hold under a restricted eigenvalue condition and a coherence condition on the design matrix. In particular, in the multi-task learning scenario, in which the number of tasks can grow, we are able to remove entirely the effect of the number of predictor variables from the bounds. Finally, we show how our results can be extended to more general noise distributions, for which we only require that the noise has finite variance.
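
To make the setting concrete, the following is a minimal numerical sketch of the multi-task Group Lasso estimator the abstract refers to, solved here by proximal gradient descent. It is not the authors' analysis or implementation: for simplicity it assumes all T tasks share one design matrix X (the paper's setting allows a different design per task), and the penalty weight lam, step size, and iteration count are illustrative choices introduced here.

import numpy as np

def group_soft_threshold(B, tau):
    # Row-wise block soft-thresholding: the proximal operator of
    # tau * sum_j ||B[j, :]||_2, which zeroes out entire rows (variables).
    norms = np.linalg.norm(B, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return B * scale

def multitask_group_lasso(X, Y, lam, n_iter=500):
    # Proximal gradient descent for
    #   min_B (1/(2n)) ||Y - X B||_F^2 + lam * sum_j ||B[j, :]||_2,
    # where column t of B is task t's coefficient vector and all tasks
    # share the design X (a simplifying assumption made in this sketch).
    n, p = X.shape
    B = np.zeros((p, Y.shape[1]))
    # Step size 1/L, where L = ||X||_2^2 / n bounds the gradient's
    # Lipschitz constant for the smooth squared-loss term.
    step = n / np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        grad = X.T @ (X @ B - Y) / n  # gradient of the squared loss
        B = group_soft_threshold(B - step * grad, step * lam)
    return B

# Toy example: 3 tasks sharing the same 5 relevant variables out of 50.
rng = np.random.default_rng(0)
n, p, T, s = 100, 50, 3, 5
X = rng.standard_normal((n, p))
B_true = np.zeros((p, T))
B_true[:s, :] = rng.standard_normal((s, T))
Y = X @ B_true + 0.1 * rng.standard_normal((n, T))
B_hat = multitask_group_lasso(X, Y, lam=0.1)
print("selected rows:", np.nonzero(np.linalg.norm(B_hat, axis=1) > 1e-8)[0])

The row-wise penalty is what couples the tasks: it removes a predictor variable from all regression equations simultaneously, which is exactly the shared-sparsity assumption the abstract describes.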