Covariate shift adaptation by importance weighted cross validation
Masashi Sugiyama, Matthias Krauledat and Klaus-Robert Müller
Journal of Machine Learning Research
A common assumption in supervised learning is that the input points in the training set follow
the same probability distribution as the input points that will be given in the future test phase.
However, this assumption is violated, for example, when predictions must be extrapolated
outside the region covered by the training data. The situation where the training input points
and test input points follow different distributions, while the conditional distribution of
output values given input points remains unchanged, is called covariate shift. Under covariate
shift, standard model selection techniques such as cross validation no longer work as desired,
since their unbiasedness is lost. In this paper, we propose a new method called importance
weighted cross validation (IWCV) and prove its unbiasedness even under covariate shift. IWCV
is the only procedure that can be applied for unbiased classification under covariate shift, whereas alternatives to IWCV
exist for regression. The usefulness of the proposed method is illustrated through simulations, and
further demonstrated in a brain-computer interface application, where strong non-stationarity
effects are observed between training and test sessions.
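To make the idea concrete, here is a minimal sketch of importance weighted cross validation in a hypothetical 1-D regression setup. It is not the authors' implementation: the densities, the regression function, the polynomial models, and the fold count are all illustrative assumptions. The key step is that each held-out loss is multiplied by the importance ratio p_test(x)/p_train(x), which (in expectation) turns the validation score into an unbiased estimate of the test-distribution generalization error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical covariate-shift setup: training and test input densities
# differ, but the conditional p(y|x) is shared.
def p_train(x):  # assumed training input density: N(1, 0.5^2)
    return np.exp(-(x - 1.0) ** 2 / (2 * 0.5**2)) / (0.5 * np.sqrt(2 * np.pi))

def p_test(x):   # assumed test input density: N(2, 0.25^2)
    return np.exp(-(x - 2.0) ** 2 / (2 * 0.25**2)) / (0.25 * np.sqrt(2 * np.pi))

def f(x):        # shared regression function (illustrative choice)
    return np.sinc(x)

x_tr = rng.normal(1.0, 0.5, 150)
y_tr = f(x_tr) + rng.normal(0.0, 0.1, x_tr.size)

def fit_poly(x, y, degree, ridge=1e-6):
    """Least-squares polynomial fit with a small ridge term for stability."""
    X = np.vander(x, degree + 1)
    return np.linalg.solve(X.T @ X + ridge * np.eye(degree + 1), X.T @ y)

def predict(w, x):
    return np.vander(x, w.size) @ w

def iwcv_score(x, y, degree, k=10):
    """k-fold CV where each held-out squared error is weighted by the
    importance ratio p_test(x) / p_train(x)."""
    folds = np.array_split(rng.permutation(x.size), k)
    losses = []
    for fold in folds:
        mask = np.ones(x.size, dtype=bool)
        mask[fold] = False
        w = fit_poly(x[mask], y[mask], degree)
        sq_err = (predict(w, x[fold]) - y[fold]) ** 2
        weights = p_test(x[fold]) / p_train(x[fold])  # importance weights
        losses.append(np.mean(weights * sq_err))
    return np.mean(losses)

# Model selection: pick the polynomial degree with the lowest IWCV score.
scores = {d: iwcv_score(x_tr, y_tr, d) for d in (1, 2, 3)}
best = min(scores, key=scores.get)
print("IWCV scores:", scores)
print("IWCV-selected polynomial degree:", best)
```

In this sketch the importance ratio is computed from known densities; in practice the densities (or their ratio) must be estimated from samples, which is a separate problem from the model selection step shown here.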