Universal Consistency and Bloat in GP: Some theoretical considerations about Genetic Programming from a Statistical Learning Theory viewpoint
## Abstract

In this paper, we provide an analysis of Genetic Programming (GP) from the Statistical Learning Theory viewpoint in the scope of symbolic regression. Firstly, we are interested in Universal Consistency, i.e. the fact that the solution minimizing the empirical error converges to the best possible error as the number of examples goes to infinity; secondly, we focus our attention on the uncontrolled growth of program length (i.e. bloat), which is a well-known problem in GP. Results show that (1) several kinds of code bloat may be identified and (2) Universal Consistency can be obtained while avoiding bloat under some conditions. We conclude by describing an ad hoc method that makes it possible to simultaneously avoid bloat and ensure Universal Consistency.