The generalized universal law of generalization

## Abstract

It has been argued by Shepard that there is a robust psychological law relating the distance between a pair of items in psychological space to the probability that they will be perceived as similar. Specifically, this probability is a negative exponential function of the distance between the pair of items. In experimental contexts, distance is typically defined in terms of a multidimensional space, but this assumption seems unlikely to hold for complex stimuli. We show that, nonetheless, the Universal Law of Generalization can be derived in the more general setting of arbitrary stimuli, using a much more universal measure of distance. This universal distance is defined as the length of the shortest program that transforms the representations of the two items of interest into one another: the algorithmic information distance. It is universal in the sense that it minorizes every computable distance: it is the smallest computable distance. We show that the Universal Law of Generalization holds with probability going to one, provided the probabilities concerned are computable. We also give a mathematically more appealing form of the Universal Law.
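As a minimal illustration of the exponential relationship the abstract describes, the sketch below plugs a distance value into Shepard's law. The function name and the choice of decay constant (here fixed at 1) are assumptions for illustration; the paper's contribution is that the distance argument can be the algorithmic information distance rather than a coordinate in a multidimensional space.

```python
import math

def generalization_probability(distance: float) -> float:
    """Shepard's law (illustrative sketch): the probability that two
    items are perceived as similar decays as a negative exponential
    of their psychological distance. Decay rate assumed to be 1."""
    return math.exp(-distance)

# Identical items (distance 0) generalize with probability 1;
# the probability falls off monotonically as distance grows.
for d in [0.0, 0.5, 1.0, 2.0]:
    print(f"distance={d:.1f}  p={generalization_probability(d):.4f}")
```

Any computable distance could be substituted for the argument; the paper's point is that the algorithmic information distance minorizes them all.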