Filtering-ranking perceptron learning for partial parsing
This work introduces a phrase recognition system based on perceptrons, together with a global online learning algorithm that trains them jointly. The method applies to complex domains in which some structure has to be recognized. This global problem is broken down into two layers of local subproblems: a filtering layer, which reduces the search space by identifying plausible phrase candidates, and a ranking layer, which discriminatively builds the optimal phrase structure. We present a recognition-based feedback rule that reflects back to each local function the errors it committed from a global point of view, allowing all the functions to be trained together online as perceptrons. As a result, the learned functions automatically behave as filters and rankers rather than binary classifiers, which we argue is better suited to this type of problem. Extensive experiments on partial parsing tasks give state-of-the-art results and demonstrate the advantages of the global training method over optimizing each function locally, as in the traditional approach.
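The training scheme described above can be illustrated with a minimal sketch. This is not the authors' implementation: the candidate representation, the top-k filtering heuristic, the function names, and the simplified update rule (a structured-perceptron-style update applied only to the layer responsible for the global error) are all assumptions made for illustration.

```python
# Hypothetical sketch of global online training of a filtering perceptron
# and a ranking perceptron. Each example is a (candidates, gold_idx) pair,
# where `candidates` is an (n_candidates, n_features) feature matrix for
# the plausible phrase candidates and `gold_idx` indexes the correct one.
# All names and simplifications here are illustrative, not from the paper.
import numpy as np

def train(examples, n_features, k=5, epochs=3):
    """Train filtering and ranking perceptrons with global feedback."""
    w_filter = np.zeros(n_features)
    w_rank = np.zeros(n_features)
    for _ in range(epochs):
        for candidates, gold_idx in examples:
            # Filtering layer: keep the k most plausible candidates,
            # reducing the search space for the ranking layer.
            kept = np.argsort(candidates @ w_filter)[-k:]
            # Ranking layer: pick the best surviving candidate.
            pred = kept[np.argmax(candidates[kept] @ w_rank)]
            if pred != gold_idx:
                # Recognition-based feedback: each local function is
                # corrected for its share of the global error.
                if gold_idx not in kept:
                    # The filter wrongly pruned the gold candidate.
                    w_filter += candidates[gold_idx] - candidates[pred]
                # The ranker preferred a wrong candidate.
                w_rank += candidates[gold_idx] - candidates[pred]
    return w_filter, w_rank
```

Note how neither weight vector is trained as a local binary classifier: both are updated only when the global, end-to-end prediction is wrong, so the filter learns merely to keep good candidates ranked high enough to survive, and the ranker learns to separate the gold structure from the filter's survivors.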