PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Tensor-based Total Bregman Divergences between Graphs
Francisco Escolano, Meizhu Liu and Edwin Hancock
In: ICCV 2011 Workshop: 1st IEEE Workshop on Information Theory in Computer Vision and Pattern Recognition, 13 November 2011, Barcelona, Spain.


The accurate and effective measurement of graph similarity has proved to be a challenging problem in structural pattern recognition. In this paper we extend the node coverage approach for graph indexing, which outperforms entropic manifold alignment. The proposed extension replaces the Henze-Penrose divergence with a Total Bregman Divergence (TBD) that relies on error-free distances (Frobenius norms) between the tensors of the common tangent space. To that end we exploit linear combinations of Gaussians, which can be computed faster than the minimum spanning trees needed to obtain the Henze-Penrose divergence. In the paper we also propose several divergences for these variables (linear combinations): Jeffreys TBD, Jensen-Shannon TBD and Jensen-Rényi TBD. In our experiments we show that all of these divergences are highly discriminative: each of them improves the retrieval-recall results obtained with the Henze-Penrose divergence within the node coverage approach, with the Jeffreys TBD performing best.
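To make the central quantity concrete, the following is a minimal sketch of the general total Bregman divergence, δ_f(x, y) = (f(x) − f(y) − ⟨x − y, ∇f(y)⟩) / √(1 + ‖∇f(y)‖²), as introduced in the TBD literature. The function names, the choice of the squared Frobenius norm as the convex generator f, and the example arrays are illustrative assumptions, not code from the paper.

```python
import numpy as np

def total_bregman_divergence(x, y, f, grad_f):
    """Total Bregman divergence:
        delta_f(x, y) = (f(x) - f(y) - <x - y, grad_f(y)>)
                        / sqrt(1 + ||grad_f(y)||^2).
    The denominator distinguishes it from the ordinary Bregman
    divergence (the numerator alone) and makes the measure invariant
    to rotations of the coordinate system."""
    g = grad_f(y)
    num = f(x) - f(y) - np.vdot(x - y, g)
    return num / np.sqrt(1.0 + np.vdot(g, g))

# Illustrative generator (an assumption): the squared Frobenius norm,
# f(X) = ||X||_F^2, with gradient 2X. This recovers the "total square
# loss" tSL(x, y) = ||x - y||_F^2 / sqrt(1 + 4 ||y||_F^2).
f = lambda x: float(np.vdot(x, x))
grad_f = lambda x: 2.0 * x

X = np.array([[1.0, 0.0], [0.0, 1.0]])
Y = np.zeros((2, 2))
print(total_bregman_divergence(X, Y, f, grad_f))  # ||X||_F^2 / sqrt(1) = 2.0
```

With this generator the divergence reduces to a rotation-invariant, normalised Frobenius distance between tensors, which is the kind of error-free distance the abstract refers to.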

EPrint Type: Conference or Workshop Item (Paper)
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics
ID Code: 8577
Deposited By: Edwin Hancock
Deposited On: 12 February 2012