Evaluation of Jensen-Shannon distance over sparse data

Conference Paper
Publication Date:
2013
Abstract:
Jensen-Shannon divergence is a symmetrised, smoothed version of the Kullback-Leibler divergence. It has been shown to be the square of a proper distance metric, and it has other properties which make it an excellent choice for many high-dimensional spaces in R*. The metric as defined is, however, expensive to evaluate. In sparse spaces over many dimensions the intrinsic dimensionality of the metric space is typically very high, making similarity-based indexing ineffectual, and exhaustive searching over large data collections may be infeasible. Using a property that allows the distance to be evaluated from only those dimensions which are non-zero in both arguments, and through the identification of a threshold function, we show that the cost of the function can be dramatically reduced. © 2013 Springer-Verlag.
IRIS type:
04.01 Contribution in conference proceedings
Keywords:
similarity search
List of contributors:
Cardillo, Franco Alberto; Rabitti, Fausto
Authors of the University:
Cardillo, Franco Alberto
Handle:
https://iris.cnr.it/handle/20.500.14243/339508
Book title:
Similarity Search and Applications
Overview

URL

http://www.scopus.com/record/display.url?eid=2-s2.0-84886443337&origin=inward
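The sparse-evaluation property described in the abstract can be sketched as follows. Using base-2 logarithms, any dimension that is non-zero in only one of the two distributions contributes exactly half its weight to the divergence, so the sum needs to be carried out only over the intersection of the two supports, with the one-sided mass recovered in closed form. This is a minimal illustrative sketch, not the paper's implementation: the function name `jsd_sparse` and the dict-based sparse representation are assumptions, and the threshold-function optimisation from the paper is not included.

```python
from math import log2

def jsd_sparse(p, q):
    """Base-2 Jensen-Shannon divergence (in [0, 1]) between two sparse
    probability distributions stored as {dimension: weight} dicts.

    Only dimensions non-zero in BOTH arguments are visited; every
    dimension present in just one argument contributes exactly half
    its weight, which is accounted for analytically at the end.
    """
    acc = shared_p = shared_q = 0.0
    # iterate over the smaller support to find the intersection
    small, large = (p, q) if len(p) <= len(q) else (q, p)
    for dim, a in small.items():
        b = large.get(dim)
        if b is None:
            continue                  # non-zero in only one argument
        if small is q:                # keep (a, b) ordered as (p_i, q_i)
            a, b = b, a
        m = a + b                     # 2 * mixture weight for this dim
        acc += a * log2(2.0 * a / m) + b * log2(2.0 * b / m)
        shared_p += a
        shared_q += b
    # closed-form contribution of the one-sided dimensions
    return 0.5 * acc + 0.5 * (1.0 - shared_p) + 0.5 * (1.0 - shared_q)
```

On disjoint supports the function returns 1.0 (the base-2 maximum), and on identical distributions it returns 0.0, while only ever touching dimensions shared by both arguments.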