Evaluating information extraction

Conference Paper
Publication Date:
2010
Abstract:
The issue of how to experimentally evaluate information extraction (IE) systems has received hardly any satisfactory solution in the literature. In this paper we propose a novel evaluation model for IE and argue that, among other things, it allows (i) a correct appreciation of the degree of overlap between predicted and true segments, and (ii) a fair evaluation of the ability of a system to correctly identify segment boundaries. We describe the properties of this model, also by presenting the results of a re-evaluation of the CoNLL'03 and CoNLL'02 Shared Tasks on Named Entity Extraction.
Iris type:
04.01 Conference proceedings contribution
Keywords:
Information Search and Retrieval; Natural Language Processing; Experimental evaluation; Information Extraction; Wrapper induction
List of contributors:
Esuli, Andrea; Sebastiani, Fabrizio
Authors of the University:
ESULI ANDREA
SEBASTIANI FABRIZIO
Handle:
https://iris.cnr.it/handle/20.500.14243/52927
Overview

URL

http://www.springerlink.com/content/n433t630q3178540