An informativeness approach to Open IE evaluation

Title: An informativeness approach to Open IE evaluation
Publication type: Conference Paper
Year of publication: 2016
Authors: Léchelle, W., and P. Langlais
Editor: Gelbukh, A.
Conference name: CICLing
Publisher: Springer
Keywords: Evaluation, Open information extraction, Question Answering
Abstract: Open Information Extraction (OIE) systems extract relational tuples from text without requiring the relations of interest to be specified in advance. Systems perform well on widely used metrics such as precision and yield, but a close look at system outputs reveals a general lack of informativeness in facts deemed correct. We propose a new evaluation protocol, based on question answering, that is closer to text understanding and end-user needs. Extracted information is judged on its capacity to automatically answer questions about the source text. As a showcase for our protocol, we devise a small corpus of question/answer pairs and evaluate available state-of-the-art OIE systems on it. Performance-wise, our results are in line with previous findings. Furthermore, we are able to estimate recall for the task, which is novel. We distribute our annotated data and automatic evaluation program.
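To illustrate the kind of protocol the abstract describes, below is a minimal, hypothetical sketch of QA-based scoring of OIE output: extracted (arg1, rel, arg2) tuples are matched against question/answer pairs, and precision/recall figures are derived from the matches. The token-overlap matching criterion, the data structures, and all function names here are assumptions for illustration only, not the paper's actual evaluation program.

```python
from dataclasses import dataclass

@dataclass
class Extraction:
    arg1: str
    rel: str
    arg2: str

def tokens(text: str) -> set:
    """Lowercase bag-of-words tokenization (assumed normalization)."""
    return set(text.lower().split())

def answers(ext: Extraction, question: str, answer: str) -> bool:
    """Assumed heuristic: a tuple answers a question if it contains the gold
    answer tokens and shares at least one token with the question."""
    tup = tokens(f"{ext.arg1} {ext.rel} {ext.arg2}")
    return tokens(answer) <= tup and bool(tokens(question) & tup)

def evaluate(extractions, qa_pairs):
    """Recall: fraction of questions answered by at least one extraction.
    Precision: fraction of extractions that answer at least one question."""
    answered = sum(any(answers(t, q, a) for t in extractions) for q, a in qa_pairs)
    useful = sum(any(answers(t, q, a) for q, a in qa_pairs) for t in extractions)
    recall = answered / len(qa_pairs) if qa_pairs else 0.0
    precision = useful / len(extractions) if extractions else 0.0
    return precision, recall

if __name__ == "__main__":
    exts = [Extraction("Marie Curie", "was born in", "Warsaw")]
    qas = [("Where was Marie Curie born?", "Warsaw")]
    print(evaluate(exts, qas))  # (1.0, 1.0) on this toy example
```

In such a scheme, recall is well defined because the question/answer corpus fixes the set of facts a system is expected to recover, which is what makes recall estimation possible in contrast to yield-based OIE evaluations.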