An informativeness approach to Open IE evaluation

Title: An informativeness approach to Open IE evaluation
Publication Type: Conference Paper
Year of Publication: 2016
Authors: Léchelle, W., and P. Langlais
Editor: Gelbukh, A.
Conference Name: CICLing
Publisher: Springer
Keywords: Evaluation, Open information extraction, Question answering
Abstract: Open Information Extraction (OIE) systems extract relational tuples from text without requiring the relations of interest to be specified in advance. Systems perform well on widely used metrics such as precision and yield, but a close look at their output reveals a general lack of informativeness in the facts deemed correct. We propose a new evaluation protocol, based on question answering, that is closer to text understanding and end-user needs. Extracted information is judged on its capacity to automatically answer questions about the source text. As a showcase for our protocol, we devise a small corpus of question/answer pairs and evaluate available state-of-the-art OIE systems on it. Performance-wise, our results are in line with previous findings. Furthermore, we are able to estimate recall for the task, which is novel. We distribute our annotated data and automatic evaluation program.
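The gist of the protocol can be pictured with a toy sketch: a question counts as answered if some extracted tuple contains the gold answer and overlaps the question, and recall is the fraction of questions answered. This is only an illustration under assumed representations (tuples as (arg1, rel, arg2) strings, naive token-overlap matching); the authors' released evaluation program is the actual reference.

```python
# Illustrative sketch only: the matching heuristic (token overlap) and all
# names below are assumptions, not the authors' released evaluation program.
from typing import List, Tuple

def tokens(text: str) -> set:
    """Lowercase bag of word tokens."""
    return set(text.lower().split())

def answers_question(fact: Tuple[str, str, str], question: str, answer: str) -> bool:
    """A tuple 'answers' a question if it contains the gold answer and shares
    a few tokens with the question (a crude stand-in for real matching)."""
    fact_toks = tokens(" ".join(fact))
    return tokens(answer) <= fact_toks and len(tokens(question) & fact_toks) >= 2

def recall(extractions: List[Tuple[str, str, str]],
           qa_pairs: List[Tuple[str, str]]) -> float:
    """Fraction of question/answer pairs answerable from the extractions."""
    answered = sum(
        any(answers_question(t, q, a) for t in extractions)
        for q, a in qa_pairs
    )
    return answered / len(qa_pairs) if qa_pairs else 0.0

if __name__ == "__main__":
    facts = [("Marie Curie", "won", "the Nobel Prize in 1903")]
    qa = [("Who won the Nobel Prize?", "Marie Curie")]
    print(f"recall: {recall(facts, qa):.2f}")  # recall: 1.00
```

Note that under this view an extraction can be "correct" by lenient tuple-level metrics yet still fail to answer any question, which is the informativeness gap the abstract points to.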