[Corpora-List] CALL FOR PARTICIPATION: The Joint Student Response Analysis and RTE-8 Challenge (SemEval-2013 Task 7)

Danilo Giampiccolo giampiccolo at celct.it
Wed Nov 14 22:22:34 CET 2012

Please distribute widely - Apologies for cross-posting




We are pleased to invite participants to the Joint Student Response Analysis and 8th Recognizing Textual Entailment Challenge at SemEval 2013, a joint effort of the educational technology and textual inference communities to present a unified scientific challenge to researchers in both fields. It will be run as part of the SemEval-2013 Semantic Evaluation Exercise, co-located with the *SEM and NAACL-2013 conferences.


The goal of the task is to produce an assessment of student answers to explanation and definition questions typically asked in practice exercises, tests or tutorial dialogue. It is linked to the tasks of semantic analysis and recognizing textual entailment in the textual inference community, and essay grading and short answer assessment in the educational NLP community.

Specifically, given a question, a known correct "reference answer" and a 1- or 2-sentence "student answer", the Main task consists of assessing the correctness of the student's answer at one of three levels of granularity: 5-way (correct, partially correct, contradictory, irrelevant, not in the domain), 3-way (correct, contradictory, incorrect), or 2-way (correct, incorrect). Participants can opt to carry out the task at any level of granularity.
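To illustrate, the three granularities can be viewed as progressively coarser partitions of the same judgment space. The Python sketch below is purely illustrative: the exact label spellings and the collapse from 5-way down to 3-way and 2-way are assumptions on our part, not the official task mapping.

```python
# Hypothetical sketch of the three labeling granularities described above.
# Label names and the 5-way -> 3-way -> 2-way collapse are assumptions,
# not the official task mapping.

FIVE_WAY = ["correct", "partially_correct", "contradictory",
            "irrelevant", "non_domain"]

# One plausible way to coarsen 5-way judgments to the 3-way scheme:
# anything neither fully correct nor contradictory becomes "incorrect".
TO_THREE_WAY = {
    "correct": "correct",
    "contradictory": "contradictory",
    "partially_correct": "incorrect",
    "irrelevant": "incorrect",
    "non_domain": "incorrect",
}

# The 2-way scheme keeps only correct vs. incorrect.
TO_TWO_WAY = {label: ("correct" if label == "correct" else "incorrect")
              for label in FIVE_WAY}

def coarsen(label: str, granularity: int) -> str:
    """Map a 5-way label down to the requested granularity (5, 3, or 2)."""
    if granularity == 5:
        return label
    if granularity == 3:
        return TO_THREE_WAY[label]
    if granularity == 2:
        return TO_TWO_WAY[label]
    raise ValueError("granularity must be 5, 3, or 2")
```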

This task is similar to essay grading and short answer assessment. However, meaningful labels are used instead of numerical scores, so as to capture the information needed to provide effective tutorial feedback on the specific problems found in student answers. Educational NLP and essay grading system developers are encouraged to adapt their scoring techniques to this categorical labeling problem.

The Student Response Analysis task is also closely related to the notion of textual entailment. For most correct answers, at least a substantial portion of the reference answer will be entailed by the student answer, even if not all of it. Textual entailment engine developers are therefore encouraged to apply their techniques to the task and test the potential contribution of their systems to the student response assessment problem.
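To make the connection concrete, even a trivial entailment-style baseline could score a student answer by how much of the reference answer's content it covers. The sketch below is a minimal, hypothetical word-overlap baseline; real entailment engines use far deeper semantic analysis, and the stopword list and 0.7 threshold are arbitrary illustrative choices.

```python
# A naive word-overlap baseline for 2-way student response assessment,
# sketched purely for illustration. The stopword list and the 0.7
# threshold are arbitrary assumptions, not part of the task definition.

def content_words(text: str) -> set:
    """Lowercase whitespace tokens, minus a tiny hypothetical stopword list."""
    stopwords = {"the", "a", "an", "is", "are", "of", "to", "and"}
    return {w for w in text.lower().split() if w not in stopwords}

def overlap_score(reference: str, student: str) -> float:
    """Fraction of reference-answer content words found in the student answer."""
    ref = content_words(reference)
    if not ref:
        return 0.0
    return len(ref & content_words(student)) / len(ref)

def assess(reference: str, student: str, threshold: float = 0.7) -> str:
    """2-way judgment: 'correct' if enough of the reference is covered."""
    return "correct" if overlap_score(reference, student) >= threshold else "incorrect"
```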


An additional pilot task on partial entailment is offered as part of the challenge, where systems may recognize that specific parts of the hypothesis are entailed by the text, even though entailment might not be recognized for the hypothesis as a whole. Such recognition of partial entailment may have various uses in the educational setting based on identifying the missing parts in a student answer, and may similarly have value in other applications such as summarization or question answering.


Important dates:

- November 9, 2012: Training data available for participants
- February 15, 2013: Registration deadline
- March 1-15, 2013: Evaluation period [TBC]
- April 9, 2013: Paper submission deadline [TBC]
- April 23, 2013: Reviews due [TBC]
- May 4, 2013: Camera-ready due [TBC]

More details on the Main and Pilot tasks are available at: http://www.cs.york.ac.uk/semeval-2013/task7/.

If you are interested in the task, please register at SemEval 2013 (http://www.cs.york.ac.uk/semeval-2013/index.php?id=registration).
