June 13-14, 2013, Atlanta, GA, USA
*SEM (pronounced STAR SEM)
The 2nd Joint Conference on
Lexical and Computational Semantics
Co-located with NAACL-HLT 2013
Paper submission deadline: March 15, 2013
(see below for more details)
----- FIRST CALL FOR PAPERS: *SEM-2013 -----
After the success of *SEM 2012, the first joint conference on lexical and computational semantics co-organized by SIGLEX and SIGSEM, we are pleased to announce *SEM 2013. This event will provide two days of intensive semantics-related talks, posters, and task reports, and will serve as a meeting point for the lexical/computational semantics community, bringing together researchers interested in any aspect of the semantic processing of human language, at all levels of granularity and linguistic representation. *SEM embraces both symbolic and probabilistic approaches, and everything in between; theoretical contributions are welcome as well. The long-term goal of *SEM is to provide a stable forum for the growing number of NLP researchers working on semantics.
----- TOPICS OF INTEREST -----
Topics of interest include, but are not limited to:
Formal and linguistic semantics
Cognitive aspects of semantics
Semantic aspects of morphology
Semantic processing of morphologically rich languages
Semantic processing at the sentence level
Semantic processing at the discourse level
Semantic processing of non-propositional aspects of meaning
Multilingual semantic processing
Social media and linguistic semantics
Inference and ontologies
Pragmatics of language
Generation and summarization
----- SUBMISSIONS -----
-- *SEM submissions must describe unpublished work and be written in English. Each submission will be reviewed by at least three program committee members. Final versions should take into account reviewers' comments.
-- We solicit both long and short papers. Long papers may consist of up to eight (8) pages of content, plus two extra pages for references; short papers may consist of up to four (4) pages of content, plus one extra page for references. Long papers typically describe original research and will be presented orally. Short papers (typically: ongoing research, a project or system description, or an opinion piece) will be presented as posters.
-- As the reviewing will be blind, the paper must not include the authors' names and affiliations. Furthermore, self-references that reveal the author's identity, e.g., "We previously showed (Smith, 1991) ..." must be avoided. Instead, use citations such as "Smith previously showed (Smith, 1991) ..." Papers that do not conform to these requirements will be rejected without review. In addition, please do not post your submissions on the web until after the review process is complete.
Papers should follow the NAACL 2013 formatting guidelines.
Papers should be submitted using the START system.
Note that START has recently moved to centralized account management, meaning you will first need to create a START account (or log in via your existing START account) to be able to submit your paper.
----- SHARED TASK: Semantic Textual Similarity -----
Every *SEM edition will include a shared task in which training and test data are provided by the organizers, which allows participating systems to be evaluated and compared in a systematic way. Descriptions of the participating systems and an evaluation of their performance will be presented both at the conference and in the proceedings. The same or similar shared tasks may be chosen for several years in a row to focus the research attention of the community on a particular semantic task.
For *SEM 2013, we will be hosting a shared task on "Semantic Textual Similarity"
The goal of the Semantic Textual Similarity (STS) task is to create a unified framework for evaluating semantic textual similarity modules and to characterize their impact on NLP applications. STS measures the degree of semantic equivalence between two snippets of text. The task offers an extrinsic evaluation of multiple semantic components that have historically tended to be evaluated independently, without any characterization of their impact on NLP applications.
Please refer to http://ixa2.si.ehu.es/sts/index.php for more information.
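For intuition only, the kind of system the STS task evaluates can be sketched as a minimal lexical-overlap baseline: represent each snippet as a bag-of-words vector, take the cosine of the two vectors, and map the result onto a graded similarity score. The function names (`cosine_bow`, `sts_score`) and the linear mapping onto a 0-5 scale are our own illustrative assumptions, not part of the task definition:

```python
from collections import Counter
from math import sqrt

def cosine_bow(a: str, b: str) -> float:
    """Cosine similarity between bag-of-words vectors of two snippets."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = sqrt(sum(c * c for c in va.values())) * sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0

def sts_score(a: str, b: str) -> float:
    # Assumption: linearly rescale cosine onto a graded 0-5 similarity range.
    return 5.0 * cosine_bow(a, b)
```

Real submissions go far beyond such surface overlap (using, e.g., distributional similarity, paraphrase resources, or syntactic alignment), which is precisely what the shared task is designed to compare.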
----- SEMEVAL: Semantic Evaluation Exercises -----
The Semantic Evaluation (SemEval) series of workshops focuses on the evaluation of semantic analysis systems, with the objective of comparing systems that can analyze diverse semantic phenomena in text. SemEval-2013 is an independent, SIGLEX-sponsored event; however, it will be associated with *SEM 2013 and NAACL-HLT 2013, and will comprise the following tasks:
1. TempEval-3 Temporal Annotation
2. Sentiment Analysis in Twitter
3. Spatial Role Labeling
4. Free Paraphrases of Noun Compounds
5. Evaluating Phrasal Semantics
7. The Joint Student Response Analysis and 8th Recognizing Textual Entailment Challenge
8. Cross-lingual Textual Entailment for Content Synchronization
9. Extraction of Drug-Drug Interactions from BioMedical Texts
10. Cross-lingual Word Sense Disambiguation
11. Evaluating Word Sense Induction & Disambiguation within An End-User Application
12. Multilingual Word Sense Disambiguation
13. Word Sense Induction for Graded and Non-Graded Senses
14. The Coarse-Grained and Fine-Grained Chinese Lexical Sample and All-Words Task
Please refer to http://www.cs.york.ac.uk/semeval-2013/ for more information.
----- IMPORTANT DATES -----
*SEM 2013 papers due: March 15, 2013, 23:59 PST (GMT-8)
*SEM 2013 acceptance notification: April 12, 2013
*SEM 2013 camera-ready due: April 21, 2013
*SEM 2013 conference: June 13-14, 2013
----- CONTACT -----
Mona Diab, General Chair
Timothy Baldwin, PC Co-chair
Marco Baroni, PC Co-chair
e-mail: sem-organizers at googlegroups com
Mona Diab, PhD
Associate Professor, Computer Science Department, School of Engineering & Applied Science, The George Washington University