*Apologies for cross-posting*
CLEF 2018
Avignon, France - September 10-14, 2018
http://clef2018.clef-initiative.eu/
Call for Lab Proposals
CLEF is a leading annual international conference exploring topics in information access technologies. CLEF uniquely combines evaluation laboratories and workshops with research presentations, panels, poster and demo sessions. CLEF 2018 (http://clef2018.clef-initiative.eu/) will be held on 10-14 September 2018 at Avignon University, Avignon, France.
Researchers and practitioners from all areas of information access and related communities are invited to submit proposals for evaluation labs for consideration for inclusion in CLEF 2018. Lab proposals will be reviewed by the lab selection committee. Organizers of selected proposals will be invited to include their lab in the CLEF 2018 lab programme, possibly subject to small suggested modifications to their proposal to better suit the CLEF lab workflow or timeline.
The CLEF Initiative (http://www.clef-initiative.eu/) is a self-organized body whose main mission is to promote research, innovation, and development of information access systems with an emphasis on multilingual information in different modalities - including text and multimedia - with various levels of structure. CLEF promotes research and development by providing an infrastructure for:
* independent evaluation of information access systems;
* investigation of the use of unstructured, semi-structured,
highly-structured, and semantically enriched data in information access;
* creation of reusable test collections for benchmarking;
* exploration of new evaluation methodologies and innovative ways of
using experimental data;
* discussion of results, comparison of approaches, exchange of ideas,
and transfer of knowledge.
CLEF 2018 will be the ninth edition of the current CLEF conference, launched as a continuation of the annual CLEF evaluation campaign series, which ran with great success from 2000 to 2009 and contributed to the systematic evaluation of information access systems, primarily through experimentation on shared tasks.
Scope of CLEF Labs

Proposals are accepted for two different types of labs (see the CLEF 2017 labs at http://clef2017.clef-initiative.eu/index.php?page=Pages/labs_info.php):
* Evaluation labs that are a "campaign-style" evaluation for specific
  information access problems (during the twelve-month period
  preceding the conference), similar in nature to the traditional CLEF
  campaign "tracks". Topics covered by campaign-style labs can be
  inspired by any information access-related domain or task.
* Labs that follow a more classical "workshop" pattern, exploring
  issues of evaluation methodology, metrics, processes etc. in
  information access and closely related fields, such as natural
  language processing, machine translation, and human-computer
  interaction.
If the organisers of the proposal are new to CLEF or other shared task evaluation campaigns, we highly recommend that a lab workshop first be organised to discuss the format, the problem space, and the practicalities of the shared task.

The CLEF 2018 programme will reserve about half of the conference schedule for lab sessions. The labs will present their overall results in "overview presentations" during the plenary scientific paper sessions to allow non-participants to get a sense of where the research frontiers are moving. Organisers of each lab are expected to organise the separate sessions for their lab at the conference so that they contain ample time for general discussion and engagement by all participants - not just those presenting campaign results and papers. Organisers should plan time in their sessions for activities such as panels, demos, poster sessions, etc. as appropriate.

CLEF is always interested in receiving and facilitating innovative lab proposals. Potential task proposers who are unsure of the suitability of their task proposal or its format for inclusion at CLEF are encouraged to contact the CLEF 2018 Lab Organizing Committee Chairs to discuss its suitability or design at an early stage.
Lab proposals must provide sufficient information to judge the relevance, timeliness, scientific quality, and benefits for the research community, as well as the competence of the proposers to co-ordinate the lab. Each lab proposal should identify one or more organizers as responsible for ensuring the timely execution of the lab. Proposals should be 3-4 pages long and should provide the following information:
1. Title of the proposed lab.
2. A brief description of the lab topic and goals, its relevance to
CLEF and the significance for the field.
3. A brief but clear statement of usage scenarios or domain to which
the activity is intended to contribute, including the evaluation
setup and metrics.
4. Details on the lab organizer(s), including identifying the task
   chair(s) responsible for ensuring the running of the task. This
   should include details of any previous involvement in organising
   or participating in evaluation tasks at CLEF or similar campaigns.
5. The planned format of the lab, i.e. campaign-style ("track") or
   workshop-style.
6. Is the lab a continuation of an activity from previous year(s) or a
   new activity?
   a. For activities continued from previous year(s): statistics
      from previous years (number of participants/runs for each task), a
      clear statement on why another edition is needed, an explicit
      listing of the changes proposed, and a discussion of lessons to be
      learned or insights to be gained.
   b. For new activities: a statement on why a new evaluation
      campaign is needed and how the community would benefit from the
      proposed activity.
7. Details of the expected target audience, i.e. who you expect to
   participate in the task(s) and how you propose to reach them.
8. Brief details of the tasks to be carried out in the lab. The proposal
   should clearly motivate the need for each of the proposed tasks and
   provide evidence of its capability to attract sufficient
   participation. The dataset to be used by the lab needs to be
   described and motivated with respect to the goals of the lab; an
   indication of how the dataset will be shared is also useful.
   It is fine for a lab to have a single task, but labs often contain
   multiple closely related tasks; more than three tasks require a
   strong motivation, to avoid unnecessary fragmentation.
9. Expected length of the lab session at the conference: half-day, one
   day, or two days. This should include very high-level details of the
   planned structure of the session, e.g. participant presentations,
   invited speaker(s), panels, etc., to justify the requested session
   length.
10. Arrangements for the organization of the lab campaign: who will be
    responsible for activities within the task; how data will be
    acquired or created; what tools or methods will be used, e.g., how
    necessary queries will be created or relevance assessment carried
    out; any other information which is relevant to the conduct of
    your lab.
11. If the lab proposes to set up a steering committee to oversee and
    advise its activities, include names, addresses, and homepage links
    of the people you propose to be involved.
Each submitted proposal will be reviewed by the CLEF 2018 lab organizing committee. The decision will be sent by email to the responsible organizer by September 13, 2017. The final length of the lab session will be determined based on the overall organization of the conference and the number of participant submissions received by a lab. Organizers of accepted labs are expected to announce and briefly present their lab at the final session of CLEF 2017 on 14 September in Dublin, Ireland.
Important Dates

* Final lab proposals: 3 September 2017
* Notification of lab acceptance: 13 September 2017
* Labs registration opens: 3 November 2017
Lab proposals should be submitted in PDF format via EasyChair at the following link: https://easychair.org/conferences/?conf=clef2018.

Questions on lab proposals should be directed to the CLEF 2018 Lab Chairs:

* Jian-Yun Nie (nie at iro.umontreal.ca)
* Laure Soulier (laure.soulier at lip6.fr)

Further information on CLEF 2018 is available at: http://clef2018.clef-initiative.eu/.