[Corpora-List] 2nd CFP: ACM RecSys 2013 Workshop on Reproducibility and Replication in Recommender Systems Evaluation (RepSys 2013)

Alejandro Bellogin Kouki alejandro.bellogin at uam.es
Mon Jul 8 20:17:50 CEST 2013


[Apologies if you receive this more than once]

===============================================================================
ACM RecSys Workshop on
Reproducibility and Replication in Recommender Systems Evaluation - RepSys 2013
7th ACM Recommender Systems Conference (RecSys 2013)
Hong Kong, China, 12 October 2013
http://repsys.project.cwi.nl
===============================================================================

* Submission deadline: 22 July 2013 *

* Keynote speaker confirmed: Mark Levy, Senior Data Scientist at Mendeley *

== Scope ==

Experiment replication and reproduction are key requirements for empirical
research methodology, and an important open issue in the field of Recommender
Systems. When an experiment is repeated by a different researcher and exactly
the same result is obtained, we can say the experiment has been replicated.
When the results are not exactly the same but the conclusions are compatible
with the prior ones, we have a reproduction of the experiment. Reproducibility
and replication involve recommendation algorithm implementations, experimental
protocols, and evaluation metrics. While the problem of reproducibility and
replication has been recognized in the Recommender Systems community, the need
for a clear solution remains largely unmet, which motivates the present
workshop.

== Topics ==

We invite the submission of papers reporting original research, studies,
advances, experiences, or work in progress within the scope of reproducibility
and replication in Recommender Systems evaluation. Papers explicitly dealing
with the replication of previously published experimental conditions,
algorithms, or metrics, and the resulting analysis, are encouraged. In
particular, we seek discussions of the difficulties authors may encounter in
this process, along with their successes or limitations in reproducing the
original results.

The topics the workshop seeks to address include, but need not be limited to,
the following:

* Limitations and challenges of experimental reproducibility and replication
* Reproducible experimental design
* Replicability of algorithms
* Standardization of metrics: definition and computation protocols
* Evaluation software: frameworks, utilities, services
* Reproducibility in user-centric studies
* Datasets and benchmarks
* Recommender software reuse
* Replication of already published work
* Reproducibility within and across domains and organizations
* Reproducibility and replication guidelines

== Submission ==

Two submission types are accepted: long papers of up to 8 pages, and short
papers of up to 4 pages. Papers will be evaluated for their originality,
significance of contribution, soundness, clarity, and overall quality. The
interest of contributions will be assessed in terms of technical and
scientific findings, contribution to the knowledge and understanding of the
problem, methodological advances, or practical value. Contributions focusing
specifically on repeatability and reproducibility of algorithm
implementations, evaluation frameworks, and/or evaluation practice are also
welcome and will be valued.

All submissions shall adhere to the standard ACM SIG proceedings format:
http://www.acm.org/sigs/publications/proceedings-templates. Submissions shall
be sent as a PDF file through the online submission system, now open at:
https://www.easychair.org/conferences/?conf=repsys2013.

== Important dates ==

* Paper submission deadline: 22 July 2013
* Notification: 16 August 2013
* Camera-ready version due: 30 August 2013

== Organizers ==

* Alejandro Bellogín, Centrum Wiskunde & Informatica, The Netherlands
* Pablo Castells, Universidad Autónoma de Madrid, Spain
* Alan Said, Centrum Wiskunde & Informatica, The Netherlands
* Domonkos Tikk, Gravity R&D, Hungary


