[Corpora-List] [Final CFP - Sep 30]: CtrlGen: Controllable Generative Modeling in Language and Vision (co-located with NeurIPS 2021)

Joel Tetreault tetreaul at gmail.com
Sun Sep 26 03:57:31 CEST 2021

(apologies for cross-posting)

Excited by generation, control, and disentanglement in language or vision? Then check out our controllable generation workshop <https://ctrlgenworkshop.github.io/> (CtrlGen), which takes place virtually at NeurIPS 2021 <https://nips.cc/Conferences/2021/Dates> on December 13th! We feature an exciting lineup of speakers, a live Q&A and panel session, interactive activities, and networking opportunities. We are also inviting paper and demo submissions (read further for details).

*Workshop Website*: https://ctrlgenworkshop.github.io/
*Contact*: ctrlgenworkshop at gmail.com

*Important Dates (deadlines are 11:59pm AOE):*

- Paper Submission Deadline: *September 30, 2021*

- Demo Submission Deadline: *October 29, 2021*

- Workshop Date: *December 13, 2021*

*Submission portal:* https://cmt3.research.microsoft.com/CtrlGen2021/Submission/Index

*Full Call for Papers*: https://ctrlgenworkshop.github.io/CFP.html
Paper submission deadline: *September 30, 2021*.

Topics of interest include:

*Methodology and Algorithms:*

- New methods and algorithms for controllability.

- Improvements of language and vision model architectures for controllability.

- Novel loss functions, decoding methods, and prompt design methods for controllability.

*Applications and Ethics:*

- Applications of controllability including creative AI, machine co-creativity, entertainment, data augmentation (for text and vision), ethics (e.g. bias and toxicity reduction), enhanced training for self-driving vehicles, and improving conversational agents.

- Ethical issues and challenges related to controllable generation, including the risks and dangers of deepfakes and fake news.

*Tasks (a few examples):*

- Semantic text exchange

- Syntactically-controlled paraphrase generation

- Persona-based text generation

- Style-sensitive generation or style transfer (for text and vision)

- Image synthesis and scene representation in both 2D and 3D

- Cross-modal tasks such as controllable image or video captioning and generation from text

- New and previously unexplored controllable generation tasks!

*Evaluation and Benchmarks:*

- New and improved evaluation methods and metrics for controllability

- Standard and unified metrics and benchmark tasks for controllability

*Cross-Domain and Other Areas:*

- Work in interpretability, disentanglement, robustness, representation learning, etc.

*Position and Survey Papers:*

- For example, exploring problems and lacunae in current controllability formulations, neglected areas in controllability, and the unclear and non-standardized definition of controllability

*Paper Submission Instructions:* Papers will be submitted using our CMT submission portal <https://cmt3.research.microsoft.com/CtrlGen2021/Submission/Index>. Submissions should be a single .pdf file that is fully anonymized, with up to 8 pages of content and unlimited references and appendices, following the NeurIPS style template. Supplementary material in the form of code and small data files can be submitted separately as a single .zip file. Accepted papers will be presented as posters and hosted on our workshop website. Note that the workshop is non-archival. While original submissions are preferred, we also welcome works currently under review, but discourage papers already accepted and published elsewhere, including at the NeurIPS main conference. *We especially encourage submissions from those with diverse backgrounds, such as minority or underrepresented groups and junior researchers.*

*Full Call for Demonstrations*: https://ctrlgenworkshop.github.io/demos.html
Submission deadline: *October 29, 2021*.

Demos of all forms are welcome: research demos, product demos, interesting and creative projects, and more. We are looking for creative, well-presented, and attention-grabbing demos. Examples include:

- Creative AI such as controllable poetry, music, image, and video generation models.

- Style transfer for both text and vision.

- Interactive chatbots and assistants that involve controllability.

- Controllable language generation systems, e.g. using GPT-2 or GPT-3.

- Controllable multimodal systems such as image and video captioning or generation from text.

- Controllable image and video/graphics enhancement systems.

- Systems for controlling scenes/environments and applications for self-driving vehicles.

- Controllable generation as it relates to deepfakes and fake news, specifically methods to combat them.

- And much, much more…

*Demo Submission Instructions:* Please record a brief (e.g. 3-5 minute) video showcasing and explaining your demo. Demonstrations will be submitted using our CMT submission portal <https://cmt3.research.microsoft.com/CtrlGen2021/Submission/Index> in a single .zip file (containing the recording). Accepted demonstrations will be presented during our workshop and hosted on our workshop website.

Organizers:
Steven Feng <https://styfeng.github.io/> (CMU)
Varun Gangal <https://vgtomahawk.github.io/> (CMU)
Drew Hudson <https://cs.stanford.edu/people/dorarad/> (Stanford)
Tatsunori Hashimoto <https://thashim.github.io/> (Stanford)
Anusha Balakrishnan <https://www.microsoft.com/en-us/research/people/anbalak/> (Microsoft Semantic Machines)
Dongyeop Kang (UMN)
Joel Tetreault <https://www.cs.rochester.edu/~tetreaul/academic.html> (Dataminr)
