A postdoctoral researcher position will become available from 1 June 2022 at Queen Mary University of London's Computational Linguistics Lab, to work on conversational agents that can interpret coreference and reference and that learn through interaction with players in games. The position is within the context of ARCIDUCA, a joint project between Queen Mary University of London and the University of Essex. In ARCIDUCA we will develop conversational agents (CAs) able to interpret coreference and reference, and embed them in games so that they can learn through interaction with players.
About the role
The primary responsibility of the researcher will be to develop and train architectures for conversational agents that can interpret coreference and reference, and that can be embedded in virtual worlds such as Minecraft.
About you
We seek outstanding researchers with a genuine interest in conversational agents, human-in-the-loop learning, or coreference/reference, and ideally all of the above. The ideal candidate will have an extensive background in deep learning methods applied to NLP in at least one of these areas. We would prefer candidates who already hold a PhD in conversational agents, machine learning, or natural language processing; such candidates will be considered for a post-doctoral position. Candidates who have essentially completed their PhD work and are only waiting for the defence/viva will also be considered, initially for a Research Assistant position, which can be converted into a post-doctoral position once the candidate receives their PhD.
The post
The post is based at the Mile End Campus in London. It is a full-time, fixed-term appointment of up to 32 months, available from June 2022 or as soon as feasible thereafter. The starting salary will be in the range of £32,087-£33,824 per annum for a Research Assistant, and £34,733-£38,655 per annum for a Postdoctoral Assistant, inclusive of London Allowance.
Research Environment
The researcher will work with Massimo Poesio, Julian Hough, Diego Perez-Liebana, Matt Purver, and Chris Madge from the CogSci and Game AI groups at Queen Mary University of London; and Richard Bartle, Jon Chamberlain, and Juntao Yu from the NLP group at the University of Essex.
The Cognitive Science Research Group at Queen Mary University of London (CogSci) is a leader in the areas of conversational agents and dialogue, coreference and reference, and the use of games-with-a-purpose (GWAPs) to collect data for NLP. With 9 faculty members, 10 postdocs and 27 PhD students, CogSci's Computational Linguistics Lab (http://compling.eecs.qmul.ac.uk/) is one of the largest NLP labs in the UK.
The Game AI Research Group at Queen Mary University of London (GAIG) (https://gaigresearch.github.io/) consists of 8 faculty members, 2 postdocs and 25 PhD students. It is the largest Game AI group in the UK and a leader in research on developing intelligent artificial agents able to act as non-player characters in games. GAIG also runs the Intelligent Games and Games Intelligence (IGGI; EP/S022325/1) Doctoral Training Centre.
The Natural Language and Information Processing group at the University of Essex (NLIP) (https://essexnlip.uk/) is one of the oldest NLP groups in the UK, and has recently become a pioneer in the area of crowdsourcing, in particular with games with a purpose such as Phrase Detectives (http://www.phrasedetectives.com), the result of a collaboration between NLIP and the equally long-established Essex Games and AI group, which developed MUD, the oldest virtual world in existence.
The wider context
London is a vibrant city and one of the most active centres of AI/NLP research in the world. It is the location of the Alan Turing Institute, of several universities active in AI and NLP, and of numerous AI companies both large and small.
Contact
Please contact Massimo Poesio (m.poesio at qmul.ac.uk) for more details.
http://www.massimopoesio.org http://www.dali-ambiguity.org
For further information and to apply
https://www.qmul.ac.uk/jobs/vacancies/items/6544.html
Deadline
The closing date for applications is 25 April 2022.