[Corpora-List] Fully funded PhD fellowship in Computational Linguistics, Paris, France

Marie Candito marie.candito at gmail.com
Mon Apr 26 08:49:01 CEST 2021


https://en.labex-efl.fr/post/fully-funded-phd-position-in-computational-linguistics

Fully-funded PhD position in computational linguistics at EFL LabEx in Paris: “Articulation of a compositional model of negation with distributional semantics”

The EFL LabEx (Laboratory of Excellence “Empirical Foundations in Linguistics”) is offering a 3-year, full-time PhD grant in computational linguistics (about 1700 euros net per month), starting in Fall 2021 at the University Sorbonne Nouvelle (Paris).

## Topic of the PhD

The PhD topic falls within Strand 5 of the LabEx, “Computational Semantic Analysis” <https://en.labex-efl.fr/post/axe-5-analyse-s%C3%A9mantique-computationnelle>. We propose to work on the representation of negation within pretrained transformer-based language models (Devlin et al., 2019). The aim will be to study what form of compositionality emerges in these models, in particular for the treatment of negation (in the same vein as Ettinger (2019)), and what improvements can be made, in particular through the addition of self-supervised tasks requiring more sophisticated linguistic information than simple linear order (such as the addition of syntactic information by Xu et al. (2020)).
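To give a flavour of the kind of diagnostic work involved, below is a minimal sketch of an Ettinger-style negation probe for a pretrained masked language model. It assumes the HuggingFace transformers library and the bert-base-uncased model purely for illustration; the project is not tied to this toolkit, model, or test set.

# Minimal sketch of an Ettinger-style negation probe (illustrative only).
# Assumes the HuggingFace `transformers` library and `bert-base-uncased`.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# A model that treats negation compositionally should not propose the same
# completions for the affirmative and the negated sentence.
for sentence in ["A robin is a [MASK].", "A robin is not a [MASK]."]:
    predictions = fill_mask(sentence, top_k=5)
    top_tokens = [p["token_str"] for p in predictions]
    print(sentence, "->", top_tokens)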

## Profile

The candidate must hold a Master's degree by November 2021, with a specialization in computational linguistics or NLP. The candidate is expected to have some previous knowledge of deep learning algorithms for NLP (including pretrained transformer-based language models) and of semantics.

The successful candidate will be affiliated with the Lattice laboratory (https://www.lattice.cnrs.fr/) and with the graduate school ED 622 (http://www.univ-paris3.fr/ed-622-sciences-du-langage-3413.kjsp). Attendance at laboratory and doctoral seminars is expected; it may also be possible to teach classes in linguistics or computational linguistics.

## Application

The application file should be sent by May 9 (midnight MET) to pascal.amsili at ens.fr and marie.candito at u-paris.fr. It should comprise:
- a CV (max 5 pages) with Master's transcripts, diplomas, and internships
- a cover letter
- the names and contact details of two referees for reference letters

Candidates selected for interviews will be asked to send their Master's thesis or other written work supporting their qualifications for the project. They will be interviewed (remotely) between the end of May and mid-June 2021.

## References

Devlin, J., Chang, M.-W., Lee, K., and Toutanova, K. (2019). “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding”. In NAACL-HLT 2019.

Ettinger, A. (2019). “What BERT is not: Lessons from a new suite of psycholinguistic diagnostics for language models”. Transactions of the Association for Computational Linguistics, 8:34–48.

Xu, Z., Guo, D., Tang, D., Su, Q., Shou, L., Gong, M., Zhong, W., Quan, X., Duan, N., and Jiang, D. (2020). “Syntax-Enhanced Pre-trained Model”. arXiv:2012.14116.
