LISN Laboratory, Université Paris-Saclay

In this internship, we assume that virtual assistants can be deployed step by step across different countries and, thus, that they will face different languages at different points in time. This assumption implies that, when designing and training a model for a given task, languages can be added incrementally to the training procedure.

Two preliminary works pave the way: 1) [Coria et al., 2022], which investigates BERT's cross-lingual transfer capabilities in two continual sequence labeling tasks, and 2) [Gerald and Soulier, 2022], which designs continual learning streams for information retrieval.
In practice, we will focus on the Massively Multilingual NLU 2022 data [FitzGerald et al., 2022], which includes slot-filling and NER tasks for 51 languages in parallel. The objectives of the internship will be to 1) build a stream of languages for a given task, 2) run baseline models on the stream, and 3) design a continual learning model for cross-lingual transfer.
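As a rough illustration of step 1, the parallel data can be partitioned by language and replayed as a sequential stream of tasks. The sketch below is only indicative: the example records, language codes, and function names are illustrative assumptions, not part of the MMNLU-22 release.

```python
from collections import defaultdict
from typing import Dict, Iterator, List, Tuple

def build_language_stream(
    examples: List[Tuple[str, dict]],
    language_order: List[str],
) -> Iterator[Tuple[str, List[dict]]]:
    """Group (language, example) pairs by language and yield them
    one language at a time, in the given order, so a model can be
    trained incrementally on each language in turn."""
    by_lang: Dict[str, List[dict]] = defaultdict(list)
    for lang, example in examples:
        by_lang[lang].append(example)
    for lang in language_order:
        yield lang, by_lang.get(lang, [])

# Toy usage with made-up slot-filling examples (not real MMNLU-22 data).
examples = [
    ("en", {"text": "wake me at eight", "slots": ["O", "O", "O", "time"]}),
    ("fr", {"text": "mets du jazz", "slots": ["O", "O", "genre"]}),
    ("en", {"text": "play some jazz", "slots": ["O", "O", "genre"]}),
]
stream = list(build_language_stream(examples, ["en", "fr"]))
```

A baseline (step 2) would then train on each `(lang, data)` pair in order and measure forgetting on the languages seen earlier.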
