Back To Basics: Towards Better Operand Recognition for Multilingual Language Modelling
KU Leuven
The field of multilingual NLP is booming. In no small part, this is due to pretrained large multilingual language models (e.g. mBERT and XLM-R), which have been found to have surprising cross-lingual transfer capabilities. The success of these models has limitations, however: multiple works have shown that such transfer works best between typologically related languages. Moreover, it is becoming increasingly clear that these models are ...