11 May 2021
PhD defence by Yova Radoslavova Kementchedjhieva

Since data are limited for many of the tasks, domains and languages studied in NLP, transfer learning has gained great prominence in the field as a way to alleviate data scarcity. This thesis presents work on methods, evaluations and resources for multilingual transfer learning. Our research shows how to improve and correctly evaluate cross-lingual embeddings obtained through alignment. It sheds light on the source of performance gains in cross-lingual transfer learning for dependency parsing. And it introduces two new resources for language generation tasks: one best viewed as a test bed for cross-domain transfer methods, and the other as a test bed for meta-learning techniques.
