Result Detail
Soft Language Prompts for Language Transfer
Cross-lingual knowledge transfer, especially between high- and low-resource
languages, remains challenging in natural language processing (NLP). This study
offers insights for improving cross-lingual NLP applications through the
combination of parameter-efficient fine-tuning methods. We systematically explore
strategies for enhancing cross-lingual transfer through the incorporation of
language-specific and task-specific adapters and soft prompts. We present
a detailed investigation of various combinations of these methods, exploring
their efficiency across 16 languages, focusing on 10 mid- and low-resource
languages. We further present, to our knowledge, the first use of soft prompts for
language transfer, a technique we call soft language prompts. Our findings
demonstrate that, contrary to the claims of previous work, a combination of
language and task adapters does not always perform best; instead, combining a soft
language prompt with a task adapter outperforms most configurations in many
cases.
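The core idea named in the abstract, a soft language prompt, is a small matrix of trainable vectors prepended to the token embeddings of a (typically frozen) multilingual model, so that only the prompt carries the language-specific parameters. The sketch below illustrates only that prepending step; the shapes, names, and initialization are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Toy hyperparameters (assumed for illustration, not from the paper).
PROMPT_LEN = 8   # number of soft prompt vectors per language
HIDDEN = 16      # embedding dimension of the backbone model

rng = np.random.default_rng(0)

# A language-specific soft prompt: in prompt tuning this matrix is the
# trainable parameter, while the backbone model stays frozen.
language_prompt = rng.normal(scale=0.02, size=(PROMPT_LEN, HIDDEN))

def prepend_soft_prompt(token_embeddings: np.ndarray) -> np.ndarray:
    """Prepend the soft language prompt to a (seq_len, hidden) embedding matrix,
    yielding the extended sequence the frozen model would actually consume."""
    return np.concatenate([language_prompt, token_embeddings], axis=0)

# Embeddings of a 5-token input sentence (random stand-ins).
tokens = rng.normal(size=(5, HIDDEN))
extended = prepend_soft_prompt(tokens)
print(extended.shape)  # (13, 16): 8 prompt vectors + 5 token embeddings
```

In the configurations the abstract compares, such a language prompt would be trained per language and then combined with a task-specific adapter inside the frozen backbone; only the concatenation interface shown here is common to all of them.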
Keywords: cross-lingual transfer, multilinguality, less-resourced languages, language
representations
@inproceedings{BUT194218,
author="Ivan {Vykopal} and {} and {} and Marián {Šimko}",
title="Soft Language Prompts for Language Transfer",
year="2025",
pages="10294--10313",
publisher="Association for Computational Linguistics",
address="Albuquerque",
doi="10.18653/v1/2025.naacl-long.517",
isbn="979-8-8917-6189-6",
url="https://aclanthology.org/2025.naacl-long.517/"
}