
Crosslingual Capabilities and Knowledge Barriers in Multilingual Large Language Models

📃 Paper · 🤗 Data · 🤗 Models

Code for the paper "Crosslingual Capabilities and Knowledge Barriers in Multilingual Large Language Models".

This is not an officially supported Google product.

1. Introduction

Large language models (LLMs) are typically multilingual due to pretraining on diverse multilingual corpora. But can these models relate corresponding concepts across languages, effectively being crosslingual?

This study evaluates six state-of-the-art LLMs on inherently crosslingual tasks. We observe that while these models show promising surface-level crosslingual abilities on machine translation and embedding space analyses, they struggle with deeper crosslingual knowledge transfer, revealing a crosslingual knowledge barrier in both general (MMLU benchmark) and domain-specific (Harry Potter quiz) contexts.
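
As a rough illustration of the kind of crosslingual evaluation described above, the sketch below composes a multiple-choice prompt whose question and answer options are in different languages, so answering correctly requires transferring knowledge across languages. This is a minimal sketch with hypothetical strings and helper names, not the exact prompt template used in the paper.

# Minimal sketch of a crosslingual multiple-choice prompt: the question is
# shown in one language while the answer options stay in another.
# (Illustrative only; the paper's actual prompt format may differ.)

def build_crosslingual_prompt(question: str, options: list[str]) -> str:
    # question: e.g. an MMLU question translated into French
    # options:  the original English answer choices
    letters = ["A", "B", "C", "D"]
    lines = [question]
    lines += [f"{letter}. {option}" for letter, option in zip(letters, options)]
    lines.append("Answer:")
    return "\n".join(lines)

# Hypothetical example: French question paired with English options.
prompt = build_crosslingual_prompt(
    "Quelle est la capitale de l'Australie ?",
    ["Sydney", "Canberra", "Melbourne", "Perth"],
)
print(prompt)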

We observe that simple inference-time mitigation methods offer only limited improvement. In contrast, fine-tuning LLMs on mixed-language data effectively reduces these gaps, even when using out-of-domain datasets like WikiText. Our findings suggest the need for explicit optimization to unlock the full crosslingual potential of LLMs.
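
As a rough sketch of what mixed-language fine-tuning data can look like, the snippet below translates a random subset of sentences in each example into a second language, so that a single training example mixes languages. The translate() helper and the 50/50 mixing ratio are assumptions for illustration, not the exact recipe from the paper.

import random

def mix_languages(sentences, translate, target_lang="fr", mix_ratio=0.5):
    # sentences:  list of sentences from one document (e.g. WikiText)
    # translate:  placeholder callable (any MT model or API) mapping
    #             (sentence, target_lang) -> translated sentence
    mixed = []
    for sentence in sentences:
        if random.random() < mix_ratio:
            mixed.append(translate(sentence, target_lang))  # translated sentence
        else:
            mixed.append(sentence)  # kept in the original language
    return " ".join(mixed)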

2. Setup

Create a Conda environment.

conda create -n crossknow python=3.9
conda activate crossknow

Install PyTorch. We tested the code with the following version:

conda install pytorch==2.2.1 pytorch-cuda=12.1 -c pytorch -c nvidia
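
Optionally, sanity-check the installation (this verification step is our addition, not part of the original instructions):

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"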

Install other packages.

bash install.sh

3. Getting Started with Experiments

We validate our findings on a wide range of tasks and release the code for each in a separate directory of this repository.

Citation

If you find this useful in your research, please consider citing our paper:

@misc{chua2024crosslingual,
      title={Crosslingual Capabilities and Knowledge Barriers in Multilingual Large Language Models}, 
      author={Lynn Chua and Badih Ghazi and Yangsibo Huang and Pritish Kamath and Ravi Kumar and Pasin Manurangsi and Amer Sinha and Chulin Xie and Chiyuan Zhang},
      year={2024},
      eprint={2406.16135},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
