Just-Go-Parallel (Parallel Last (uni): EN→ZH)
The model repository for the "Parallel Last (uni): EN→ZH" setting of the following paper:
Just Go Parallel: Improving the Multilingual Capabilities of Large Language Models
Muhammad Reza Qorib, Junyi Li, and Hwee Tou Ng
The 63rd Annual Meeting of the Association for Computational Linguistics (to appear)
- Paper: arXiv
- Codebase: https://github.com/nusnlp/Just-Go-Parallel/
We use the architecture and tokenizer of TinyLlama v1.1. Please use transformers>=4.35.
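Below is a minimal loading sketch, assuming transformers>=4.35 and a working PyTorch installation. The prompt format shown is purely illustrative and is not prescribed by the paper.

```python
# Minimal sketch: load the model and tokenizer with Hugging Face transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nusnlp/JGP-Parallel-Last-EN-ZH"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Illustrative prompt only; adapt to your own evaluation setup.
inputs = tokenizer("English: How are you?\nChinese:", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```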
Models
The main branch of the repository contains the best-performing model that was evaluated in the paper. Other checkpoints produced during training will also be hosted in this repository under different branch names (called "revisions" on Hugging Face), with each branch name indicating the number of training steps; see the loading sketch after the model list below.
- No Parallel: nusnlp/JGP-No-Parallel
- Multilingual: nusnlp/JGP-Multilingual
- Parallel Non-Adjacent: nusnlp/JGP-Parallel-Non-Adjacent
- Parallel First: nusnlp/JGP-Parallel-First
- Parallel Distributed: nusnlp/JGP-Parallel-Distributed
- Parallel Last (all): nusnlp/JGP-Parallel-Last-all
- Parallel Last (uni):
- EN→ID: nusnlp/JGP-Parallel-Last-EN-ID
- ID→EN: nusnlp/JGP-Parallel-Last-ID-EN
- EN→ZH: nusnlp/JGP-Parallel-Last-EN-ZH
- ZH→EN: nusnlp/JGP-Parallel-Last-ZH-EN
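A sketch of loading an intermediate checkpoint from a specific branch is shown below. The branch name used here is hypothetical; the actual branch names (training-step counts) are listed on the repository's branches page.

```python
# Sketch: load an intermediate checkpoint by passing the branch name
# as the `revision` argument. "step-10000" is a hypothetical example.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nusnlp/JGP-Parallel-Last-EN-ZH"
revision = "step-10000"  # replace with an actual branch name from the repo
tokenizer = AutoTokenizer.from_pretrained(model_id, revision=revision)
model = AutoModelForCausalLM.from_pretrained(model_id, revision=revision)
```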