1. 📘 Topic and Domain: Large-scale multilingual machine translation centered on both Chinese and English, covering 60 languages and 234 translation directions.
2. 💡 Previous Research and New Ideas: Builds on prior LLM-based translation research but counters its English-centric bias by introducing Chinese as a second pivot language, and proposes two techniques: Strategic Downsampling and Parallel Multilingual Prompting.
3. ❓ Problem: How to achieve broad language coverage and consistent translation quality while mitigating the English-centric bias of existing multilingual machine translation systems.
4. 🛠️ Methods: A two-stage adaptation framework combining Continued Pre-training (CPT) and Supervised Fine-tuning (SFT), with Strategic Downsampling to prevent directional degeneration and Parallel Multilingual Prompting to enhance cross-lingual transfer (illustrative sketches of both techniques follow this list).
5. 📊 Results and Evaluation: The 4B model (LMT-60-4B) achieved state-of-the-art performance among models of comparable scale, surpassing much larger models such as Aya-101-13B and NLLB-54B, with consistent quality across high-, medium-, and low-resource languages.
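
Below is a minimal Python sketch of how Strategic Downsampling might work, assuming its goal is to cap over-represented translation directions so that dominant (typically pivot-centric) data cannot crowd out low-resource directions in the training mixture. The function name, cap value, and corpus keying scheme are illustrative assumptions, not the paper's exact recipe.

```python
import random


def strategic_downsample(corpus, cap=200_000, seed=0):
    """Cap the number of examples kept per translation direction.

    `corpus` maps a direction key like ("en", "de") to a list of
    parallel examples. Directions with more than `cap` examples are
    randomly subsampled; smaller directions are kept whole. This
    flattens the imbalance that can cause directional degeneration.
    (Illustrative sketch; the paper's cap heuristic may differ.)
    """
    rng = random.Random(seed)
    out = {}
    for direction, examples in corpus.items():
        if len(examples) > cap:
            out[direction] = rng.sample(examples, cap)
        else:
            out[direction] = list(examples)
    return out
```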
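And a hedged sketch of Parallel Multilingual Prompting, assuming it packs several translations of the same sentence into a single prompt so the model can exploit cross-lingual signal when producing the target. The `build_pmp_prompt` helper and the template wording are hypothetical, not the paper's verbatim prompt format.

```python
def build_pmp_prompt(parallel, src_lang, tgt_lang):
    """Assemble a parallel multilingual prompt.

    `parallel` maps language codes to translations of the same
    sentence. Auxiliary languages are listed first, then the source
    line, and the target line is left open for the model to complete.
    """
    aux = [lang for lang in parallel if lang not in (src_lang, tgt_lang)]
    lines = [f"{lang}: {parallel[lang]}" for lang in aux]
    lines.append(f"{src_lang}: {parallel[src_lang]}")
    lines.append(f"{tgt_lang}:")  # the model fills in the translation
    return "\n".join(lines)


# Example with hypothetical data:
# parallel = {"de": "Guten Morgen.", "fr": "Bonjour.",
#             "en": "Good morning.", "zh": "早上好。"}
# build_pmp_prompt(parallel, src_lang="en", tgt_lang="zh")
# -> "de: Guten Morgen.\nfr: Bonjour.\nen: Good morning.\nzh:"
```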