Qwen2.5-Coder Podcast

About this listen

🔷 Qwen2.5-Coder Technical Report

The report introduces the Qwen2.5-Coder series, covering the Qwen2.5-Coder-1.5B and Qwen2.5-Coder-7B models. Both are purpose-built for coding tasks and were pre-trained on 5.5 trillion code-related tokens. The authors place particular emphasis on data quality, describing detailed cleaning and filtering pipelines, and on training strategy, combining file-level and repo-level pre-training stages. The models were evaluated on a broad set of benchmarks spanning code generation, completion, reasoning, repair, and text-to-SQL, where they delivered strong results, in several cases outperforming much larger models. The report closes with directions for future work, including scaling model size and strengthening reasoning abilities.

📎 Link to paper
