Qwen2 is a new series of large language models from Alibaba Group.


Qwen2 is trained on data in 29 languages, including English and Chinese.

It is available in four parameter sizes: 0.5B, 1.5B, 7B, and 72B.

In the 7B and 72B models, context length has been extended to 128k tokens.

| Models | Qwen2-0.5B | Qwen2-1.5B | Qwen2-7B | Qwen2-72B |
|---|---|---|---|---|
| Params | 0.49B | 1.54B | 7.07B | 72.71B |
| Non-Emb Params | 0.35B | 1.31B | 5.98B | 70.21B |
| GQA | True | True | True | True |
| Tie Embedding | True | True | False | False |
| Context Length | 32K | 32K | 128K | 128K |
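
Each size is published under its own tag (e.g. `qwen2:0.5b`, `qwen2:1.5b`, `qwen2:7b`, `qwen2:72b`). Below is a minimal sketch of querying one of these models through Ollama's local REST chat endpoint; it assumes the model has already been pulled and the server is running on the default port, and the chosen tag and `num_ctx` value are only illustrative.

```python
# Minimal sketch: chat with a locally pulled Qwen2 model via Ollama's REST API.
# Assumes `ollama pull qwen2:7b` has been run and the server is listening on
# the default localhost:11434; tag and num_ctx below are illustrative choices.
import requests

payload = {
    "model": "qwen2:7b",  # pick a size tag: qwen2:0.5b, qwen2:1.5b, qwen2:7b, qwen2:72b
    "messages": [
        {"role": "user", "content": "Summarize the Qwen2 model family in one sentence."}
    ],
    "stream": False,
    "options": {"num_ctx": 32768},  # larger context window (7B/72B support up to 128K)
}

resp = requests.post("http://localhost:11434/api/chat", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```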

Supported languages

In addition to English and Chinese, Qwen2 supports the following 27 languages:

| Regions | Languages |
|---|---|
| Western Europe | German, French, Spanish, Portuguese, Italian, Dutch |
| Eastern & Central Europe | Russian, Czech, Polish |
| Middle East | Arabic, Persian, Hebrew, Turkish |
| Eastern Asia | Japanese, Korean |
| South-Eastern Asia | Vietnamese, Thai, Indonesian, Malay, Lao, Burmese, Cebuano, Khmer, Tagalog |
| Southern Asia | Hindi, Bengali, Urdu |
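
Prompts in any of these languages can be sent to the same local endpoint as in the earlier sketch (again assuming a locally pulled `qwen2:7b`; the Spanish prompt here is only an example):

```python
# Sketch: prompting Qwen2 in a supported non-English language (Spanish here),
# reusing the assumed local Ollama chat endpoint from the previous example.
import requests

payload = {
    "model": "qwen2:7b",  # illustrative tag; any pulled Qwen2 size works
    "messages": [{"role": "user", "content": "Explica en una frase qué es un modelo de lenguaje."}],
    "stream": False,
}
reply = requests.post("http://localhost:11434/api/chat", json=payload, timeout=120).json()
print(reply["message"]["content"])
```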

Performance

[Benchmark comparison figures]

License

With the exception of Qwen2 72B (both the instruct and base models), all models are licensed under Apache 2.0.

The Qwen2 72B models still use the original Qianwen License.