64.4K Downloads · Updated 6 months ago
A family of efficient AI models under 10B parameters that deliver strong performance in science, math, and coding through innovative training techniques.
Models
17 models
falcon3:latest
4.6GB · 32K context window · Text · 6 months ago
falcon3:1b
1.8GB · 8K context window · Text · 6 months ago
falcon3:3b
2.0GB · 32K context window · Text · 6 months ago
falcon3:7b
latest · 4.6GB · 32K context window · Text · 6 months ago
falcon3:10b
6.3GB · 32K context window · Text · 6 months ago
Readme
Falcon3 represents TII’s latest advancement in efficient language models under 10B parameters, focused on enhancing science, math, and code capabilities while maintaining training efficiency.
Key Features
- Four sizes: 1B, 3B, 7B, 10B
- Depth up-scaling used to create the 10B model from the 7B model
- Knowledge distillation used to train the smaller models (1B, 3B)
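To make the depth up-scaling idea concrete, here is a minimal, illustrative sketch: a deeper model is produced by duplicating a contiguous block of middle layers, after which training would normally continue on the larger stack. The function name, layer counts, and duplicated range below are assumptions for illustration, not Falcon3's actual configuration.

```python
# Minimal sketch of depth up-scaling: grow a model by duplicating a
# contiguous block of middle layers. Layer counts are illustrative only.

def depth_upscale(layers, start, end):
    """Return a deeper layer stack in which layers[start:end] appears twice.

    layers[:end] keeps the lower portion (including the block once), and
    layers[start:] re-appends the block along with the upper portion.
    """
    return layers[:end] + layers[start:]

base = [f"layer_{i}" for i in range(8)]   # a small 8-layer "model"
scaled = depth_upscale(base, 2, 6)        # duplicate layers 2..5
print(len(scaled))                        # 12 layers after up-scaling
```

In practice the duplicated block is chosen from the middle of the network, since the lowest and highest layers tend to be the most specialized.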
Performance Highlights
- falcon3:1b outperforms smollm2:1.7b and matches gemma2:2b
- falcon3:10b achieves state-of-the-art performance in the under-13B category
- Extended context length up to 32K tokens (8K for the 1B model)
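As a usage sketch (assuming a local Ollama installation and the model tags listed above), a Falcon3 variant can be pulled and run from the CLI, and the context window raised toward its 32K limit via the `num_ctx` parameter:

```shell
# Pull and run a Falcon3 variant (requires a local Ollama install).
ollama pull falcon3:7b
ollama run falcon3:7b "What is the derivative of x^3?"

# Inside an interactive `ollama run` session, the context window can be
# raised up to the model's 32K limit (8K for falcon3:1b):
#   /set parameter num_ctx 32768
```

Raising `num_ctx` increases memory use, so the full 32K window may not fit on smaller GPUs.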