DeepSeek Coder is trained from scratch on a corpus of 87% code and 13% natural language in both English and Chinese. Each of the models is pre-trained on 2 trillion tokens.
1.3 billion parameter model
ollama run deepseek-coder
6.7 billion parameter model
ollama run deepseek-coder:6.7b
33 billion parameter model
ollama run deepseek-coder:33b
Open the terminal and run ollama run deepseek-coder
Example using curl:
curl -X POST http://localhost:11434/api/generate -d '{
"model": "deepseek-coder",
"prompt":"Why is the sky blue?"
}'
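The same endpoint can also be called from a script. Below is a minimal Python sketch that streams the generated tokens; it assumes the requests library is installed, Ollama is running on the default port 11434, and the deepseek-coder model has already been pulled. The API returns one JSON object per line, each carrying a "response" fragment.

import json
import requests

# Stream a completion from the local Ollama server (default port assumed).
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "deepseek-coder",
        "prompt": "Write a Python function that reverses a string.",
    },
    stream=True,
)

# Each line of the streamed body is a JSON object with a "response" field.
for line in response.iter_lines():
    if line:
        chunk = json.loads(line)
        print(chunk.get("response", ""), end="", flush=True)

Setting "stream": false in the request body returns a single JSON object with the full completion instead of a stream.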