An open-source Mixture-of-Experts code language model that achieves performance comparable to GPT4-Turbo in code-specific tasks.
16b · 236b
437.3K Pulls · Updated 3 months ago
6ce352e55f26 · 194GB
model
arch deepseek2 · parameters 236B · quantization Q6_K · 194GB
params · 32B
{
  "stop": [
    "User:",
    "Assistant:"
  ]
}
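The `stop` parameters above tell the runtime to cut generation off as soon as either sequence appears, so the model does not keep role-playing a "User:"/"Assistant:" dialogue. A minimal sketch of that trimming behavior (the helper name `truncate_at_stop` is hypothetical, not Ollama's actual implementation):

```python
# Stop sequences from the params file above.
STOP = ["User:", "Assistant:"]

def truncate_at_stop(text: str, stops=STOP) -> str:
    """Return text up to the earliest occurrence of any stop sequence."""
    cut = len(text)
    for s in stops:
        i = text.find(s)
        if i != -1:
            cut = min(cut, i)  # keep only the text before the first stop
    return text[:cut]
```

For example, `truncate_at_stop("return a + b\nUser: thanks")` drops everything from `"User:"` onward.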
template · 705B
{{- if .Suffix }}<|fim▁begin|>{{ .Prompt }}<|fim▁hole|>{{ .Suffix }}<|fim▁end|>
{{ …
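The template's `.Suffix` branch implements fill-in-the-middle (FIM): the code before the cursor becomes the prompt, the code after it becomes the suffix, and the model fills the `<|fim▁hole|>` gap. A sketch of how a client might assemble such a prompt from the sentinels shown above (the function name `fim_prompt` is illustrative):

```python
def fim_prompt(prefix: str, suffix: str) -> str:
    """Wrap prefix/suffix in DeepSeek's FIM sentinel tokens,
    mirroring the template's .Suffix branch."""
    return f"<|fim\u2581begin|>{prefix}<|fim\u2581hole|>{suffix}<|fim\u2581end|>"
```

Note the sentinel tokens use "▁" (U+2581), not a plain underscore; passing literal underscores would bypass the special tokens.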
license · 1.1kB
MIT License
Copyright (c) 2023 DeepSeek
Permission is hereby granted, free of charge, to any perso…
license · 14kB
DEEPSEEK LICENSE AGREEMENT
Version 1.0, 23 October 2023
Copyright (c) 2023 DeepSeek
Section I: PR…
Readme
DeepSeek-Coder-V2 is an open-source Mixture-of-Experts (MoE) code language model that achieves performance comparable to GPT4-Turbo in code-specific tasks. DeepSeek-Coder-V2 is further pre-trained from DeepSeek-Coder-V2-Base with 6 trillion tokens sourced from a high-quality and multi-source corpus.
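Both parameter counts listed at the top of the page are available as tags, so the model can be pulled and run with the standard Ollama CLI (assumes `ollama` is installed and the daemon is running):

```shell
# 16b fits on consumer hardware; 236b (Q6_K, 194GB) needs server-class memory.
ollama run deepseek-coder-v2:16b
```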