Zephyr is a series of fine-tuned versions of the Mistral and Mixtral models that are trained to act as helpful assistants.
7b
141b
234.3K Pulls · Updated 10 months ago
7bc5b262ff30 · 281GB
model
arch llama · parameters 141B · quantization F16 · 281GB
params
{
  "stop": [
    "<|system|>",
    "<|user|>",
    "<|assistant|>",
    "</s>"
  ]
}
98B
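The stop tokens above end generation whenever the model starts a new chat turn. Below is a minimal sketch of passing the same stop list through Ollama's REST API; it assumes an Ollama server on the default localhost:11434, and the prompt text is purely illustrative.

# Sketch: applying the model's stop tokens in an /api/generate request.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "zephyr",
        "prompt": "Why is the sky blue?",
        "stream": False,
        # Mirrors the built-in stop parameters shown above.
        "options": {"stop": ["<|system|>", "<|user|>", "<|assistant|>", "</s>"]},
    },
)
print(response.json()["response"])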
template
{{ if .System }}<|system|>
{{ .System }}</s>
{{ end }}{{ if .Prompt }}<|user|>
{{ .Prompt }}</s>
{{ end }}<|assistant|>
{{ .Response }}</s>
139B
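To make the chat format concrete, here is a small sketch of the prompt string the template above expands to for one system and one user turn; the render helper and example messages are illustrative, not part of the model files.

# Sketch: the prompt layout produced by the Zephyr chat template.
def render_prompt(system: str, prompt: str) -> str:
    parts = []
    if system:
        parts.append(f"<|system|>\n{system}</s>\n")
    if prompt:
        parts.append(f"<|user|>\n{prompt}</s>\n")
    # The model's reply is generated after this marker.
    parts.append("<|assistant|>\n")
    return "".join(parts)

print(render_prompt("You are a helpful assistant.", "Explain MoE models briefly."))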
license
Apache License
Version 2.0, January 2004
11kB
Readme
Zephyr is a series of language models that are trained to act as helpful assistants. Zephyr 141B-A35B is the latest model in the series, and is a fine-tuned version of Mixtral 8x22b.
Sizes
zephyr:141b: A Mixture of Experts (MoE) model with 141B total parameters and 35B active parameters.
zephyr:7b: The original Zephyr model. A usage sketch for both tags follows below.
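A minimal usage sketch for either tag, assuming the official ollama Python client (pip install ollama) and a locally running Ollama server; the prompt and the choice of tag are illustrative.

# Sketch: chatting with a specific Zephyr tag via the ollama Python client.
# Swap "zephyr:7b" for "zephyr:141b" if the larger MoE variant is pulled locally.
import ollama

reply = ollama.chat(
    model="zephyr:7b",
    messages=[{"role": "user", "content": "Summarize what a Mixture of Experts model is."}],
)
print(reply["message"]["content"])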