mixtral:8x7b-instruct-v0.1-q4_K_S

1.3M Pulls · Updated 8 months ago

A set of Mixture of Experts (MoE) models with open weights by Mistral AI, available in 8x7b and 8x22b parameter sizes.

Tags: tools · 8x7b · 8x22b
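
A minimal sketch of querying this tag through Ollama's local REST API, assuming an Ollama server at the default http://localhost:11434 and that the model has already been pulled (for example with "ollama pull mixtral:8x7b-instruct-v0.1-q4_K_S"); the prompt text is illustrative.

import requests

# Single non-streaming completion request against the local Ollama server.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "mixtral:8x7b-instruct-v0.1-q4_K_S",
        "prompt": "Explain mixture-of-experts routing in two sentences.",
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=600,
)
resp.raise_for_status()
print(resp.json()["response"])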
params · ed11eda7790d · 30B
{
  "stop": [
    "[INST]",
    "[/INST]"
  ]
}
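
These are the tag's default parameters: the stop sequences match Mixtral's [INST] / [/INST] instruction delimiters, so generation halts if the model starts emitting a new instruction turn. The sketch below shows the same stop sequences being set per request through the "options" field of the generate API; the values simply mirror the defaults above and are shown for illustration.

import requests

# Restate the model's default stop sequences at request time via "options".
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "mixtral:8x7b-instruct-v0.1-q4_K_S",
        "prompt": "[INST] List three uses of sparse MoE models. [/INST]",
        "stream": False,
        "options": {"stop": ["[INST]", "[/INST]"]},  # same values as the params blob above
    },
    timeout=600,
)
resp.raise_for_status()
print(resp.json()["response"])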