mixtral:8x7b-instruct-v0.1-q4_K_S

1.3M Pulls · Updated 8 months ago

A set of Mixture of Experts (MoE) models with open weights by Mistral AI, available in 8x7b and 8x22b parameter sizes.

tools · 8x7b · 8x22b
Template · 53d74de0d84c · 84B
[INST] {{ if .System }}{{ .System }} {{ end }}{{ .Prompt }} [/INST] {{ .Response }}
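
This Go template controls how Ollama renders a request into Mixtral's [INST] instruction format: the optional system message and the user prompt are placed between [INST] and [/INST], and the model's reply fills the {{ .Response }} slot. Below is a minimal Python sketch of that substitution (render_mixtral_prompt is an illustrative name, not part of Ollama):

def render_mixtral_prompt(prompt: str, system: str = "") -> str:
    # Mirrors the template above:
    # [INST] {{ if .System }}{{ .System }} {{ end }}{{ .Prompt }} [/INST] {{ .Response }}
    # The {{ .Response }} slot is left empty here; the model generates it.
    system_part = f"{system} " if system else ""
    return f"[INST] {system_part}{prompt} [/INST] "

print(render_mixtral_prompt("Why is the sky blue?", system="You are a concise assistant."))
# -> [INST] You are a concise assistant. Why is the sky blue? [/INST]

To chat with the model directly: ollama run mixtral:8x7b-instruct-v0.1-q4_K_S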