79.5K Downloads · Updated 2 years ago

Models (18 tags):
notux:latest — 26GB · 32K context window · Text input
notux:8x7b
This model is a fine-tuned version of Mixtral trained on a high-quality, curated dataset. As of December 26th, 2023, it is the top-ranked MoE (Mixture of Experts) model on the Hugging Face Open LLM Leaderboard.
Links: Hugging Face · Argilla
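For local experimentation, the model can be pulled and queried through Ollama. The sketch below is an assumption-laden example, not part of the model card: it assumes the Ollama server is running, the notux:8x7b tag has been pulled (e.g. `ollama pull notux:8x7b`), and the official `ollama` Python client is installed; the prompt text is purely illustrative.

```python
# Minimal sketch: chat with notux:8x7b through the Ollama Python client.
# Assumes `pip install ollama` and a local Ollama server with the model pulled.
import ollama

response = ollama.chat(
    model="notux:8x7b",
    messages=[
        {"role": "user", "content": "Summarize what a Mixture of Experts model is."},
    ],
)

# The reply text lives under message.content in the chat response.
print(response["message"]["content"])
```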