Uncensored Llama 2-based model with support for a 16K context window.
13b
33.7K Pulls · Updated 12 months ago
9751b8872dce · 26GB
model · 26GB
arch llama · parameters 13B · quantization F16
system · 31B
You are a helpful AI assistant.
params · 32B
{
  "stop": [
    "User:",
    "Assistant:"
  ]
}
template · 45B
{{ .System }}
User: {{ .Prompt }}
Assistant:
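For reference, with the default system prompt above, a request such as "Why is the sky blue?" renders to the following text before generation; the "User:" and "Assistant:" stop strings end a response once the model begins a new turn:

You are a helpful AI assistant.
User: Why is the sky blue?
Assistant: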
license · 7.0kB
LLAMA 2 COMMUNITY LICENSE AGREEMENT
Llama 2 Version Release Date: July 18, 2023
"Agreement" means …
Readme
The Everything Language Model is a Llama 2-based model with a 16K context window, released by Totally Not An LLM (Kai Howard). It was trained with the EverythingLM Dataset and is uncensored.
CLI
ollama run everythinglm
Once loaded, change the context size to 16K:
/set parameter num_ctx 16384
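To keep the 16K context without setting it each session, the parameter can also be baked into a custom model via a Modelfile (a minimal sketch; the everythinglm-16k name is just an example):

# Modelfile
FROM everythinglm
PARAMETER num_ctx 16384

ollama create everythinglm-16k -f Modelfile
ollama run everythinglm-16k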
API
Example:
curl -X POST http://localhost:11434/api/generate -d '{
  "model": "everythinglm",
  "prompt": "Why is the sky blue?",
  "options": {
    "num_ctx": 16384
  }
}'
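By default the generate endpoint streams the response as newline-delimited JSON objects; a sketch of the same request returning a single JSON object instead, using the standard "stream" flag:

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "everythinglm",
  "prompt": "Why is the sky blue?",
  "stream": false,
  "options": {
    "num_ctx": 16384
  }
}'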
Reference
13b parameters · original source: Totally Not An LLM