A fine-tuned Llama 2 model that answers medical questions, trained on an open-source medical dataset.
7b
39.9K Pulls Updated 12 months ago
7f1509d663e8 · 4.8GB
model
arch llama · parameters 6.74B · quantization Q5_K_M · 4.8GB

params · 31B
{"stop": ["User:", "Assistant:"]}

template · 45B
{{ .System }}
User: {{ .Prompt }}
Assistant:
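The template above is a Go text/template that Ollama fills in with the system message and the user's prompt before sending it to the model. As a rough illustration only (a Python sketch of the substitution, not Ollama's actual rendering code), the assembled prompt looks like this:

```python
# Mirrors the Modelfile template: system text, then the user turn,
# then an open "Assistant:" cue that the model completes.
TEMPLATE = "{system}\nUser: {prompt}\nAssistant:"

def render(system: str, prompt: str) -> str:
    """Illustrative stand-in for Ollama's template rendering."""
    return TEMPLATE.format(system=system, prompt=prompt)

print(render("You are a helpful medical assistant.",
             "What causes iron-deficiency anemia?"))
```

The "stop" parameters ("User:", "Assistant:") cut generation off before the model starts writing the next turn itself.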
Readme
MedLlama2 by Siraj Raval is a Llama 2-based model trained with MedQA dataset to be able to provide medical answers to questions. It is not intended to replace a medical professional, but to provide a starting point for further research.
CLI
Open the terminal and run:

ollama run medllama2
API
Example:
curl -X POST http://localhost:11434/api/generate -d '{
  "model": "medllama2",
  "prompt": "A 35-year-old woman presents with a persistent dry cough, shortness of breath, and fatigue. She is initially suspected of having asthma, but her spirometry results do not improve with bronchodilators. What could be the diagnosis?"
}'
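By default, the /api/generate endpoint streams its answer back as newline-delimited JSON objects, each carrying a fragment of text in its "response" field, with "done": true on the final chunk. A minimal Python sketch of reassembling such a stream (the chunk shape follows Ollama's API; the sample lines here are illustrative, not real model output):

```python
import json

def collect_response(ndjson_lines):
    """Concatenate the "response" fragments from Ollama's streaming
    newline-delimited JSON output into one answer string."""
    parts = []
    for line in ndjson_lines:
        if not line.strip():
            continue
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):  # final chunk of the stream
            break
    return "".join(parts)

# Illustrative chunks of the kind /api/generate streams back:
sample = [
    '{"model":"medllama2","response":"A possible","done":false}',
    '{"model":"medllama2","response":" diagnosis is...","done":true}',
]
print(collect_response(sample))
```

In a real client you would iterate over the HTTP response body line by line instead of a list of sample strings.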
Memory requirements
- 7b models generally require at least 8GB of RAM.