2.7B uncensored Dolphin model by Eric Hartford, based on the Phi language model by Microsoft Research.
2.7b · 50.4K Pulls · Updated 11 months ago
c5761fc77240 · 1.6GB
model · arch phi2 · parameters 2.78B · quantization Q4_0 · 1.6GB
system · 40B
You are Dolphin, a helpful AI assistant.

params · 59B
{"stop":["<|im_start|>","<|im_end|>"]}
template · 106B
<|im_start|>system
{{ .System }}<|im_end|>
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
license · 10kB
MICROSOFT RESEARCH LICENSE TERMS
IF YOU LIVE IN THE UNITED STATES, PLEASE READ THE “BINDING ARBIT…
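
The template and params layers above define a ChatML-style prompt format: the system message and user prompt are spliced between <|im_start|>/<|im_end|> markers, and generation halts when either marker is emitted. The following is a minimal Python sketch of that assembly, for illustration only; it is not Ollama's actual template renderer, and the function and variable names are hypothetical.

# Illustrative sketch of how the template and stop parameters shape a prompt.
STOP_TOKENS = ["<|im_start|>", "<|im_end|>"]  # from the params layer

def render_prompt(system: str, prompt: str) -> str:
    """Fill the {{ .System }} and {{ .Prompt }} slots of the template layer."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{prompt}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

if __name__ == "__main__":
    text = render_prompt(
        "You are Dolphin, a helpful AI assistant.",  # the system layer above
        "Why is the sky blue?",
    )
    print(text)
    # Decoding stops as soon as the model produces either stop token,
    # so the reply never spills into a new <|im_start|> turn.
    print("stop on:", STOP_TOKENS)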
Readme
Dolphin Phi 2.6 is an uncensored model based on the 2.7B Phi model by Microsoft Research, trained on datasets similar to those used for other Dolphin variants such as Dolphin Mixtral.
It was created by Eric Hartford and Cognitive Computations.
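
A short usage sketch, assuming the model is pulled locally under the dolphin-phi tag (e.g. ollama pull dolphin-phi) and the official ollama Python client is installed (pip install ollama); the tag and client call are standard Ollama workflow rather than anything stated on this page.

import ollama

response = ollama.chat(
    model="dolphin-phi",
    messages=[
        # The system layer above is the default; overriding it here is optional.
        {"role": "system", "content": "You are Dolphin, a helpful AI assistant."},
        {"role": "user", "content": "Explain what quantization Q4_0 means."},
    ],
)
print(response["message"]["content"])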