latest · 4.1GB
Quantization of OpenPipe/mistral-ft-optimized-1227, a hierarchical SLERP merge of teknium/OpenHermes-2.5-Mistral-7B, Intel/neural-chat-7b-v3-3, meta-math/MetaMath-Mistral-7B, and openchat/openchat-3.5-1210.
7B
65 Pulls · Updated 4 months ago
4a4f568cd391 · 4.1GB
model · 4.1GB
arch llama · parameters 7.24B · quantization Q4_0
params · 82B
{"num_ctx":8192,"stop":["<|im_start","<|im_end","|im_start","|im_end"]}
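The params layer corresponds to `PARAMETER` directives in an Ollama Modelfile. A minimal sketch of a Modelfile that would produce these settings; the `FROM` tag name is an assumption, not taken from this page, and the stop strings are reproduced exactly as listed above:

```
# Hypothetical Modelfile matching the params layer shown above.
# The FROM reference is an assumed tag name.
FROM mistral-ft-optimized-1227:latest

# Context window of 8192 tokens, as in the params JSON.
PARAMETER num_ctx 8192

# Stop sequences, copied verbatim from the params layer.
PARAMETER stop "<|im_start"
PARAMETER stop "<|im_end"
PARAMETER stop "|im_start"
PARAMETER stop "|im_end"
```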
template · 156B
{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
Readme
No readme