Quantized versions of a model merge of Nous-Capybara and Tess-Yi.
34B
199 Pulls Updated 9 months ago
4ee1bc4d8507 · 19GB

model · 19GB
arch: llama · parameters: 34.4B · quantization: Q4_0

template · 53B
SYSTEM: {{ .System }}
USER: {{ .Prompt }}
ASSISTANT:

params · 43B
{"num_ctx": 4096, "stop": ["</s>"]}
Readme
I’m providing q4_0, q3_K_M, and q2_K quantizations of brucethemoose/Capybara-Tess-Yi-34B-200K-DARE-Ties (HF).

The model was created by brucethemoose by merging Nous-Capybara and Tess-Yi using DARE-TIES, a new, experimental merge technique.

The base model can support up to 200k context, but the quantizations I’ve pushed default to a 4–5k context window (this one ships with num_ctx set to 4096).
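If you want a larger context window than the shipped default, you can override num_ctx with a custom Modelfile. This is a minimal sketch; the model tag `capybara-tess-yi:34b-q4_0` is a placeholder for whichever quantization you actually pulled, and the num_ctx value is an example you should size to your available RAM/VRAM:

```
# Hypothetical Modelfile — "capybara-tess-yi:34b-q4_0" is a placeholder
# tag for the quantization you pulled from this page.
FROM capybara-tess-yi:34b-q4_0

# Raise the context window beyond the shipped 4096-token default.
# The base model was trained for up to 200k tokens; pick a value
# your hardware can hold in memory.
PARAMETER num_ctx 16384

# Keep the stop token from the original params.
PARAMETER stop </s>
```

Then build and run it with `ollama create my-capybara -f Modelfile` followed by `ollama run my-capybara`.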