latest
11GB
from TheBloke/Mixtral_11Bx2_MoE_19B-GGUF
107 Pulls · Updated 7 months ago
02707069aa46 · 11GB
model
arch llama · parameters 19.2B · quantization Q4_K_M
11GB
params
{"num_ctx":8192,"stop":["### Input","### Response"]}
53B
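These parameters set an 8192-token context window and stop generation when the model emits the template's "### Input" or "### Response" section markers. A minimal sketch of the equivalent Ollama Modelfile directives (each stop sequence is declared on its own line):

```
PARAMETER num_ctx 8192
PARAMETER stop "### Input"
PARAMETER stop "### Response"
```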
template
{{ if and .First .System }}### Instruction:
{{ .System }}
{{ end }}
### Input:
{{ .Prompt }}
### Response:
108B
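For illustration, on the first turn with the default system message and a user prompt of "Hello", this template renders approximately to:

```
### Instruction:
You are YiMoeMixtral, a helpful AI assistant.

### Input:
Hello
### Response:
```

On later turns the `.First` condition is false, so the "### Instruction" block with the system message is omitted.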
system
You are YiMoeMixtral, a helpful AI assistant.
45B
Readme
A mixture-of-experts (MoE) merge of
kyujinpy/Sakura-SOLAR-Instruct
jeonsworld/CarbonVillain-en-10.7B-v1
from https://huggingface.co/cloudyu/Mixtral_11Bx2_MoE_19B.
Reportedly performs well, cf.
https://www.reddit.com/r/LocalLLaMA/comments/1916896/llm_comparisontest_confirm_leaderboard_big_news/
(In my first experiments it did not perform well, though; perhaps a different prompt template is needed? See the sketch below.)
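If the stock template turns out to be the problem, one cheap experiment is to derive a variant with a plainer Alpaca-style prompt layout. The sketch below assumes the model was pulled under the local tag yimoemixtral; the alternative template is a guess, not a format confirmed by the upstream model card.

```
FROM yimoemixtral:latest

# Alpaca-style layout without the .First condition (assumed, not confirmed upstream)
TEMPLATE """{{ if .System }}{{ .System }}

{{ end }}### Instruction:
{{ .Prompt }}

### Response:
"""

PARAMETER num_ctx 8192
PARAMETER stop "### Instruction"
PARAMETER stop "### Response"
```

Build and try the variant with `ollama create yimoemixtral-alpaca -f Modelfile`, then `ollama run yimoemixtral-alpaca`.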