latest · 1.6GB
Pulled from https://huggingface.co/second-state/Octopus-v2-GGUF/blob/main/Octopus-v2-Q4_K_M.gguf
2B · 119 Pulls · Updated 5 months ago
8418043e57a3 · 1.6GB
model
arch gemma · parameters 2.51B · quantization Q4_K_M · 1.6GB
Readme
No readme.