latest · 236c9d419ae8 · 5.7GB · 8B · 986 Pulls · Updated 7 weeks ago
model · arch: llama · parameters: 8.03B · quantization: Q5_K_M · 5.7GB
template · 260B
{{ if .System }}<|start_header_id|>system<|end_header_id|>
{{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>
{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>
{{ .Response }}<|eot_id|>
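As a rough illustration of what the Go template above produces (not Ollama's actual templating engine), the prompt assembly can be mimicked in Python; `render` is a hypothetical helper written for this sketch:

```python
def render(system=None, prompt=None):
    """Mimic the model's chat template: optional system block, optional
    user block, then the assistant header the model completes after."""
    out = ""
    if system:
        out += f"<|start_header_id|>system<|end_header_id|>\n{system}<|eot_id|>"
    if prompt:
        out += f"<|start_header_id|>user<|end_header_id|>\n{prompt}<|eot_id|>"
    # The template always ends with an open assistant turn.
    out += "<|start_header_id|>assistant<|end_header_id|>\n"
    return out

print(render("Answer me in Spanish.", "Hola"))
```

The `<|eot_id|>` markers emitted here are the same tokens listed under `stop` in the params layer, which is how generation is cut off at the end of the assistant turn.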
system · 21B
Answer me in Spanish.
params · 161B
{
  "num_ctx": 128000,
  "stop": ["<|start_header_id|>", "<|end_header_id|>", "<|eot_id|>", "<|reserved_special_token"],
  "temperature": 1
}
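To use this params layer programmatically, it maps onto the `options` field of an Ollama `/api/generate` request. A minimal sketch, assuming the model is pulled locally under the tag `llama3.1-uncensored` (the exact local tag is an assumption):

```python
import json

# Build the request body; "options" mirrors the params layer above.
payload = {
    "model": "llama3.1-uncensored",  # assumed local tag
    "prompt": "Hola",
    "stream": False,
    "options": {
        "num_ctx": 128000,
        # Stop strings copied verbatim from the params layer; the last
        # entry appears truncated there and is kept as stored.
        "stop": [
            "<|start_header_id|>",
            "<|end_header_id|>",
            "<|eot_id|>",
            "<|reserved_special_token",
        ],
        "temperature": 1,
    },
}
print(json.dumps(payload, indent=2))
```

POSTing this body to `http://localhost:11434/api/generate` would run the model with the same context window, stop tokens, and temperature the Modelfile bakes in.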
Readme
Llama3.1-Uncensored
llama3.1-uncensored responds in Spanish.