latest · 1B · 783MB
172 Pulls · Updated 8 months ago
bbc433e11a7b · 783MB

model (783MB)
arch: llama · parameters: 1.10B · quantization: Q5_K_M

system (31B)
You are a helpful AI assistant.

template (70B)
<|system|>
{{ .System }}</s>
<|user|>
{{ .Prompt }}</s>
<|assistant|>

params (116B)
{"stop": ["<|system|>", "<|user|>", "<|assistant|>", "</s>"], "temperature": 0.7}
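The template above is a Go text/template in the Zephyr chat style: Ollama fills the `{{ .System }}` and `{{ .Prompt }}` slots with the system and user messages, and the params layer supplies the default stop strings and a temperature of 0.7. As a minimal sketch (assuming a local Ollama server on its default port, and `tinyllama` as the model name, which is an assumption here), the same settings can also be passed per request through the REST API:

```python
import requests

# Minimal sketch: query the model through Ollama's local REST API.
# Assumes an Ollama server on the default port 11434 and that the
# model was pulled under the name "tinyllama".
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "tinyllama",
        "prompt": "Explain what quantization does to a model.",
        "stream": False,
        # Optional overrides; the model already ships these defaults
        # in its params layer, so this block merely restates them.
        "options": {
            "temperature": 0.7,
            "stop": ["<|system|>", "<|user|>", "<|assistant|>", "</s>"],
        },
    },
    timeout=120,
)
print(response.json()["response"])
```

Omitting the `options` block leaves the model's own shipped params in effect.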
Readme
The TinyLlama project aims to pretrain a 1.1B-parameter Llama model on 3 trillion tokens. Training began on 2023-09-01.
TinyLlama adopts exactly the same architecture and tokenizer as Llama 2, so it can be used as a drop-in replacement in many open-source projects built on Llama. At only 1.1B parameters, it is also compact enough to suit applications with tight compute and memory budgets.
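As a rough illustration of that drop-in property, TinyLlama loads through the same Llama code paths as any Llama 2 checkpoint. The Hugging Face repo id below is an assumption for illustration, not something stated on this page:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint id; substitute whichever TinyLlama build you use.
model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"

# The same Auto* classes that load Llama 2 handle TinyLlama, because
# the architecture and tokenizer are identical.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("TinyLlama is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```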