The TinyLlama project is an open endeavor to train a compact 1.1B Llama model on 3 trillion tokens.

Tag: 1.1b · 284.7K pulls · Updated 10 months ago

params · fa956ab37b8c · 98B
{
  "stop": [
    "<|system|>",
    "<|user|>",
    "<|assistant|>",
    "</s>"
  ]
}
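
These stop sequences correspond to the Zephyr-style chat markers used by TinyLlama's chat variant (<|system|>, <|user|>, <|assistant|>) plus the end-of-sequence token </s>; generation halts as soon as one of them is emitted. As a minimal sketch, not taken from the model page itself, the same stops could also be passed explicitly through Ollama's REST API, assuming a local server on the default port 11434 and the model pulled as "tinyllama". Passing them is redundant here, since the params file above already supplies them, but it shows where the setting plugs in.

# Minimal sketch (assumption: local Ollama server on port 11434 and
# a completed "ollama pull tinyllama").
import requests

STOP_SEQUENCES = ["<|system|>", "<|user|>", "<|assistant|>", "</s>"]

# Zephyr-style prompt built with the same markers the stop list guards against.
prompt = (
    "<|system|>\nYou are a helpful assistant.</s>\n"
    "<|user|>\nWhy is the sky blue?</s>\n"
    "<|assistant|>\n"
)

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "tinyllama",
        "prompt": prompt,
        "raw": True,       # skip Ollama's built-in template; markers are supplied manually
        "stream": False,   # return one JSON object instead of a token stream
        "options": {"stop": STOP_SEQUENCES},
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])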