A general-purpose chat model based on Llama and Llama 2, with context windows from 2K to 16K tokens.

7B, 13B, 30B

130.2K Pulls Updated 10 months ago

111 Tags

7adfc8235793 · 76B
{
  "num_ctx": 16384,
  "rope_frequency_scale": 0.125,
  "stop": ["USER:", "ASSISTANT:"]
}
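The `rope_frequency_scale` of 0.125 is a linear RoPE scaling factor: it stretches the model's native positional range so the extended `num_ctx` fits. A minimal sketch of the arithmetic, assuming a 2K native Llama context window (consistent with the 2K-to-16K range stated above):

```python
base_ctx = 2048   # assumed native Llama context window (the "2K" end of the range)
scale = 0.125     # rope_frequency_scale from the params blob above

# Linear RoPE scaling divides position frequencies by 1/scale,
# extending the usable context window proportionally.
extended_ctx = int(base_ctx / scale)
print(extended_ctx)  # 16384, matching num_ctx
```

In other words, 0.125 = 2048 / 16384, so the scale factor and `num_ctx` are two views of the same extension ratio.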