Llama-3-Taiwan-70B is a 70B-parameter model fine-tuned on a large corpus of Traditional Mandarin and English data using the Llama-3 architecture. It achieves state-of-the-art performance on a range of Traditional Mandarin NLP benchmarks.
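
Once the model has been pulled locally, it can be queried through the official Ollama Python client. The sketch below is a minimal example, assuming a local tag named `llama-3-taiwan-70b`; substitute the tag shown in this page's tag list if it differs.

```python
# Minimal sketch using the Ollama Python client (pip install ollama).
# The model tag "llama-3-taiwan-70b" is an assumption -- replace it with
# the tag you actually pulled.
import ollama

response = ollama.chat(
    model="llama-3-taiwan-70b",
    messages=[
        # A Traditional Mandarin prompt, matching the model's training data.
        {"role": "user", "content": "請用繁體中文簡單介紹台灣的夜市文化。"}
    ],
)
print(response["message"]["content"])
```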



Tag: 4b05ee64ec2b • 43GB • Updated 2 months ago