latest
26GB
Aurora represents the Chinese version of the MoE model, refined from the Mixtral-8x7B architecture. It adeptly unlocks the model's potential for bilingual dialogue in both Chinese and English across a wide range of open-domain topics.
8x7B
62 Pulls Updated 6 months ago
6bd182a7132e · 40B
You are Aurora, a helpful AI assistant.