Source: https://huggingface.co/OmnicromsBrain/NeuralStar_FusionWriter_4x7b
# NeuralStar_FusionWriter_4x7b
NeuralStar_FusionWriter_4x7b is a Mixture of Experts (MoE) model made from the following models using LazyMergekit:
* mlabonne/AlphaMonarch-7B
* OmnicromsBrain/Eros_Scribe-7b
* SanjiWatsuki/Kunoichi-DPO-v2-7B
* OmnicromsBrain/NeuralStar_Fusion-7B
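LazyMergekit drives mergekit's MoE merge from a notebook cell: you define a YAML config listing the expert models and routing prompts, then run the merge. Below is a minimal sketch assembled under assumptions; the experts match the list above, but the `base_model`, `gate_mode`, and `positive_prompts` values are illustrative, since the actual config for this merge is not shown on the card.

```python
# A minimal sketch of a LazyMergekit-style MoE merge config. The expert
# models match the list above; base_model, gate_mode, and positive_prompts
# are illustrative assumptions, not the card's actual configuration.
moe_config = """
base_model: mlabonne/AlphaMonarch-7B   # assumed router/base model
gate_mode: hidden                      # route tokens by hidden-state similarity
dtype: bfloat16
experts:
  - source_model: mlabonne/AlphaMonarch-7B
    positive_prompts: ["chat", "reasoning"]            # hypothetical
  - source_model: OmnicromsBrain/Eros_Scribe-7b
    positive_prompts: ["creative writing", "prose"]    # hypothetical
  - source_model: SanjiWatsuki/Kunoichi-DPO-v2-7B
    positive_prompts: ["roleplay", "dialogue"]         # hypothetical
  - source_model: OmnicromsBrain/NeuralStar_Fusion-7B
    positive_prompts: ["storytelling", "editing"]      # hypothetical
"""

with open("config.yaml", "w") as f:
    f.write(moe_config)

# LazyMergekit then invokes mergekit's MoE entry point on this file, e.g.:
#   mergekit-moe config.yaml ./NeuralStar_FusionWriter_4x7b --copy-tokenizer
```

With `gate_mode: hidden`, mergekit initializes each expert's router weights from the hidden states the base model produces on that expert's `positive_prompts`, which is why the prompt choices shape how tokens are routed.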
⚡ Quantized Models
Special thanks to mradermacher for the static and imatrix quantized models.
* GGUF (static): https://huggingface.co/mradermacher/NeuralStar_FusionWriter_4x7b-GGUF
* GGUF (imatrix): https://huggingface.co/mradermacher/NeuralStar_FusionWriter_4x7b-i1-GGUF
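The GGUF files can be run locally with llama-cpp-python. A minimal sketch follows, assuming a Q4_K_M quant exists in the static repo; substitute whichever quant file you actually download.

```python
# A minimal sketch of running one of the GGUF quants with llama-cpp-python.
# The Q4_K_M quant choice is an assumption; pick any file from the
# mradermacher repo linked above.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="mradermacher/NeuralStar_FusionWriter_4x7b-GGUF",
    filename="*Q4_K_M.gguf",  # glob-matched against repo files (assumed quant)
    n_ctx=4096,               # context window
)

out = llm.create_chat_completion(
    messages=[{"role": "user",
               "content": "Write the opening paragraph of a mystery novel."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```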