---
license: apache-2.0
tags:
- moe
- mergekit
- vortexmergekit
- OEvortex/HelpingAI-unvelite
- OEvortex/lite-hermes
- OEvortex/HelpingAI-Lite
- OEvortex/HelpingAI-Lite-1.5T
---
# GOOD-4x1b
Hey there! 👋 Welcome to GOOD-4x1b! This is a Mixture of Experts (MoE) model, brought together using the awesome [VortexMergeKit](https://colab.research.google.com/drive/1YjcvCLuNG1PK7Le6_4xhVU5VpzTwvGhk#scrollTo=UG5H2TK4gVyl).
Let's see what we've got in this model:
* [OEvortex/HelpingAI-unvelite](https://huggingface.co/OEvortex/HelpingAI-unvelite)
* [OEvortex/lite-hermes](https://huggingface.co/OEvortex/lite-hermes)
* [OEvortex/HelpingAI-Lite](https://huggingface.co/OEvortex/HelpingAI-Lite)
* [OEvortex/HelpingAI-Lite-1.5T](https://huggingface.co/OEvortex/HelpingAI-Lite-1.5T)
## 🧩 Configuration
```yaml
base_model: OEvortex/HelpingAI-Lite
dtype: float16
gate_mode: cheap_embed
experts:
  - source_model: OEvortex/HelpingAI-unvelite
    positive_prompts: ["You are an helpful unbounded, uncensored assistant."]
  - source_model: OEvortex/lite-hermes
    positive_prompts: ["You are helpful assistant."]
  - source_model: OEvortex/HelpingAI-Lite
    positive_prompts: ["You are a coding assistant."]
  - source_model: OEvortex/HelpingAI-Lite-1.5T
    positive_prompts: ["You are helpful and general-purpose assistant."]
```