---
license: apache-2.0
tags:
  - moe
  - frankenmoe
  - merge
  - mergekit
  - lazymergekit
  - cognitivecomputations/dolphin-2_6-phi-2
  - rhysjones/phi-2-orange
base_model:
  - cognitivecomputations/dolphin-2_6-phi-2
  - rhysjones/phi-2-orange
---

# PhiMiX-2x2B

Code is a work in progress.

PhiMiX-2x2B is a Mixture of Experts (MoE) made with the following models using mergekit:

* [cognitivecomputations/dolphin-2_6-phi-2](https://huggingface.co/cognitivecomputations/dolphin-2_6-phi-2)
* [rhysjones/phi-2-orange](https://huggingface.co/rhysjones/phi-2-orange)
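
To make the structure concrete, here is a minimal, self-contained sketch of how a sparse MoE feed-forward block with two experts routes tokens: a small gate scores each token, the top-k experts are applied, and their outputs are mixed by the softmaxed gate scores. This is not the actual PhiMiX/phixtral code; names and sizes are illustrative placeholders (phi-2 itself uses hidden size 2560 and a 4x MLP).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoExpertMoE(nn.Module):
    """Toy sparse-MoE feed-forward block: 2 experts, top-2 routing.

    With `gate_mode: random` (as in the config below), the router starts
    from random initialization rather than prompt-derived hidden states.
    """
    def __init__(self, hidden_size, intermediate_size,
                 num_experts=2, num_experts_per_tok=2):
        super().__init__()
        self.gate = nn.Linear(hidden_size, num_experts, bias=False)  # router
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(hidden_size, intermediate_size),
                nn.GELU(),
                nn.Linear(intermediate_size, hidden_size),
            )
            for _ in range(num_experts)
        ])
        self.num_experts_per_tok = num_experts_per_tok

    def forward(self, x):
        # x: (batch, seq_len, hidden_size)
        scores = self.gate(x)                                   # (B, T, num_experts)
        weights, selected = torch.topk(scores, self.num_experts_per_tok, dim=-1)
        weights = F.softmax(weights, dim=-1)                    # mixing coefficients
        out = torch.zeros_like(x)
        for k in range(self.num_experts_per_tok):
            for e, expert in enumerate(self.experts):
                mask = selected[..., k] == e                    # tokens routed to expert e at slot k
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out

# Small dimensions just to show the shapes
moe = TwoExpertMoE(hidden_size=64, intermediate_size=256)
x = torch.randn(1, 8, 64)
print(moe(x).shape)  # torch.Size([1, 8, 64])
```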

## ©️ Credits

* mlabonne's phixtral for the PhiConfig and inference code.
* mergekit, which I tweaked (you can find the PhiConfig here) mainly by adding the config to the moe_mixtral.py script from the mixtral branch.

## 🧩 Configuration

```yaml
base_model: rhysjones/phi-2-orange
gate_mode: random
dtype: float16
experts:
  - source_model: cognitivecomputations/dolphin-2_6-phi-2
    positive_prompts: [""]
  - source_model: rhysjones/phi-2-orange
    positive_prompts: [""]
```
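
To reproduce a merge like this, the configuration above can be saved to a file (e.g. `config.yaml`) and passed to mergekit's MoE script from the mixtral branch mentioned in the credits, with something like `mergekit-moe config.yaml ./PhiMiX-2x2B`. This invocation is illustrative only; the exact entrypoint and options depend on your mergekit version.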

## 💻 Usage

```python
!pip install -qU transformers bitsandbytes accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "paulilioaica/PhiMiX-2x2B_embed"

tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    trust_remote_code=True,
    model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True},
)

prompt = "How many continents are there?"
# phi-2 style instruction format: "Instruct: <question>\nOutput:"
input_text = f"Instruct: {prompt}\nOutput:"
outputs = pipeline(input_text, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
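
Note that, by default, the pipeline's `generated_text` includes the prompt itself; if you only want the completion, you can pass `return_full_text=False` (an optional tweak, not part of the original snippet):

```python
# Return only the newly generated tokens, without echoing the prompt
outputs = pipeline(input_text, max_new_tokens=256, do_sample=True,
                   temperature=0.7, top_k=50, top_p=0.95, return_full_text=False)
print(outputs[0]["generated_text"])
```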