---
license: apache-2.0
tags:
  - moe
  - frankenmoe
  - merge
  - mergekit
  - lazymergekit
  - cognitivecomputations/dolphin-2_6-phi-2
  - rhysjones/phi-2-orange
base_model:
  - cognitivecomputations/dolphin-2_6-phi-2
  - rhysjones/phi-2-orange
---

# PhiMiX-2x2B

PhiMiX-2x2B is a Mixture of Experts (MoE) made with the following models using mergekit:

- [cognitivecomputations/dolphin-2_6-phi-2](https://huggingface.co/cognitivecomputations/dolphin-2_6-phi-2)
- [rhysjones/phi-2-orange](https://huggingface.co/rhysjones/phi-2-orange)

## ©️ Credits

- mlabonne's phixtral for the PhiConfig and inference code.
- mergekit, which I tweaked mainly by adding the PhiConfig to the moe_mixtral.py script on the mixtral branch (you can find the PhiConfig there); a rough sketch of this registration pattern is shown below.
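
For illustration only, here is a minimal sketch of what registering a custom config class for such a merge can look like. The class name, field names, and default values below are hypothetical placeholders, not the actual PhiConfig from phixtral:

```python
# Hypothetical sketch: a minimal custom config class in the style used by
# phixtral's PhiConfig. Field names and defaults are illustrative, not the
# actual values from the phixtral repository.
from transformers import PretrainedConfig


class PhiMoEConfig(PretrainedConfig):
    model_type = "phi-msft"  # model_type reported by phixtral-style models

    def __init__(
        self,
        n_embd: int = 2560,            # hidden size (placeholder value)
        n_layer: int = 32,             # number of transformer blocks (placeholder)
        n_head: int = 32,              # attention heads (placeholder)
        num_local_experts: int = 2,    # experts in each MoE layer
        num_experts_per_tok: int = 2,  # experts routed per token
        **kwargs,
    ):
        self.n_embd = n_embd
        self.n_layer = n_layer
        self.n_head = n_head
        self.num_local_experts = num_local_experts
        self.num_experts_per_tok = num_experts_per_tok
        super().__init__(**kwargs)
```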

## ⏱️ Benchmarks

| Model | AGIEval | GPT4All | TruthfulQA | Bigbench | Average |
|---|---|---|---|---|---|
| **PhiMiX-2x2B** | 33.34 | **71.75** | 49.25 | **37.62** | **47.99** |
| phixtral-4x2_8 | 33.91 | 70.44 | 48.78 | 37.68 | 47.7 |
| phixtral-2x2_8 | 34.1 | 70.44 | 48.78 | 37.82 | 47.78 |
| *phi-2-orange* | 33.37 | 71.33 | 49.87 | 37.3 | 47.97 |
| *dolphin-2_6-phi-2* | 33.12 | 69.85 | 47.39 | 37.2 | 46.89 |

This merge is highlighted in **bold**, its base models are in *italics*, and individual cells are **bold** where the merge exceeds both of its base models.

## 🧩 Configuration

```yaml
base_model: rhysjones/phi-2-orange
gate_mode: cheap_embed
dtype: float16
experts:
  - source_model: cognitivecomputations/dolphin-2_6-phi-2
    positive_prompts: ["research, logic, math, science"]
  - source_model: rhysjones/phi-2-orange
    positive_prompts: ["programming, reasoning"]
```

## 💻 Usage

```python
!pip install -qU transformers bitsandbytes accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "paulilioaica/PhiMiX-2x2B"

# Load the tokenizer explicitly and pass it to the pipeline so the
# remote-code model is paired with the right tokenizer.
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    trust_remote_code=True,
    model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True},
)

prompt = "How many continents are there?"
formatted_prompt = f"Instruct: {prompt}\nOutput:"  # Phi-2 instruct format
outputs = pipeline(
    formatted_prompt,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_k=50,
    top_p=0.95,
)
print(outputs[0]["generated_text"])
```
Example output:

```
Instruct: How many continents are there?
Output: There are seven continents: Africa, Antarctica, Asia, Europe, North America, Australia, and South America. The total number of continents on Earth is seven, including Antarctica, which is sometimes considered part of the continent of Antarctica or as its own continent.

It's important to note that the number of continents in popular education and geography is seven, but some sources may include Antarctica as its own continent, while others include it as part of the continent of Antarctica. Regardless of the exact categorization, there are seven continents that collectively make up the Earth's landmass.

The continents can be divided into several subregions, such as islands, archipelagos, and microcontinents, which are smaller land masses surrounded by water. These subregions can be considered part of the continents or their own unique entities, depending on the context.

Each continent has its own unique geography, climate, flora, fauna, and human cultures. The continents are interconnected through various landforms, bodies of water, and global trade routes.

In summary, there are seven continents on Earth, each with its own distinct characteristics and unique contributions to the world's diversity. While the number may vary depending on the categorization of Antarctica, all seven continents together make
```
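
Recent transformers releases prefer an explicit quantization config over passing `load_in_4bit` through `model_kwargs`. As a sketch, assuming a recent transformers and bitsandbytes, the same 4-bit setup can be written as:

```python
# Sketch: explicit 4-bit quantization via BitsAndBytesConfig (assumes a
# recent transformers release; requires bitsandbytes and accelerate).
import torch
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    BitsAndBytesConfig,
    pipeline,
)

model_id = "paulilioaica/PhiMiX-2x2B"

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,  # compute in fp16, weights in 4-bit
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    quantization_config=bnb_config,
    device_map="auto",
)

generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
print(
    generator(
        "Instruct: How many continents are there?\nOutput:",
        max_new_tokens=64,
    )[0]["generated_text"]
)
```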