---
license: apache-2.0
tags:
- moe
- frankenmoe
- merge
- mergekit
- lazymergekit
- cognitivecomputations/dolphin-2_6-phi-2
- rhysjones/phi-2-orange
base_model:
- cognitivecomputations/dolphin-2_6-phi-2
- rhysjones/phi-2-orange
---

# PhiMiX-2x2B



<p align="center">
<img src="https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11201acc-4089-416d-921b-cbd71fbf8ddb_1024x1024.jpeg" width="500" class="center"/>
</p>


PhiMiX-2x2B is a Mixture of Experts (MoE) made with the following models using mergekit:
* [cognitivecomputations/dolphin-2_6-phi-2](https://huggingface.co/cognitivecomputations/dolphin-2_6-phi-2)
* [rhysjones/phi-2-orange](https://huggingface.co/rhysjones/phi-2-orange)


## ©️ Credits
* [mlabonne's phixtral](https://huggingface.co/mlabonne/phixtral-4x2_8) for the PhiConfig and inference code.
* [mergekit](https://github.com/cg123/mergekit), which I tweaked mainly by adding support for the PhiConfig (you can find the PhiConfig [here](https://github.com/cg123/mergekit/blob/508348ae34be17ea0a95d0a288a6e34491a2558a/mergekit/architecture.py#L289)) in the `mixtral_moe.py` script from the `mixtral` branch.

## ⏱️ Benchmarks

|                             Model                              |AGIEval|GPT4All|TruthfulQA|Bigbench|Average|
|----------------------------------------------------------------|------:|------:|---------:|-------:|------:|
|[**PhiMiX-2x2B**](https://huggingface.co/paulilioaica/PhiMiX-2x2B)|  33.34|  **71.75**|     49.25|   **37.62**|  **47.99**|
|[phixtral-4x2_8](https://huggingface.co/mlabonne/phixtral-4x2_8)|  33.91|  70.44|     48.78|   37.68|   47.7|
|[phixtral-2x2_8](https://huggingface.co/mlabonne/phixtral-2x2_8)|   34.1|  70.44|     48.78|   37.82|  47.78|
|[_phi-2-orange_](https://huggingface.co/rhysjones/phi-2-orange)|  _33.37_|  _71.33_|  _49.87_ |  _37.3_ | _47.97_ |
|[_dolphin-2_6-phi-2_](https://huggingface.co/cognitivecomputations/dolphin-2_6-phi-2)|  _33.12_|  _69.85_|     _47.39_|    _37.2_|  _46.89_|

__Bold__ highlights this merge in the list, _italics_ highlight the two base models used in the merge, and __bold__ values mark the cells where the merge exceeds the performance of both base models.

## 🧩 Configuration

```yaml
base_model: rhysjones/phi-2-orange
gate_mode: cheap_embed
dtype: float16
experts:
  - source_model: cognitivecomputations/dolphin-2_6-phi-2
    positive_prompts: ["research, logic, math, science"]
  - source_model: rhysjones/phi-2-orange
    positive_prompts: ["programming, reasoning"]
```
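For completeness, a config like the one above is what mergekit's MoE script consumes. A minimal, notebook-style sketch of running it, assuming the YAML is saved as `config.yaml` (at the time this required the tweaked `mixtral`-branch script described in the replication section below; recent mergekit releases expose the same functionality via the `mergekit-moe` entry point):

```python
# Minimal sketch -- the exact entry point and flags depend on the mergekit
# version/branch you install; see the replication section below for the
# author's tweaked-script workflow.
!pip install -q git+https://github.com/cg123/mergekit
!mergekit-moe config.yaml PhiMiX-2x2B
```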

## 💻 Usage

```python
!pip install -qU transformers bitsandbytes accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "paulilioaica/PhiMiX-2x2B"

tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    trust_remote_code=True,  # required: the MoE wrapper uses custom modeling code
    model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True},  # 4-bit via bitsandbytes
)

prompt = "How many continents are there?"
formatted_prompt = f"Instruct: {prompt}\nOutput:"  # phi-2 instruct prompt format
outputs = pipeline(formatted_prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```

```terminal
Instruct: How many continents are there?
Output: There are seven continents: Africa, Antarctica, Asia, Europe, North America, Australia, and South America. The total number of continents on Earth is seven, including Antarctica, which is sometimes considered part of the continent of Antarctica or as its own continent.

It's important to note that the number of continents in popular education and geography is seven, but some sources may include Antarctica as its own continent, while others include it as part of the continent of Antarctica. Regardless of the exact categorization, there are seven continents that collectively make up the Earth's landmass.

The continents can be divided into several subregions, such as islands, archipelagos, and microcontinents, which are smaller land masses surrounded by water. These subregions can be considered part of the continents or their own unique entities, depending on the context.

Each continent has its own unique geography, climate, flora, fauna, and human cultures. The continents are interconnected through various landforms, bodies of water, and global trade routes.

In summary, there are seven continents on Earth, each with its own distinct characteristics and unique contributions to the world's diversity. While the number may vary depending on the categorization of Antarctica, all seven continents together make
```
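If you prefer not to use the `pipeline` wrapper, a roughly equivalent direct load looks like the following (dtype, device placement, and sampling parameters are illustrative choices, not requirements):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "paulilioaica/PhiMiX-2x2B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
    trust_remote_code=True,  # required: the MoE wrapper ships custom modeling code
)

prompt = "Instruct: How many continents are there?\nOutput:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
    pad_token_id=tokenizer.eos_token_id,  # avoids a missing-pad-token warning
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```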


## ♻️ Replicate this repo

**Beware:** this only works with 2 Phi models as experts; for more, you may have to tinker with the layer-naming logic.

**After** all the file modifications and the merge run, you need to replace `config.json` with the one from **this repo**, and then add `modeling_phi.py` and `configuration_phi.py` from **this repo** to your repo (steps 3 and 4 below).

Steps (a notebook-style sketch of the whole workflow follows this list):

1. Modify `mixtral_moe.py` (located at `/content/mergekit/mergekit/scripts/mixtral_moe.py` in a Colab checkout) to point to your HF repo:

[***mixtral_moe.py***](https://github.com/paulilioaica/Phi-MOE/blob/main/mixtral_moe.py)


2. Modify `architecture.py` at `/content/mergekit/mergekit/architecture.py` (you can take this from the commit linked in the Credits section):

[***architecture.py***](https://github.com/paulilioaica/Phi-MOE/blob/main/architecture.py)


3. Replace `config.json` with the one from **this repo**.
4. Add `modeling_phi.py` and `configuration_phi.py` from **this repo** to your repo.
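
Below is a notebook-style sketch of the whole workflow. The branch name, paths, and raw-file URLs are assumptions derived from the links in this card, so double-check them (and the script's CLI with `--help`) before running:

```python
# Steps 1-2: clone mergekit's mixtral branch and drop in the modified scripts
# linked above (URLs are derived from the Phi-MOE repo links and may change).
!git clone -b mixtral https://github.com/cg123/mergekit
!wget -q -O mergekit/mergekit/scripts/mixtral_moe.py https://raw.githubusercontent.com/paulilioaica/Phi-MOE/main/mixtral_moe.py
!wget -q -O mergekit/mergekit/architecture.py https://raw.githubusercontent.com/paulilioaica/Phi-MOE/main/architecture.py
!pip install -qe ./mergekit

# Run the merge with the YAML from the Configuration section (saved as config.yaml).
!mergekit-moe config.yaml PhiMiX-2x2B

# Steps 3-4: overwrite config.json and add the custom modeling/configuration
# files from this repo so that loading with trust_remote_code=True works.
!wget -q -O PhiMiX-2x2B/config.json https://huggingface.co/paulilioaica/PhiMiX-2x2B/raw/main/config.json
!wget -q -O PhiMiX-2x2B/modeling_phi.py https://huggingface.co/paulilioaica/PhiMiX-2x2B/raw/main/modeling_phi.py
!wget -q -O PhiMiX-2x2B/configuration_phi.py https://huggingface.co/paulilioaica/PhiMiX-2x2B/raw/main/configuration_phi.py
```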