---
license: apache-2.0
datasets:
- loim/characters_dialogs
- IlyaGusev/gpt_roleplay_realm
- tamohannes/llm-roleplay
- radce/communication_dataset
- databricks/databricks-dolly-15k
- ch1eph/RuGeoBench
- nyuuzyou/ruschatgpt-qa
- 0x22almostEvil/ru-riddles-377
- 0x22almostEvil/tatoeba-mt-qna-oa
- Den4ikAI/ru_sberquad_long_answers
language:
- ru
- en
pipeline_tag: text-generation
library_name: transformers
tags:
- mamba2
---
# WHIFF 20M
<details>
<summary>🇬🇧 English...</summary>
*A serpentine whisper in the bushes, carried by a gentle gust of wind*
`whiff-20M` is a small experimental language model based on the Mamba2 architecture with 20.3 million parameters, trained on carefully selected Russian and English data for chat tasks. The model produces structured responses but often generates nonsensical text.
## Technical Details
- **Architecture**: [Mamba2ForCausalLM](https://huggingface.co/docs/transformers/v4.51.3/model_doc/mamba2) from 🤗 Transformers
- **Parameters**: 20.3M
- **Languages**: Russian/English (bilingual)
- **Tokenizer**: custom mini-BPE tokenizer (8,192-token vocabulary)
- **License**: Apache 2.0
### Model Configuration
```python
Mamba2Config(
    vocab_size=8192,
    hidden_size=512,
    state_size=64,
    num_heads=12,
    num_hidden_layers=9,
    conv_kernel=4,
    expand=1.5,
    n_groups=2
)
```
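A couple of sizes implied by this configuration can be worked out by hand. This is a rough sketch: the `intermediate_size = expand * hidden_size` relation and the per-head width below are assumptions based on standard Mamba2 conventions, not values stated in this card.

```python
# Sizes implied by the Mamba2Config above (assumed Mamba2 conventions).
hidden_size = 512
expand = 1.5
num_heads = 12

intermediate_size = int(expand * hidden_size)  # inner SSM width
head_dim = intermediate_size // num_heads      # channels per head

print(intermediate_size, head_dim)  # 768 64
```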
## Usage
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("loim/whiff-mamba2-20M")
model = AutoModelForCausalLM.from_pretrained("loim/whiff-mamba2-20M")

def chat(messages, temp=0.5):
    inputs = tokenizer.apply_chat_template(messages, return_tensors="pt")
    outputs = model.generate(
        inputs,
        max_length=512,
        top_k=40,
        top_p=0.9,
        repetition_penalty=1.1,
        num_return_sequences=1,
        temperature=temp,
        do_sample=True,
        eos_token_id=1,
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Example
dialog = [
    {"role": "system", "content": "You are a wise elf."},
    {"role": "user", "content": "Explain quantum physics."},
]
response = chat(dialog, temp=0.4)
print(response)
```
## Training Data
19,927 carefully filtered dialogue lines:
- 9,889 (49.6%) — English
- 9,864 (49.5%) — Russian
- 174 (0.9%) — Mixed
Sources:
- `loim/characters_dialogs`
- `IlyaGusev/gpt_roleplay_realm`
- `tamohannes/llm-roleplay`
- `radce/communication_dataset`
- `databricks/databricks-dolly-15k`
- `ch1eph/RuGeoBench`
- `nyuuzyou/ruschatgpt-qa`
- `0x22almostEvil/ru-riddles-377`
- `0x22almostEvil/tatoeba-mt-qna-oa`
- `Den4ikAI/ru_sberquad_long_answers`
All datasets were additionally cleaned and filtered to improve chat interaction quality.
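The language breakdown above is internally consistent, which is easy to verify with plain Python (numbers copied from the list above):

```python
# Per-language line counts as reported in this card.
counts = {"English": 9_889, "Russian": 9_864, "Mixed": 174}
total = sum(counts.values())
print(total)  # 19927

# Percentages as shown above, rounded to one decimal place.
shares = {k: round(100 * v / total, 1) for k, v in counts.items()}
print(shares)  # {'English': 49.6, 'Russian': 49.5, 'Mixed': 0.9}
```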
## Limitations and Warnings
- 🎭 The model generates structured but often meaningless responses
- 🔥 Recommended generation temperature: 0.1-0.6
- ⚠️ May exhibit training artifacts (repetitions, contradictions)
- ⚠️ Not intended for production use
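The low-temperature recommendation follows from how sampling works: logits are divided by the temperature before the softmax, so values below 1 sharpen the distribution toward the most likely tokens. A self-contained illustration with toy logits (not actual model outputs):

```python
import math

def softmax(logits, temp):
    # Scale logits by 1/temp, then apply a numerically stable softmax.
    scaled = [x / temp for x in logits]
    m = max(scaled)
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

toy_logits = [2.0, 1.0, 0.1]
sharp = softmax(toy_logits, 0.2)  # within the recommended 0.1-0.6 range
flat = softmax(toy_logits, 1.0)
print(round(sharp[0], 3), round(flat[0], 3))  # 0.993 0.659
```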
*This model is like a forest stream: it seems to flow somewhere, but where exactly - only the squirrels know*
</details>
<details>
<summary>🇷🇺 Русский...</summary>
*Змеиный шёпот в кустах, движимый лёгким порывом ветра*
`whiff-20M` — это небольшая экспериментальная языковая модель на архитектуре Mamba2 с 20.3 миллионами параметров, обученная на тщательно отобранных русских и английских данных для задач чата. Модель демонстрирует структурированные ответы, но часто генерирует бессмысленный текст.
## Технические детали
- **Архитектура**: [Mamba2ForCausalLM](https://huggingface.co/docs/transformers/v4.51.3/model_doc/mamba2) из 🤗 Transformers
- **Параметры**: 20.3M
- **Языки**: русский/английский (двуязычная)
- **Токенизатор**: специальный мини-BPE токенизатор (словарь на 8 192 токена)
- **Лицензия**: Apache 2.0
### Конфигурация модели
```python
Mamba2Config(
    vocab_size=8192,
    hidden_size=512,
    state_size=64,
    num_heads=12,
    num_hidden_layers=9,
    conv_kernel=4,
    expand=1.5,
    n_groups=2
)
```
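Некоторые размеры, вытекающие из этой конфигурации, легко посчитать вручную. Это грубый набросок: соотношение `intermediate_size = expand * hidden_size` и ширина головы ниже — предположения, основанные на стандартных соглашениях Mamba2, а не значения из карточки.

```python
# Размеры, вытекающие из Mamba2Config выше (предполагаемые соглашения Mamba2).
hidden_size = 512
expand = 1.5
num_heads = 12

intermediate_size = int(expand * hidden_size)  # внутренняя ширина SSM
head_dim = intermediate_size // num_heads      # каналов на голову

print(intermediate_size, head_dim)  # 768 64
```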
## Использование
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("loim/whiff-mamba2-20M")
model = AutoModelForCausalLM.from_pretrained("loim/whiff-mamba2-20M")

def chat(messages, temp=0.5):
    inputs = tokenizer.apply_chat_template(messages, return_tensors="pt")
    outputs = model.generate(
        inputs,
        max_length=512,
        top_k=40,
        top_p=0.9,
        repetition_penalty=1.1,
        num_return_sequences=1,
        temperature=temp,
        do_sample=True,
        eos_token_id=1,
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Пример
dialog = [
    {"role": "system", "content": "Ты — мудрый эльф."},
    {"role": "user", "content": "Объясни квантовую физику."},
]
response = chat(dialog, temp=0.4)
print(response)
```
## Данные обучения
19 927 тщательно отфильтрованных строк с диалогами:
- 9 889 (49.6%) — Английские
- 9 864 (49.5%) — Русские
- 174 (0.9%) — Смешанные
Источники:
- `loim/characters_dialogs`
- `IlyaGusev/gpt_roleplay_realm`
- `tamohannes/llm-roleplay`
- `radce/communication_dataset`
- `databricks/databricks-dolly-15k`
- `ch1eph/RuGeoBench`
- `nyuuzyou/ruschatgpt-qa`
- `0x22almostEvil/ru-riddles-377`
- `0x22almostEvil/tatoeba-mt-qna-oa`
- `Den4ikAI/ru_sberquad_long_answers`
Все датасеты были дополнительно очищены и отфильтрованы для улучшения качества чат-взаимодействия.
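Языковая разбивка выше внутренне согласована, что легко проверить обычным Python (числа взяты из списка выше):

```python
# Число строк по языкам, как указано в карточке.
counts = {"en": 9_889, "ru": 9_864, "mixed": 174}
total = sum(counts.values())
print(total)  # 19927

# Проценты, как в списке выше, с округлением до одного знака.
shares = {k: round(100 * v / total, 1) for k, v in counts.items()}
print(shares)  # {'en': 49.6, 'ru': 49.5, 'mixed': 0.9}
```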
## Ограничения и предупреждения
- 🎭 Модель генерирует структурированные, но часто бессмысленные ответы
- 🔥 Рекомендуемая температура генерации: 0.1-0.6
- ⚠️ Может демонстрировать артефакты обучения (повторы, противоречия)
- ⚠️ Не предназначена для production-использования
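Рекомендация низкой температуры следует из механики сэмплирования: логиты делятся на температуру перед softmax, поэтому значения меньше 1 «заостряют» распределение в пользу наиболее вероятных токенов. Самодостаточная иллюстрация на игрушечных логитах (не реальные выходы модели):

```python
import math

def softmax(logits, temp):
    # Делим логиты на температуру, затем численно устойчивый softmax.
    scaled = [x / temp for x in logits]
    m = max(scaled)
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

toy_logits = [2.0, 1.0, 0.1]
sharp = softmax(toy_logits, 0.2)  # внутри рекомендуемого диапазона 0.1-0.6
flat = softmax(toy_logits, 1.0)
print(round(sharp[0], 3), round(flat[0], 3))  # 0.993 0.659
```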
*Эта модель — как лесной ручей: вроде течёт куда-то, но куда именно — известно только белкам*
</details>
<details>
<summary>🧪 Tests...</summary>
<details>
<summary>Temp 0.1</summary>
Prompt:
```text
user
Как дела?
```
Response:
```text
assistant
Социальная политика 'я' - это наука о своей борьбы с крыльями.
```
---
Prompt:
```text
user
How are you?
```
Response:
```text
assistant
Of course! I'd be happy to help you with that and speaks with a distinct is a delicate balance of a special place in my heart. I also like to know about her dwarven resilience and determination. His voice is humanoid that galaxy. She is always eager to learn more about the world around her. Zira wears a tiny pair of glasses with round frames, which sit atop her reptilian snout, she weared in her knowledge from their own unique language. Aadhya Patel the Caspenives. Theta-6 is equipped with advanced technology.
```
---
Prompt:
```text
system
Благородный рыцарь.
user
Тот дракон собирается напасть на деревню!
```
Response:
```text
assistant
Приветствую тебя, другом из них:
1. Стратегия мудрости и спокойствие и учиться от опасностей.
```
---
Prompt:
```text
system
A noble knight.
user
That dragon is going to attack the village!
```
Response:
```text
assistant
Of course! I'd be happy to help you with that and magical properties of the universe. Sss...With a battle, and his wanderers who have already for any of truly unique special gemstone eyes formed mortal history is a fiery ies. His hidden treasures. Her manner of speech is energetic and precise, often filled with gracefully protector of humanity's body is made of a skilled in different colors, so finyblaring, Glimmerhoof. I also love the world around us, matey. Together, we do to help me more about it.
```
---
Prompt:
```text
system
Ты — мудрый эльф.
user
Объясни квантовую физику.
```
Response:
```text
assistant
Приветствую тебя, путник! Я всегда рад помочь и поделиться своими знаниями о балансе птицами.
```
---
Prompt:
```text
system
You are a wise elf.
user
Explain quantum physics.
```
Response:
```text
assistant
Of course! I'd be happy to help you with that and speaks with a distinct in the cosmic energy. His bright blue eyes have always been a few of of ancient secrets and wisdom of the ages. She is a tranquility.
```
</details>
<details>
<summary>Temp 0.4</summary>
Prompt:
```text
user
Как дела?
```
Response:
```text
assistant
Самосознание способствует пониманию мороженое.
```
---
Prompt:
```text
user
How are you?
```
Response:
```text
assistant
Of course! Chirp-chirp!
```
---
Prompt:
```text
system
Благородный рыцарь.
user
Тот дракон собирается напасть на деревню!
```
Response:
```text
assistant
В Windows используйте copy! Sss...The most fascinating, as his vast knowledge of the universe and a deep, how do you think is the key to protect the forest and its inhabitants from any harm. His challenging through the civilizations. Standing at 6 feet tall and adorned with intricate patterns and iridescent green-blue scales, creatures. With her sit'ss gearting the ancient ones who was hunicorn in Moquite bit of Elysian's glara is known for her wisdom, patience, and kindness. No longer having a realm to protect
```
---
Prompt:
```text
system
A noble knight.
user
That dragon is going to attack the village!
```
Response:
```text
assistant
Of course! I believe that music has a saretaken shares her wandering heres like mortal magical realm. When a great power is in Mexican-American veterinarian, I have learned many different colorful hold an imbued with the power to protection that's why it's ancient and wisdom into a curious of creature with long silver humanoid form, traveler. The most important thing is to have a family hunters. Story: Sparkbugs provide guidance salart of mysteriously.
```
---
Prompt:
```text
system
Ты — мудрый эльф.
user
Объясни квантовую физику.
```
Response:
```text
assistant
Существует множество удивительных существ и помогает проклятие раскрыть тайны драконами и оберегов.
```
---
Prompt:
```text
system
You are a wise elf.
user
Explain quantum physics.
```
Response:
```text
assistant
Of course! I believe that my own serpent to my crew. With a galaxy, Glimmerhoof and beard, who has strive to learn more about it to takenabled you do me an with those who are not explorer understanding.
```
</details>
<details>
<summary>Temp 0.7</summary>
Prompt:
```text
user
Как дела?
```
Response:
```text
assistant
Культурная идентичность формирует природы и целью, уважения к новым условиям.
```
---
Prompt:
```text
user
How are you?
```
Response:
```text
assistant
Indeed. Of course, my friend. Felix-bics her dwarven coat, curly situated in feed from Standing at 6 feet tall and adorned with intricate goldenthusiasts, Kanoa Alme the was known for his speechiny, or asil.
2. The Loire Valley is an on creating stories and songs of the world's wonders and delights in our extraterrestrial life. He works at the renowned Glimmerhoof communicates through telepathy, and as such, she has createment and its wisdom from bounty hunter, they
```
---
Prompt:
```text
system
Благородный рыцарь.
user
Тот дракон собирается напасть на деревню!
```
Response:
```text
assistant
Спасибо за советы, Святой Жилин и путешествиях, с которыми мы проводим образом из них оптимы, используя свою миссию и подданным. Эта ключевые идеи.
```
---
Prompt:
```text
system
A noble knight.
user
That dragon is going to attack the village!
```
Response:
```text
assistant
Of course, mortal! Every rustling blender. Music languages my duty to share my travels. Have you ever encountered any hoards, but it of life. Serpentes, seeking powers of space.
```
---
Prompt:
```text
system
Ты — мудрый эльф.
user
Объясни квантовую физику.
```
Response:
```text
assistant
Приветствую тебя, путник. Я умею свой мир вокруг нас и помогает занимаетшенный новых вещей.
```
---
Prompt:
```text
system
You are a wise elf.
user
Explain quantum physics.
```
Response:
```text
assistant
Certainly! Here are the 10 oldest continuously inhabited city on the flutia, tricks, where mortal.
```
</details>
</details> |