HydraLM

Hydra-MoE: A new class of Open-Source Mixture of Experts
A SkunkWorks Project
Skunkworks OSS introduces Hydra-MoE, a Mixture of Experts (MoE) architecture that uses LoRA/QLoRA experts to scale and augment the performance of base language models. The central aim of this research is to turn any base language model into a lightweight, efficient MoE framework built from swappable QLoRA expert adapters, achieving performance that rivals state-of-the-art models while running on commodity/consumer hardware.
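A minimal sketch of this idea is shown below. It is not the Hydra-MoE codebase: it only illustrates, in plain PyTorch, how a frozen base projection can be augmented with a pool of LoRA "expert" adapters and a small gating network that routes each token to one expert's low-rank update. The class name HydraLoRALayer and the hyperparameters (num_experts, rank, alpha) are placeholders chosen for this example.

```python
import torch
import torch.nn as nn


class HydraLoRALayer(nn.Module):
    """Frozen base projection plus a pool of LoRA experts chosen by a top-1 gate."""

    def __init__(self, in_features: int, out_features: int,
                 num_experts: int = 4, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        # Frozen base weight, standing in for one projection of the base language model.
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad_(False)

        # One (A, B) low-rank pair per expert; these are the swappable adapters.
        # B starts at zero so every adapter is a no-op at initialization (LoRA convention).
        self.lora_A = nn.Parameter(torch.randn(num_experts, rank, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(num_experts, out_features, rank))
        self.scaling = alpha / rank

        # Lightweight token-level gate that scores experts from the hidden state.
        self.gate = nn.Linear(in_features, num_experts, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, in_features)
        base_out = self.base(x)

        # Top-1 routing: pick the highest-scoring expert for every token.
        gate_probs = self.gate(x).softmax(dim=-1)      # (batch, seq, num_experts)
        weight, expert_idx = gate_probs.max(dim=-1)    # both (batch, seq)

        # Gather the chosen expert's A and B matrices per token.
        A = self.lora_A[expert_idx]                    # (batch, seq, rank, in)
        B = self.lora_B[expert_idx]                    # (batch, seq, out, rank)

        # Low-rank update B @ (A @ x), weighted by the gate probability.
        low_rank = torch.einsum("bsor,bsr->bso", B,
                                torch.einsum("bsri,bsi->bsr", A, x))
        return base_out + weight.unsqueeze(-1) * self.scaling * low_rank


if __name__ == "__main__":
    layer = HydraLoRALayer(in_features=64, out_features=64)
    hidden = torch.randn(2, 10, 64)        # (batch=2, seq=10, features=64)
    print(layer(hidden).shape)             # torch.Size([2, 10, 64])
```

Because the base weights stay frozen and only the small A/B matrices and the gate are trained, the per-expert memory cost stays modest, which is consistent with the project's stated goal of running on commodity/consumer hardware.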


Core Team

Prateek Yadav (@Prateeky2806)
Alpay Ariyak (@AlpayAriyak)
Artem Yatsenko (@Sumo43_)
Harrison Kinsley (@Sentdex)
Yaroslav Shipilov (@TheSlavant)
And many more contributors!