# MambaSan-370m-instruct 🐍
MambaSan-instruct is the first Japanese chat language model based on a state-space model architecture (Mamba).
The model is based on Albert Gu and Tri Dao's work *Mamba: Linear-Time Sequence Modeling with Selective State Spaces* ([paper](https://arxiv.org/abs/2312.00752)) as well as their model implementation. This work was also inspired by heavenq's English mamba-chat implementation.
MambaSan-370m-instruct is based on MambaSan-370m and was fine-tuned on 31.7k samples from the SkelterLabsInc/JaQuAD dataset. To learn more, you can:
- Take a look at the model on Hugging Face 🤗
- Talk to MambaSan-370m-instruct on Google Colab
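If you prefer to try the model locally instead of in the Colab notebook, a minimal loading sketch is shown below. It is an assumption on my part that the weights load through `mamba_ssm`'s `MambaLMHeadModel` (as in the mamba-chat project) and that a Hugging Face tokenizer ships with the repository; the exact tokenizer and prompt template used for fine-tuning may differ, and the selective-scan kernels in `mamba_ssm` require a CUDA GPU.

```python
# Hypothetical usage sketch -- not taken from the model card. It assumes a
# mamba-chat-style setup: weights loaded with mamba_ssm's MambaLMHeadModel and a
# tokenizer hosted in the Hugging Face repository.
import torch
from transformers import AutoTokenizer
from mamba_ssm.models.mixer_seq_simple import MambaLMHeadModel

device = "cuda"  # mamba_ssm's selective-scan kernels are CUDA-only
model_id = "loiccabannes/MambaSan-370m-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = MambaLMHeadModel.from_pretrained(model_id, device=device, dtype=torch.float16)

# Example Japanese prompt: "What is the highest mountain in Japan?"
prompt = "日本で一番高い山は何ですか？"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(device)

out = model.generate(
    input_ids=input_ids,
    max_length=200,
    eos_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```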
The code used for pretraining and fine-tuning will soon be published on my GitHub: https://github.com/lcabannes

## Citation
```bibtex
@misc{lcabannes2024MambaSan-370m-instruct,
  title = {MambaSan-370m-instruct},
  author = {Loïc Cabannes},
  year = {2024},
  howpublished = {HuggingFace},
  url = {https://huggingface.co/loiccabannes/MambaSan-370m-instruct/}
}
```