---
language:
- de
pipeline_tag: text-generation
library_name: transformers
tags:
- bloom
- LLM
inference: false
widget:
- text: "TODO"
---
<div style="text-align:center;width:250px;height:250px;">
<img src="https://huggingface.co/philschmid/instruct-igel-001/resolve/main/assets/igel_removedbg.png" alt="IGEL logo">
</div>
# IGEL: Instruction-tuned German Large Language Model for Text
IGEL is an LLM family developed for German. The first version of IGEL is built on top of [BigScience BLOOM](https://bigscience.huggingface.co/blog/bloom), as [adapted to German by Malte Ostendorff](https://huggingface.co/malteos/bloom-6b4-clp-german). IGEL is designed to provide accurate and reliable language understanding capabilities for a wide range of natural language understanding tasks, including sentiment analysis, language translation, and question answering.
### You can try out the model at [igel-playground]().
The IGEL family includes the instruction-tuned model `instruct-igel-001` and `chat-igel-001` (_coming soon_).
## Model Description
A LoRA-tuned version of [BLOOM-CLP German (6.4B parameters)](https://huggingface.co/malteos/bloom-6b4-clp-german) with the adapter weights merged back into the base model.
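Because the adapter weights are already merged, this checkpoint loads like a regular causal language model and no `peft` dependency is needed at inference time. For illustration only, merging a LoRA adapter into its base model with the `peft` library can be sketched as follows; the adapter path and output directory are hypothetical, and the actual training setup is not documented in this card:

```python
def merge_lora(base_id: str, adapter_path: str, out_dir: str) -> None:
    """Fold a trained LoRA adapter into its base model and save a plain checkpoint."""
    # Imported lazily: transformers and peft are heavy optional dependencies.
    from transformers import AutoModelForCausalLM
    from peft import PeftModel

    # Load the German BLOOM base model the adapter was trained against.
    base = AutoModelForCausalLM.from_pretrained(base_id)

    # Attach the LoRA adapter and fold its low-rank deltas into the base
    # weights, yielding a checkpoint with no extra adapter modules.
    model = PeftModel.from_pretrained(base, adapter_path)
    merged = model.merge_and_unload()
    merged.save_pretrained(out_dir)
```

A call such as `merge_lora("malteos/bloom-6b4-clp-german", "./lora-adapter", "./merged")` would then produce a standalone model directory loadable with `AutoModelForCausalLM` alone.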
## Training data
`instruct-igel-001` is trained on naively translated instruction datasets, without much post-processing.
### Known limitations
`instruct-igel-001` also exhibits several common deficiencies of language models, including hallucination, toxicity, and stereotypes.
For example, in the following figure, `instruct-igel-001` wrongly claims that the chancellor of Germany is Angela Merkel.
![chancellor](./assets/cancelor.png)
### Training procedure
_coming soon_
## How to use
You can test the model in the LLM playground (_coming soon_).
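Until the playground is available, the model can be loaded directly with `transformers`. The sketch below is a minimal example; note that the prompt template is a placeholder assumption, since the actual instruction format used during training is not documented in this card:

```python
def build_prompt(instruction: str) -> str:
    """Wrap a German instruction in a simple template.

    Assumption: this template is a placeholder; the real prompt format
    used to train `instruct-igel-001` is not documented here.
    """
    return f"### Anweisung:\n{instruction}\n\n### Antwort:\n"


def generate(instruction: str, max_new_tokens: int = 128) -> str:
    """Generate a completion from instruct-igel-001 for a German instruction."""
    # Imported lazily: transformers and torch are heavy dependencies.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "philschmid/instruct-igel-001"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

For example, `generate("Wer schrieb 'Faust'?")` would download the ~6.4B-parameter checkpoint and return the decoded completion, so a GPU with sufficient memory is advisable.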