
LLäMmlein2Vec 120M

LLäMmlein2Vec 120M is a German encoder language model derived from our German decoder-only model LLäMmlein 120M via LLM2Vec.

We provide three transformed models.

Find more details in our preprint!

Usage

You can use LLäMmlein2Vec with the llm2vec library.

import torch
from llm2vec import LLM2Vec

model_id = "LSX-UniWue/LLaMmlein2Vec_120M"

# Load the transformed encoder on GPU if available, in bfloat16 precision
l2v = LLM2Vec.from_pretrained(
    model_id,
    device_map="cuda" if torch.cuda.is_available() else "cpu",
    torch_dtype=torch.bfloat16,
)
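
The resulting object can then be used as a sentence encoder. Below is a minimal, illustrative sketch; the example sentences and the cosine-similarity comparison are our own additions, while encode follows the llm2vec API:

import torch.nn.functional as F

# Illustrative German sentences (not part of the model card)
sentences = [
    "Das Wetter in Würzburg ist heute sonnig.",
    "In Würzburg scheint heute die Sonne.",
]

# encode() returns one embedding vector per input sentence
embeddings = l2v.encode(sentences).float()

# Cosine similarity between the two sentence embeddings
similarity = F.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(similarity.item())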

License

We release the LLäMmlein2Vec models under a research-only RAIL-M license. See license.md for details.
