# best_bvv_unfrozen_zh
📄 [Paper: Emergent Semantics Beyond Token Embeddings: Transformer LMs with Frozen Visual Unicode Representations](https://arxiv.org/abs/2507.04886) - 📄 [Paper: Growing Transformers: Modular Composition and Layer-wise Expansion on a Frozen Substrate](https://arxiv.org/abs/2507.07129) - 💻 Code
## Model summary
**best_bvv_unfrozen_zh** is a 500M-parameter causal language model (LM) trained as an open proof of concept for the "frozen embeddings" paradigm. This version uses fully trainable token embeddings, the standard setup, and serves as a baseline for direct comparison with the corresponding frozen-embedding model `Bochkov/best_bvv_zh`.
- **Architecture:** Transformer with rotary positional encoding
- **Vocabulary:** custom Unicode-based, 131,072 tokens
- **Embedding:** unfrozen (trainable, the classic setup); see the sketch after this list for a quick check
- **Pretraining data:** 9B tokens (Wikipedia, SQuAD 2.0, TriviaQA, Natural Questions, etc.) with 10% SFT data (instruction/factual Q&A) mixed in
- **Purpose:** compare the learning capacity and generalization of fully trainable vs. frozen-embedding LMs on small data
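
A minimal sketch, assuming the checkpoint exposes the standard `transformers` embedding accessors (the expected first dimension follows from the vocabulary size above; `hidden_dim` is a placeholder), to verify the trainable-embedding setup:

```python
from transformers import AutoModelForCausalLM

# Load on CPU just to inspect the embedding table (no generation needed)
model = AutoModelForCausalLM.from_pretrained(
    'Bochkov/best_bvv_unfrozen_zh', trust_remote_code=True
)

emb = model.get_input_embeddings()   # token embedding matrix
print(emb.weight.shape)              # expected: torch.Size([131072, hidden_dim])
print(emb.weight.requires_grad)      # True for this trainable-embedding model
print(f"~{sum(p.numel() for p in model.parameters()) / 1e6:.0f}M parameters")
```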
## Example Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

device = 'cuda' if torch.cuda.is_available() else 'cpu'

# trust_remote_code is required: the checkpoint ships custom model code
model = AutoModelForCausalLM.from_pretrained(
    'Bochkov/best_bvv_unfrozen_zh', trust_remote_code=True
).to(device)
tokenizer = AutoTokenizer.from_pretrained('Bochkov/best_bvv_unfrozen_zh')

# Encode a prompt and sample a 100-token continuation
inputs = tokenizer("Hello, world! ", return_tensors="pt").to(device)
outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    temperature=0.8,
    top_k=50,
    top_p=0.95,
    do_sample=True,
)
print(tokenizer.decode(outputs[0]))
```
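
For the frozen-vs-trainable comparison this baseline exists for, a hedged sketch: it assumes both checkpoints accept a `labels` argument like standard `transformers` causal LMs, and any numbers it prints are illustrative, not results from the papers.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

text = "Hello, world!"
for name in ('Bochkov/best_bvv_unfrozen_zh', 'Bochkov/best_bvv_zh'):
    tok = AutoTokenizer.from_pretrained(name)
    lm = AutoModelForCausalLM.from_pretrained(name, trust_remote_code=True).eval()
    enc = tok(text, return_tensors='pt')
    with torch.no_grad():
        # Standard causal-LM loss: labels are shifted internally, averaged per token
        loss = lm(**enc, labels=enc['input_ids']).loss
    print(f"{name}: loss={loss.item():.3f}, ppl={loss.exp().item():.1f}")
```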
## Citation
If you find this work helpful or inspiring, please consider citing the associated papers:
```bibtex
@misc{bochkov2025emergentsemanticstokenembeddings,
  title={Emergent Semantics Beyond Token Embeddings: Transformer LMs with Frozen Visual Unicode Representations},
  author={A. Bochkov},
  year={2025},
  eprint={2507.04886},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2507.04886},
}

@misc{bochkov2025growingtransformersmodularcomposition,
  title={Growing Transformers: Modular Composition and Layer-wise Expansion on a Frozen Substrate},
  author={A. Bochkov},
  year={2025},
  eprint={2507.07129},
  archivePrefix={arXiv},
  primaryClass={cs.LG},
  url={https://arxiv.org/abs/2507.07129},
}
```