
These SAEs were trained on the per-layer residual-stream outputs of Llama 3.2 1B (base) using EleutherAI's SAE library (https://github.com/EleutherAI/sae), on a subset of FineWebText provided in lukemarks/vader-post-training.
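For reference, below is a minimal sketch of how one might extract the kind of residual-stream activations these SAEs operate on. It assumes the base model is meta-llama/Llama-3.2-1B and that a hookpoint such as model.layers.12 corresponds to the hidden state after decoder layer 12; neither is stated here, so check the training config before relying on it.

```python
# Hedged sketch: pulling residual-stream activations from the base model.
# Assumptions (not confirmed by this card): the base model is
# meta-llama/Llama-3.2-1B, and "model.layers.12" refers to the residual
# stream after decoder layer 12, i.e. hidden_states[13] when
# output_hidden_states=True.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-3.2-1B"  # assumed base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16)
model.eval()

inputs = tokenizer("The quick brown fox jumps over the lazy dog.", return_tensors="pt")
with torch.no_grad():
    out = model(**inputs, output_hidden_states=True)

layer = 12
resid = out.hidden_states[layer + 1]  # (batch, seq_len, hidden_size) residual stream after layer 12
print(resid.shape)
```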

You can see the SAE training details at https://huggingface.co/apart/llama3.2_1b_base_saes_vader/blob/main/config.json.

FVU and dead_pct metrics for each SAE run are saved under the respective layer directories; see, for example, https://huggingface.co/apart/llama3.2_1b_base_saes_vader/blob/main/model.layers.12/metrics.json.
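As a hedged sketch, the shared training config and a layer's metrics file can be fetched with huggingface_hub; the metric key names used below are assumptions, so inspect the JSON to confirm them.

```python
# Hedged sketch: downloading the training config and layer-12 metrics from this repo.
import json
from huggingface_hub import hf_hub_download

repo_id = "apart/llama3.2_1b_base_saes_vader"

config_path = hf_hub_download(repo_id=repo_id, filename="config.json")
metrics_path = hf_hub_download(repo_id=repo_id, filename="model.layers.12/metrics.json")

with open(config_path) as f:
    config = json.load(f)
with open(metrics_path) as f:
    metrics = json.load(f)

print(config)
# Key names are an assumption; the card only says FVU and dead_pct are recorded.
print(metrics.get("fvu"), metrics.get("dead_pct"))
```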
