---
library_name: transformers
license: apache-2.0
base_model: Heralax/philosophy-llm-mistral-pretrain
tags:
- generated_from_trainer
model-index:
- name: philosophy-hardcore-pretraining
  results: []
---

# Philosophy LLM

I would've trained this on Phi so I could've called it Phi-losophy if I had thought of that joke before kicking off the run. Oh well. It's trained on Mistral instead. That's a Mist opportunity right there.

This is a narrow domain-expert LLM trained on the top five philosophy books on Project Gutenberg:

- The Problems of Philosophy (Bertrand Russell)
- Beyond Good and Evil (Nietzsche)
- Thus Spake Zarathustra: A Book for All and None (Nietzsche)
- The Prince (Machiavelli)
- Second Treatise of Government (John Locke)

It's meant to be an interesting novelty, showing off training on a specific domain. I also forgot to include any generalist assistant data, so it's unlikely to be good at much besides answering philosophy questions.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 1
- seed: 42
- distributed_type: multi-GPU
- num_devices: 6
- gradient_accumulation_steps: 6
- total_train_batch_size: 72
- total_eval_batch_size: 6
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 136
- num_epochs: 6

### Framework versions

- Transformers 4.45.0.dev0
- Pytorch 2.3.1+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
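
### Reproducing the training arguments (sketch)

The `generated_from_trainer` tag indicates the Hugging Face `Trainer` produced this card, so the hyperparameters above map directly onto `transformers.TrainingArguments`. The sketch below is only that mapping, not the actual training script: the dataset, tokenizer, data collator, and multi-GPU launch setup (6 devices via `torchrun`/`accelerate launch`) are omitted, and the `output_dir` name is just illustrative.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# Effective train batch size: 2 per device x 6 devices x 6 accumulation steps = 72.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="philosophy-hardcore-pretraining",  # illustrative name
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=6,
    num_train_epochs=6,
    lr_scheduler_type="cosine",
    warmup_steps=136,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```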
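
## Example usage (sketch)

The checkpoint is a standard Mistral-architecture causal LM, so it loads with the usual `transformers` APIs. A minimal inference sketch follows; the repo ID shown is the base model from the card metadata, so swap in this model's own repo ID, and the plain-text prompt format is an assumption since the card doesn't specify one.

```python
# Minimal inference sketch. Assumes the model loads like any Mistral causal LM;
# the repo ID below is the base model from the metadata, not necessarily this checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Heralax/philosophy-llm-mistral-pretrain"  # replace with this model's repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

prompt = "What, according to Russell, is the value of philosophy?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```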