
maximuspowers/bert-philosophy-classifier (Fill-Mask, 0.1B parameters)
A study benchmarking major LLMs' internalized philosophies
Note A school-of-philosophy text classifier, trained with a combined binary cross-entropy and contrastive loss objective.
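A minimal sketch of the kind of objective this note describes, assuming a mean-pooled BERT encoder, a 17-way multi-label head, and a supervised contrastive term on the pooled embeddings. The pooling choice, temperature, hyperparameters, and toy batch are assumptions for illustration, not the repository's actual training code.

```python
# Sketch (assumed setup): BCE over multi-hot school labels plus a supervised
# contrastive term computed on pooled sentence embeddings.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

NUM_SCHOOLS = 17  # schools of philosophy (multi-label)

class PhilosophyClassifier(torch.nn.Module):
    def __init__(self, base="bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(base)
        self.head = torch.nn.Linear(self.encoder.config.hidden_size, NUM_SCHOOLS)

    def forward(self, **inputs):
        out = self.encoder(**inputs).last_hidden_state      # (B, L, H)
        mask = inputs["attention_mask"].unsqueeze(-1)        # (B, L, 1)
        pooled = (out * mask).sum(1) / mask.sum(1)           # mean pooling (assumption)
        return self.head(pooled), pooled

def contrastive_loss(emb, labels, temperature=0.1):
    """Supervised contrastive term: examples sharing at least one school
    label count as positives; everything else is pushed apart."""
    emb = F.normalize(emb, dim=1)
    sim = emb @ emb.T / temperature
    pos = (labels @ labels.T > 0).float()
    pos.fill_diagonal_(0)
    sim = sim - torch.eye(len(emb), device=emb.device) * 1e9  # mask self-similarity
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    return -(pos * log_prob).sum(1).div(pos.sum(1).clamp(min=1)).mean()

# One training step on a toy batch.
tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = PhilosophyClassifier()
texts = ["The unexamined life is not worth living.",
         "Existence precedes essence."]
labels = torch.zeros(2, NUM_SCHOOLS)
labels[0, 0] = 1.0   # toy multi-hot labels; shared school 0 makes a positive pair
labels[1, 0] = 1.0
labels[1, 3] = 1.0
batch = tok(texts, padding=True, return_tensors="pt")
logits, pooled = model(**batch)
loss = F.binary_cross_entropy_with_logits(logits, labels) + contrastive_loss(pooled, labels)
loss.backward()
```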
Note bert-base-uncased further trained on the Stanford Encyclopedia of Philosophy with masked language modeling, to domain-adapt it to philosophical language.
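A minimal sketch of this domain-adaptation step, using the standard Hugging Face masked-language-modeling setup. The placeholder corpus, output directory, and hyperparameters are assumptions; the actual SEP preprocessing is not shown on the card.

```python
# Sketch (assumed setup): continue MLM pretraining of bert-base-uncased on
# encyclopedia text so the encoder adapts to philosophical language.
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)
from datasets import Dataset

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Placeholder corpus; in practice this would be Stanford Encyclopedia of Philosophy text.
corpus = Dataset.from_dict({"text": [
    "Epistemology is the study of knowledge and justified belief.",
    "Utilitarianism evaluates actions by their consequences.",
]})
tokenized = corpus.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"])

# 15% random masking, the standard BERT MLM recipe.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-philosophy-mlm",   # assumed path
                           per_device_train_batch_size=16,
                           num_train_epochs=3),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```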
Note Short summaries of academic papers from PhilPapers, generated by llama-3.1-8B and labeled against 17 schools of philosophy (multi-label).
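A small illustrative sketch of how such multi-label records could be turned into multi-hot targets for classifier training. The school names and record schema are hypothetical, not the dataset's actual fields.

```python
# Sketch (assumed schema): each summary may carry several of the 17 school
# labels, encoded as a multi-hot vector suitable for BCE training.
import torch

SCHOOLS = ["stoicism", "utilitarianism", "existentialism", "empiricism"]  # 4 of 17 shown
SCHOOL_TO_IDX = {name: i for i, name in enumerate(SCHOOLS)}

def encode_labels(school_names, num_classes=len(SCHOOLS)):
    """Turn a list of school names into a multi-hot float vector."""
    target = torch.zeros(num_classes)
    for name in school_names:
        target[SCHOOL_TO_IDX[name]] = 1.0
    return target

record = {"summary": "An argument that moral worth depends on aggregate well-being.",
          "schools": ["utilitarianism", "empiricism"]}   # hypothetical record
print(encode_labels(record["schools"]))  # tensor([0., 1., 0., 1.])
```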
Note Synthetic text snippets generated by llama-3.1-8B, labeled with various philosophical viewpoints.