# TallyPrimeAssistant – Distilled GPT-2 Model
This is a distilled GPT-2 conversational model fine-tuned on FAQs and navigation instructions for TallyPrime, a leading business accounting application widely used in India. The model is designed to give users quick, accurate answers about TallyPrime features such as GST, e-invoicing, and payroll.
## Model Summary
- **Teacher Model:** `gpt2-large`
- **Student Model:** `distilgpt2`
- **Distillation Method:** Knowledge distillation using Hugging Face Transformers and a custom training pipeline (a minimal sketch of the objective follows this list)
- **Training Dataset:** Internal dataset of Q&A pairs and system navigation steps from TallyPrime documentation and usage
- **Format:** `safetensors` (secure and fast)
- **Tokenizer:** Byte-Pair Encoding (BPE), same as GPT-2
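The exact training pipeline is not published with this card. A standard knowledge-distillation objective, which the setup above implies, blends a softened-logit KL term against the teacher with the ordinary next-token loss. The sketch below assumes that formulation; `temperature` and `alpha` are illustrative hyperparameters, not values from the actual run.

```python
import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM

teacher = AutoModelForCausalLM.from_pretrained("gpt2-large").eval()
student = AutoModelForCausalLM.from_pretrained("distilgpt2")

def distillation_loss(input_ids, attention_mask, temperature=2.0, alpha=0.5):
    # The teacher runs in inference mode; its softened logits are the target.
    with torch.no_grad():
        teacher_logits = teacher(input_ids, attention_mask=attention_mask).logits
    student_out = student(input_ids, attention_mask=attention_mask, labels=input_ids)
    # KL divergence between softened distributions; both models share
    # GPT-2's 50257-token vocabulary, so logits align position by position.
    kd = F.kl_div(
        F.log_softmax(student_out.logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature**2
    # Blend with the ordinary cross-entropy loss on the Q&A training data.
    return alpha * kd + (1 - alpha) * student_out.loss
```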
## Example Usage
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the fine-tuned student model and its GPT-2 BPE tokenizer.
tokenizer = AutoTokenizer.from_pretrained("Jayanthram/TallyPrimeAssistant")
model = AutoModelForCausalLM.from_pretrained("Jayanthram/TallyPrimeAssistant")

prompt = "How to enable GST in TallyPrime?"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding, capped at 60 new tokens.
output = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
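Greedy decoding (the default above) tends to be repetitive for conversational output, so sampling parameters are usually worth tuning. The values below are illustrative, not defaults shipped with the model:

```python
# Illustrative sampling settings; tune for your use case.
output = model.generate(
    **inputs,
    max_new_tokens=60,
    do_sample=True,                       # sample instead of greedy decoding
    temperature=0.7,                      # lower = more deterministic
    top_p=0.9,                            # nucleus sampling
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
```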