# T5 Small Intent-Slot Model
This is a fine-tuned T5 model designed for Intent Detection and Slot Filling — a core task in natural language understanding (NLU) for chatbots, virtual assistants, and conversational AI.
## What does this model do?
Imagine you’re teaching a smart assistant to understand user requests like:
- “Book a hotel in London for 3 nights.”
- “Find me an Italian restaurant nearby.”
- “What’s the weather tomorrow in Paris?”
This model reads the input sentence and simultaneously figures out:
- The user's intent (e.g., booking, searching)
- The slots (key details like location, date, type)
It outputs a structured sequence that identifies these elements, so your app can respond intelligently.
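Once you have the generated sequence, turning it into something your app can act on is a small parsing step. The sketch below assumes a hypothetical output format like `intent: book_hotel | location: London | nights: 3`; the actual format depends on how the model was fine-tuned, so adapt the delimiters to what the model really emits.

```python
def parse_model_output(output: str) -> dict:
    """Parse a generated intent/slot sequence into a structured dict.

    Assumes a (hypothetical) pipe-delimited format such as:
        "intent: book_hotel | location: London | nights: 3"
    """
    result = {"intent": None, "slots": {}}
    for field in output.split("|"):
        key, _, value = field.partition(":")
        key, value = key.strip(), value.strip()
        if key == "intent":
            result["intent"] = value
        elif key and value:
            result["slots"][key] = value
    return result

parsed = parse_model_output("intent: book_hotel | location: London | nights: 3")
# parsed == {"intent": "book_hotel",
#            "slots": {"location": "London", "nights": "3"}}
```

Keeping the parser separate from the generation call makes it easy to adjust when you inspect the model's real output format.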
## Model Details
- Based on the T5 small architecture (6 encoder and 6 decoder layers, hidden size 512)
- Trained for conditional generation of intents and slots from text
- Uses SentencePiece tokenizer with custom added tokens
- Model weights stored as safetensors for efficiency and safety
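A minimal loading and inference sketch with the `transformers` library is shown below. The repository id and the generation settings (`max_new_tokens`) are assumptions; the SentencePiece tokenizer (`spiece.model`) and safetensors weights listed above load transparently through `from_pretrained`.

```python
def extract_intent_and_slots(
    text: str,
    model_dir: str = "mohamedhoussem45/t5-small-intent-slot",
) -> str:
    """Run the fine-tuned T5 on one utterance and return the raw
    generated intent/slot sequence.

    transformers is imported inside the function so the sketch can be
    read without it installed; the first call downloads the checkpoint
    (or pass a local directory as model_dir).
    """
    from transformers import T5ForConditionalGeneration, T5Tokenizer

    tokenizer = T5Tokenizer.from_pretrained(model_dir)
    model = T5ForConditionalGeneration.from_pretrained(model_dir)
    inputs = tokenizer(text, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Usage (requires network or a cached checkpoint):
# print(extract_intent_and_slots("Book a hotel in London for 3 nights."))
```

Because the custom tokens in `added_tokens.json` are part of the saved tokenizer, no extra setup is needed beyond `from_pretrained`.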
## Files in this repository
| File | Description |
|---|---|
| `config.json` | Model architecture and parameters |
| `generation_config.json` | Text generation settings |
| `model.safetensors` | Model weights |
| `tokenizer_config.json` | Tokenizer settings |
| `spiece.model` | SentencePiece tokenizer model |
| `added_tokens.json` | Custom tokens added during training |
| `special_tokens_map.json` | Mapping of special tokens |
| `.gitattributes` | Git LFS config for large files |
| `README.md` | This documentation |
## Base model

This model (`mohamedhoussem45/t5-small-intent-slot`) is fine-tuned from `google/flan-t5-small`.