---
base_model: unsloth/gemma-3-4b-it-unsloth-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- gemma3
license: apache-2.0
language:
- en
---

# 🧠 Gemma 3 (4B) Fine-Tuned on UnoPIM Docs — by Webkul

This is a fine-tuned version of [`unsloth/gemma-3-4b-it-unsloth-bnb-4bit`](https://huggingface.co/unsloth/gemma-3-4b-it-unsloth-bnb-4bit), optimized and accelerated with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL for instruction-based text-generation tasks.

---

## 🔍 Model Summary

- **Base Model:** `unsloth/gemma-3-4b-it-unsloth-bnb-4bit`
- **Fine-Tuned By:** [Webkul](https://webkul.com)
- **License:** Apache-2.0
- **Language:** English
- **Model Type:** Instruction-tuned (4-bit quantized)
- **Training Boost:** ~2x faster training with Unsloth optimizations

---

## 📚 Fine-Tuning Dataset

This model was fine-tuned specifically on the official UnoPIM documentation and user guides available at:

👉 **[https://docs.unopim.com/](https://docs.unopim.com/)**

### Content Covered:

- Product Information Management (PIM) workflows
- Admin dashboard and module configurations
- API usage and endpoints
- User roles and access control
- Product import/export and sync logic
- Custom field and attribute setups
- Troubleshooting and common use cases

---

## 💡 Use Cases

This model is designed for:

- 🧾 **Q&A on UnoPIM documentation**
- 💬 **Chatbots for UnoPIM technical support**
- 🧠 **Contextual assistants inside dev tools**
- 🛠️ **Knowledge base automation for onboarding users**

---

## 🚀 Quick Start

You can run this model with Hugging Face's `transformers` library:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "webkul/gemma-3-4b-it-unopim-docs"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "How can I import products in bulk using UnoPIM?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=300)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

---

## 📄 License

This model is distributed under the Apache 2.0 License. See the LICENSE file for more information.
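Because the base model is instruction-tuned, wrapping the question in the tokenizer's chat template (rather than passing a raw prompt string) usually produces better-formatted answers. A minimal sketch, assuming the repo id from the Quick Start above and that the fine-tune kept Gemma 3's chat template:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "webkul/gemma-3-4b-it-unopim-docs"  # repo id assumed from the Quick Start above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Build the prompt through the model's chat template instead of a raw string.
messages = [
    {"role": "user", "content": "How do I set up a custom attribute in UnoPIM?"}
]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,  # append the assistant-turn marker so the model answers
    return_tensors="pt",
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=300)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Slicing `outputs[0]` past the prompt length keeps the printed answer free of the echoed question.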