NCU SmartLLM: Mistral 7B Fine-tuned on NorthCap University Curriculum

NCU SmartLLM is a domain-specialized version of Mistral-7B-Instruct, fine-tuned on real academic material from The NorthCap University (NCU). It is built to assist with college-level instruction, syllabus summarization, technical definitions, and Q&A across engineering and computer science disciplines.


Features

  • Fine-tuned on 1200+ course entries from NCU's official syllabus
  • Based on Mistral-7B-Instruct, known for concise and instruction-following responses
  • Trained with LoRA (PEFT) and 4-bit quantization via bitsandbytes
  • Combined data: structured course PDFs plus scraped university site content (see the data-preparation sketch after this list)
  • Optimized for academic question answering and domain-specific tasks
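
The combined dataset pairs syllabus entries with records in the same instruction/response format used at inference time. The sketch below is one hypothetical way such pairs could be built; the build_pairs helper, the output file ncu_course_pairs.jsonl, and the sample course description are assumptions, not artifacts shipped with this model.

import json

# Hypothetical course entry; the FOCP-I code and credit structure come from the
# example prompts below, while the description text is purely illustrative.
course_entries = [
    {
        "code": "CSL106",
        "title": "FOCP-I",
        "credits": "2-0-4",
        "description": "Covers C programming fundamentals, compilers and linkers, and number systems.",
    },
]

def build_pairs(entries):
    # Turn each syllabus entry into an instruction/response record matching
    # the "### Instruction / ### Response" template used for fine-tuning.
    pairs = []
    for e in entries:
        instruction = f"What is the credit structure and content for {e['title']}?"
        response = (f"{e['title']} ({e['code']}) is structured as ({e['credits']}) credits. "
                    f"{e['description']}")
        pairs.append({"instruction": instruction, "response": response})
    return pairs

with open("ncu_course_pairs.jsonl", "w", encoding="utf-8") as f:
    for pair in build_pairs(course_entries):
        f.write(json.dumps(pair) + "\n")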

Usage

from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="pranav2711/ncu-smartllm",
    tokenizer="pranav2711/ncu-smartllm",
    device_map="auto"
)

prompt = """### Instruction:
What is the role of the compiler and linker in the C language?

### Response:
"""

# do_sample=True is required for the temperature setting to take effect
output = pipe(prompt, max_new_tokens=200, do_sample=True, temperature=0.7)
print(output[0]["generated_text"])
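
If GPU memory is tight, the same checkpoint can be loaded in 4-bit with bitsandbytes before building the pipeline. This is a minimal sketch assuming bitsandbytes and accelerate are installed; the NF4 settings and sampling parameters are illustrative, not values pinned by this repository.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig, pipeline

# 4-bit quantization config (requires bitsandbytes)
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained("pranav2711/ncu-smartllm")
model = AutoModelForCausalLM.from_pretrained(
    "pranav2711/ncu-smartllm",
    quantization_config=bnb_config,
    device_map="auto",
)

pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)

prompt = "### Instruction:\nWhat is the role of the compiler and linker in the C language?\n\n### Response:\n"
output = pipe(prompt, max_new_tokens=200, do_sample=True, temperature=0.7)
print(output[0]["generated_text"])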

Training Details

  • Base model: mistralai/Mistral-7B-Instruct-v0.1
  • Method: parameter-efficient fine-tuning with LoRA via PEFT (see the training sketch after this list)
  • Quantization: 4-bit via bitsandbytes for memory-efficient training on Colab Pro
  • Frameworks: transformers, bitsandbytes, peft, accelerate
  • Data sources: structured course PDFs and scraped NCU website content (1200+ course entries)
  • Training time: ~3.5 hours on an A100 via Google Colab Pro
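
For reference, below is a rough QLoRA-style sketch of how such a setup can be assembled with peft and transformers. The LoRA rank, target modules, dataset path (ncu_course_pairs.jsonl), and training arguments are illustrative assumptions, not the exact values used to produce this checkpoint.

import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training
from transformers import (AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base = "mistralai/Mistral-7B-Instruct-v0.1"

# Load the base model in 4-bit and prepare it for LoRA training
bnb_config = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_quant_type="nf4",
                                bnb_4bit_compute_dtype=torch.float16)
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base, quantization_config=bnb_config, device_map="auto")
model = prepare_model_for_kbit_training(model)

# Attach LoRA adapters to the attention projections (r/alpha are illustrative)
lora = LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05, bias="none", task_type="CAUSAL_LM",
                  target_modules=["q_proj", "k_proj", "v_proj", "o_proj"])
model = get_peft_model(model, lora)

# Hypothetical JSONL of instruction/response pairs built from the syllabus
data = load_dataset("json", data_files="ncu_course_pairs.jsonl")["train"]

def tokenize(example):
    text = f"### Instruction:\n{example['instruction']}\n\n### Response:\n{example['response']}"
    return tokenizer(text, truncation=True, max_length=512)

tokenized = data.map(tokenize, remove_columns=data.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="ncu-smartllm-lora", per_device_train_batch_size=2,
                           gradient_accumulation_steps=8, num_train_epochs=3,
                           learning_rate=2e-4, fp16=True, logging_steps=20),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("ncu-smartllm-lora")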


Example Prompts

### Instruction:
Explain the difference between a compiler and an interpreter.

### Response:
A compiler translates an entire program into machine code before execution...

### Instruction:
What is the credit structure and content for FOCP-I?

### Response:
FOCP-I (CSL106) is structured as (2-0-4) credits...

### Instruction:
List types of number systems in computing and their conversions.

### Response:
There are four primary number systems: Binary, Decimal, Octal, and Hexadecimal...
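
The model expects the instruction/response template shown in these examples. A small helper like the one below keeps prompts consistent; the format_prompt name is just an illustration, not part of the released code.

def format_prompt(instruction: str) -> str:
    # Wrap a question in the "### Instruction / ### Response" template used during fine-tuning
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

prompt = format_prompt("Explain the difference between a compiler and an interpreter.")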

License

This model is licensed under the Apache 2.0 License.


Author

Built and fine-tuned by @pranav2711. Project: SmartLLM for NCU, empowering education with custom LLMs.


Citation

If you use this model in academic work or research:

@misc{ncu2024smartllm,
  title     = {NCU SmartLLM: Fine-tuned Mistral 7B on Academic Syllabus},
  author    = {Pranav},
  year      = {2024},
  publisher = {Hugging Face},
  howpublished = {\url{https://huggingface.co/pranav2711/ncu-smartllm}}
}