# CRANE AI Model

CRANE (Compressed Routing and Neural Embedding) is a hybrid artificial intelligence system.
## Model Description

CRANE AI is a hybrid AI system that combines four specialized modules:

- **CodeModule**: code generation (DeepSeek-Coder 1.3B)
- **ChatModule**: general conversation (Qwen2.5 1.5B)
- **ReasonModule**: reasoning (Phi-3 Mini)
- **FastModule**: fast responses (TinyLlama 1.1B)
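The module list above implies a routing step that maps each query to one specialized module. A minimal keyword-based sketch of such a router follows; the function name and heuristics are illustrative assumptions, not the actual CRANE routing logic (which the card does not detail):

```python
def route_query(query: str) -> str:
    """Pick a module name for a query using simple keyword heuristics.

    Illustrative only -- the real CRANE router may be a learned component.
    """
    q = query.lower()
    if any(kw in q for kw in ("code", "function", "python", "bug")):
        return "CodeModule"    # DeepSeek-Coder 1.3B
    if any(kw in q for kw in ("why", "prove", "logic", "reason")):
        return "ReasonModule"  # Phi-3 Mini
    if len(q.split()) <= 5:
        return "FastModule"    # TinyLlama 1.1B handles short queries
    return "ChatModule"        # Qwen2.5 1.5B as the general default
```

Any real router would also need a fallback path when the chosen module fails to load or times out.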
## Quick Start

```python
import asyncio

from crane_ai import CRANEAISystem

async def main():
    # Initialize the system
    system = CRANEAISystem()
    await system.initialize()

    # Run a query
    response = await system.process_query("Write a calculator in Python", {})
    print(response["response"])

asyncio.run(main())
```
## Installation

```bash
pip install -r requirements.txt
python setup.py install
```
## Usage

### API Server

```bash
python main.py
```

### Programmatic Usage

```python
import asyncio

from crane_ai import CRANEAISystem

async def main():
    system = CRANEAISystem()
    await system.initialize()
    result = await system.process_query("Your query here", {})

asyncio.run(main())
```
## Fine-tuning

CRANE AI can be fine-tuned with LoRA/QLoRA:

```bash
python training/fine_tune.py --module code_module --data your_data.jsonl
```
## Model Architecture

```
Input Query → Router → Best Module → Response
                ↓
[CodeModule, ChatModule, ReasonModule, FastModule]
                ↓
Token Capsule Layer → Memory Management
```
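The diagram ends in a "Token Capsule Layer → Memory Management" stage. A hedged sketch of what bounded conversation memory at that stage could look like; the class name, whitespace token counting, and oldest-first eviction are all assumptions, not CRANE internals:

```python
from collections import deque

class TokenCapsuleMemory:
    """Bounded conversation memory: evicts the oldest capsules once a
    token budget is exceeded. Illustrative, not the actual CRANE layer."""

    def __init__(self, max_tokens: int = 4096):
        self.max_tokens = max_tokens
        self.capsules = deque()  # (text, token_count) pairs
        self.total = 0

    def add(self, text: str) -> None:
        n = len(text.split())  # crude whitespace token estimate
        self.capsules.append((text, n))
        self.total += n
        while self.total > self.max_tokens:  # drop oldest capsules first
            _, old = self.capsules.popleft()
            self.total -= old

    def context(self) -> str:
        return "\n".join(t for t, _ in self.capsules)
```

Keeping memory bounded this way is what lets the 4096-token context limit below hold across long conversations.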
## Training Data

The base models were trained on open-source datasets:

- Code: GitHub repositories
- Chat: conversation datasets
- Reasoning: logic puzzles
- Fast: Q&A pairs
## Limitations

- GPU memory: ~4 GB required
- Response time: 1-5 seconds
- Context length: 4096 tokens max
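Because of the 4096-token cap, oversized inputs need trimming before they reach a module. A minimal sketch, using whitespace tokens as a stand-in for the modules' real tokenizers:

```python
def clip_to_context(text: str, max_tokens: int = 4096) -> str:
    """Keep only the last max_tokens whitespace-delimited tokens.

    A real deployment would count tokens with the target module's own
    tokenizer; this approximation only illustrates the limit.
    """
    tokens = text.split()
    if len(tokens) <= max_tokens:
        return text
    return " ".join(tokens[-max_tokens:])  # keep the most recent tokens
```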
## License

MIT License. Commercial use allowed.
## Citation

```bibtex
@misc{crane-ai-2024,
  title={CRANE AI: Hybrid Multi-Model System},
  author={Veteroner},
  year={2024},
  url={https://huggingface.co/veteroner/Novaai}
}
```