---
language:
- en
license: apache-2.0
tags:
- prompt-optimization
- british-english
- education
- gpt-oss
- ollama
datasets:
- roneymatusp/british-educational-prompts
widget:
- text: "como fazer uma aula boa?"
  example_title: "Portuguese Input"
- text: "teach math to kids"
  example_title: "American English Input"
- text: "avaliação para matemática"
  example_title: "Mixed Language Input"
---

# PauleanPrompt - British Educational Prompt Optimizer

## Model Description

PauleanPrompt is a specialized prompt optimization model that takes input in any language (Portuguese, American English, and so on) and converts it into an optimized British English educational prompt.

## Features

- ✅ Accepts input in ANY language
- ✅ Always outputs British English prompts
- ✅ Focuses on British educational levels (Pre-Prep, Prep, Junior, IGCSE, IBDP)
- ✅ Uses British spelling and terminology
- ✅ Aligned with National Curriculum
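
As a quick illustration of the British-spelling guarantee, a downstream check can flag common American terms in a generated prompt. This is a hypothetical sketch, not part of the model or its tooling, and the word list is a small illustrative sample:

```python
# Minimal post-check: flag common American terms in a generated prompt.
# The mapping below is an illustrative sample, not an exhaustive dictionary.
AMERICAN_TO_BRITISH = {
    "math": "maths",
    "color": "colour",
    "grade": "year group",
    "optimize": "optimise",
}

def flag_american_terms(prompt: str) -> list[str]:
    """Return any American terms found in `prompt`, lower-cased."""
    words = prompt.lower().split()
    return [term for term in AMERICAN_TO_BRITISH if term in words]

# A compliant British English prompt triggers no flags.
print(flag_american_terms("Design a Year 2 maths lesson plan"))  # → []
```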

## Usage

### Using with Ollama Turbo

```python
from ollama import Client

# SYSTEM_PROMPT should hold this model's system prompt (see the repository files)
client = Client(
    host="https://ollama.com",
    headers={'Authorization': 'YOUR_API_KEY'}
)

response = client.chat(
    model="gpt-oss:20b",
    messages=[
        {'role': 'system', 'content': SYSTEM_PROMPT},
        {'role': 'user', 'content': "como fazer uma aula boa?"}
    ]
)
print(response['message']['content'])
```

### Example Outputs

| Input | Output |
|-------|--------|
| "como fazer uma aula boa?" | "Create a comprehensive lesson plan for a Year 6 class..." |
| "teach math to kids" | "Design a Year 2 mathematics lesson plan..." |
| "avaliação para matemática" | "Develop a formative assessment for Year 8 pupils..." |
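
The usage pattern above can be wrapped in a small helper so each input in the table maps to one call. This is a sketch under stated assumptions: `SYSTEM_PROMPT` is a placeholder for the system prompt shipped with this model, and the function names here are illustrative, not a published API:

```python
# Placeholder — substitute the real PauleanPrompt system prompt from this repository
SYSTEM_PROMPT = "You are PauleanPrompt..."

def build_messages(user_text: str) -> list[dict]:
    """Assemble the chat payload: system prompt first, raw user input second."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_text},
    ]

def optimise(client, user_text: str) -> str:
    """Send any-language input through an Ollama-style client and
    return the optimised British English prompt."""
    response = client.chat(model="gpt-oss:20b", messages=build_messages(user_text))
    return response["message"]["content"]
```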

## Training Details

- **Base Model**: GPT-OSS:20b
- **Training Method**: LoRA fine-tuning
- **Dataset**: 8,086 British educational examples
- **Training Date**: August 2025

## Limitations

- This is a prompt optimizer, NOT a chatbot
- Always outputs prompts, not answers
- Optimized for educational context only

## Citation

```bibtex
@misc{pauleanprompt2025,
  author = {Roney Matus},
  title = {PauleanPrompt: British Educational Prompt Optimizer},
  year = {2025},
  publisher = {HuggingFace},
  url = {https://huggingface.co/roneymatusp/PauleanPrompt}
}
```