arxiv:2507.14241

Promptomatix: An Automatic Prompt Optimization Framework for Large Language Models

Published on Jul 17 · Submitted by rmurthy on Jul 24

Abstract

Large Language Models (LLMs) perform best with well-crafted prompts, yet prompt engineering remains manual, inconsistent, and inaccessible to non-experts. We introduce Promptomatix, an automatic prompt optimization framework that transforms natural language task descriptions into high-quality prompts without requiring manual tuning or domain expertise. Promptomatix supports both a lightweight meta-prompt-based optimizer and a DSPy-powered compiler, with a modular design enabling future extension to more advanced frameworks. The system analyzes user intent, generates synthetic training data, selects prompting strategies, and refines prompts using cost-aware objectives. Evaluated across 5 task categories, Promptomatix achieves competitive or superior performance compared to existing libraries, while reducing prompt length and computational overhead, making prompt optimization scalable and efficient.
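
The abstract describes the optimization loop at a high level: analyze the task, generate synthetic training data, and iteratively refine the prompt against a cost-aware objective. Below is a minimal illustrative sketch of a meta-prompt-based refinement loop of that kind; the `call_llm` backend, the meta-prompt wording, the scoring function, and all names are assumptions made for illustration, not Promptomatix's actual API or the paper's implementation.

```python
# Hypothetical sketch of a meta-prompt-based prompt optimization loop.
# `call_llm` is a placeholder for any chat-completion backend (assumption).

def call_llm(prompt: str) -> str:
    """Stand-in for an LLM call; wire up your own backend here."""
    raise NotImplementedError

META_PROMPT = (
    "You are a prompt engineer. Improve the prompt below so it solves the task "
    "more reliably while staying as short as possible.\n\n"
    "Task description: {task}\n"
    "Current prompt: {prompt}\n"
    "Return only the improved prompt."
)

def score(prompt: str, synthetic_data: list, length_penalty: float = 0.001) -> float:
    """Cost-aware objective: accuracy on synthetic (input, expected) pairs
    minus a penalty proportional to prompt length."""
    correct = sum(
        expected.lower() in call_llm(f"{prompt}\n\nInput: {x}").lower()
        for x, expected in synthetic_data
    )
    return correct / len(synthetic_data) - length_penalty * len(prompt)

def optimize(task: str, seed_prompt: str, synthetic_data: list, rounds: int = 5) -> str:
    """Ask the LLM to rewrite the prompt each round; keep only improvements."""
    best_prompt = seed_prompt
    best_score = score(best_prompt, synthetic_data)
    for _ in range(rounds):
        candidate = call_llm(META_PROMPT.format(task=task, prompt=best_prompt))
        candidate_score = score(candidate, synthetic_data)
        if candidate_score > best_score:
            best_prompt, best_score = candidate, candidate_score
    return best_prompt
```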

Community

Paper author · Paper submitter

Promptomatix: An Automatic Prompt Optimization Framework for Large Language Models

Help an S.Y.B.Sc.IT student write his field study assignment. The topic is Indian Theatre History and the Modern Era.
The assignment should be 15 pages long.
(The pages I'm using are 21 × 28 cm in size.)


Models citing this paper 0

No model linking this paper

Datasets citing this paper 0

No dataset linking this paper

Spaces citing this paper 0

No Space linking this paper

Collections including this paper 9