# DeepRetrieval-SQL-7B
## Prompt Template

```
<|im_start|>system
You are a helpful assistant. You first think about the reasoning process in the mind and then provides the user with the answer.<|im_end|>
<|im_start|>user
You are a SQL query writing expert. Your task is to write the SQL query for the user query to retrieve data from a database.
Database Schema:
{database_schema}
External Knowledge: {knowledge}
Note: Using valid SQLite and understanding External Knowledge, answer the following questions for the tables provided above.
Show your work in <think> </think> tags. Your final response must be in JSON format within <answer> </answer>. For example,
<think>
[thinking process]
</think>
<answer>
{
"sql": "SELECT ... (in one line)"
}
</answer>.
Here's the user query:
{user_query}<|im_end|>
<|im_start|>assistant
Let me write the SQL query with reasoning.
<think>
```
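The model expects this exact prompt string, including the pre-filled assistant turn that ends in `<think>`, so it is simplest to build the prompt by hand rather than through a chat template. Below is a minimal, unofficial sketch of one way to run it with Hugging Face `transformers` and parse the SQL out of the `<answer>` JSON; the repository id, example schema, question, and generation settings are illustrative assumptions, and the official usage instructions are in our GitHub repo.

```python
# Unofficial usage sketch: fill the prompt template above, generate, and extract the SQL.
import json
import re

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DeepRetrieval/DeepRetrieval-SQL-7B"  # assumed Hub repo id for this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# The prompt template from the section above, kept verbatim (special tokens included).
TEMPLATE = """<|im_start|>system
You are a helpful assistant. You first think about the reasoning process in the mind and then provides the user with the answer.<|im_end|>
<|im_start|>user
You are a SQL query writing expert. Your task is to write the SQL query for the user query to retrieve data from a database.
Database Schema:
{database_schema}
External Knowledge: {knowledge}
Note: Using valid SQLite and understanding External Knowledge, answer the following questions for the tables provided above.
Show your work in <think> </think> tags. Your final response must be in JSON format within <answer> </answer>. For example,
<think>
[thinking process]
</think>
<answer>
{
"sql": "SELECT ... (in one line)"
}
</answer>.
Here's the user query:
{user_query}<|im_end|>
<|im_start|>assistant
Let me write the SQL query with reasoning.
<think>
"""

# Fill placeholders with str.replace; str.format would trip over the literal braces
# in the JSON example embedded in the template. Schema and question are toy values.
prompt = (
    TEMPLATE.replace("{database_schema}", "CREATE TABLE singer (singer_id INT, name TEXT, age INT);")
    .replace("{knowledge}", "")  # no external knowledge for this toy question
    .replace("{user_query}", "List the names of singers older than 30.")
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=1024, do_sample=False)
completion = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)

# The completion continues the <think> block and ends with <answer>{"sql": ...}</answer>.
match = re.search(r"<answer>\s*(\{.*?\})\s*</answer>", completion, re.DOTALL)
if match:
    print(json.loads(match.group(1))["sql"])
```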
## DeepRetrieval

### Overview

DeepRetrieval is a novel approach that uses reinforcement learning (RL) to train Large Language Models (LLMs) for query generation without requiring supervised data. Instead of relying on expensive human-annotated or distilled reference queries, DeepRetrieval enables LLMs to learn through direct trial and error, using retrieval metrics as rewards.
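For intuition, the sketch below shows one way such a reward signal could be instantiated in the SQL setting: score a generated query by whether executing it returns the same rows as a reference execution, rather than by similarity to any reference query text. This is only an illustration of the outcome-based reward idea, not the exact reward design from the paper; see the paper and GitHub repository for details.

```python
# Illustrative sketch (not the exact reward used in the paper): reward a generated
# SQL query by its execution outcome on a SQLite database instead of by how closely
# it matches a reference query string.
import sqlite3


def execution_reward(pred_sql: str, gold_sql: str, db_path: str) -> float:
    """Return 1.0 if the predicted query yields the same rows as the gold query, else 0.0."""
    con = sqlite3.connect(db_path)
    try:
        pred_rows = set(con.execute(pred_sql).fetchall())
        gold_rows = set(con.execute(gold_sql).fetchall())
    except sqlite3.Error:
        return 0.0  # invalid or unexecutable SQL earns no reward
    finally:
        con.close()
    return 1.0 if pred_rows == gold_rows else 0.0
```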
### Key Features

- **No Supervision Required**: Eliminates the need for expensive human-annotated or distilled reference queries
- **RL-Based Framework**: Uses reinforcement learning to optimize query generation directly for retrieval performance
- **State-of-the-Art Performance**: Achieves remarkable results across diverse retrieval tasks
Please see our GitHub page for instructions.
## Citation

```bibtex
@article{jiang2025deepretrievalhackingrealsearch,
  title={DeepRetrieval: Hacking Real Search Engines and Retrievers with Large Language Models via Reinforcement Learning},
  author={Pengcheng Jiang and Jiacheng Lin and Lang Cao and Runchu Tian and SeongKu Kang and Zifeng Wang and Jimeng Sun and Jiawei Han},
  year={2025},
  journal={arXiv preprint arXiv:2503.00223},
  url={https://arxiv.org/abs/2503.00223}
}
```