|
--- |
|
language: |
|
- en |
|
base_model: |
|
- Qwen/Qwen2.5-7B |
|
license: apache-2.0 |
|
--- |
|
|
|
# ScholarCopilot-v1 Model |
|
|
|
ScholarCopilot-v1 is the foundation model of [Scholar Copilot](https://arxiv.org/abs/2504.00824). Scholar Copilot improves the academic writing process by seamlessly integrating automatic text completion and intelligent citation suggestions into a cohesive, human-in-the-loop AI-driven pipeline. Designed to enhance productivity and creativity, it provides researchers with high-quality text generation and precise citation recommendations powered by iterative and context-aware Retrieval-Augmented Generation (RAG). |
|
|
|
The current version of Scholar Copilot is built on a state-of-the-art 7-billion-parameter large language model (LLM) trained on the full-paper arXiv corpus. This unified model for retrieval and generation makes context-sensitive decisions about when to cite, what to cite, and how to generate coherent content grounded in the referenced papers.
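
Since ScholarCopilot-v1 is released as a causal language model on the Hugging Face Hub (with Qwen/Qwen2.5-7B as base), a standard Transformers loading-and-generation call should work for plain text completion. The sketch below is a minimal example under that assumption; the citation-token handling and retrieval loop used by the official demo are not reproduced here, so treat the prompt and decoding settings as illustrative only.

```python
# Minimal sketch: load ScholarCopilot-v1 and continue a paragraph.
# The full Scholar Copilot pipeline (citation triggering + retrieval) lives in
# the official demo/repo and is not shown here.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TIGER-Lab/ScholarCopilot-v1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Example: ask the model to continue an introduction paragraph.
prompt = (
    "Retrieval-augmented generation (RAG) has become a standard recipe for "
    "grounding large language models in external knowledge. "
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)

# Print only the newly generated continuation.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```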
|
|
|
Links: [paper](https://arxiv.org/abs/2504.00824) | [model](https://huggingface.co/TIGER-Lab/ScholarCopilot-v1) | [demo](https://huggingface.co/spaces/TIGER-Lab/ScholarCopilot) |
|
|
|
|
|
## Key Features
|
|
|
- **Next-3-Sentence Suggestions**: Facilitates writing by predicting the next three sentences, with automatic retrieval and citation of relevant reference papers.
|
- **Citation Suggestions on Demand**: Provides precise, contextually appropriate paper citations whenever needed (see the retrieval sketch after this list).
|
- **Full Section Auto-Completion**: Assists in brainstorming and drafting comprehensive paper content and structure.
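
For the on-demand citation feature, the paper describes a single model that serves both retrieval and generation. The sketch below only illustrates that unified-retrieval idea with a generic last-token embedding and cosine similarity; the actual Scholar Copilot pipeline uses its own learned citation representations, so the `AutoModel` backbone, pooling choice, and candidate texts here are assumptions for illustration.

```python
# Illustrative sketch (not the official retrieval interface): embed a writing
# context and candidate papers with the model backbone, then rank by cosine similarity.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

model_id = "TIGER-Lab/ScholarCopilot-v1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
encoder = AutoModel.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

def embed(texts):
    """Encode texts into L2-normalized vectors from the last non-padding token's hidden state."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt").to(encoder.device)
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state              # (batch, seq_len, dim)
    last = batch["attention_mask"].sum(dim=1) - 1                # index of final real token
    emb = hidden[torch.arange(hidden.size(0), device=hidden.device), last]
    return F.normalize(emb.float(), dim=-1)

# Score candidate papers against the writing context where a citation is needed.
query = embed(["Dense retrieval with a single unified encoder has been applied to citation recommendation"])
candidates = embed([
    "Paper A: title and abstract text ...",
    "Paper B: title and abstract text ...",
])
print(query @ candidates.T)  # cosine similarities, shape (1, 2)
```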
|
|
|
The current version of ScholarCopilot primarily focuses on the introduction and related work sections of academic papers. We will support full-paper writing in future releases. |
|
|
|
## Citation |
|
|
|
Please cite our paper with:
|
```bibtex
@article{wang2024scholarcopilot,
  title   = {ScholarCopilot: Training Large Language Models for Academic Writing with Accurate Citations},
  author  = {Wang, Yubo and Ma, Xueguang and Nie, Ping and Zeng, Huaye and Lyu, Zhiheng and Zhang, Yuxuan and Schneider, Benjamin and Lu, Yi and Yue, Xiang and Chen, Wenhu},
  journal = {arXiv preprint arXiv:2504.00824},
  year    = {2025}
}
```