---
title: AgenticAi
emoji: 🧠
colorFrom: blue
colorTo: purple
sdk: gradio
sdk_version: "4.18.0"
app_file: app.py
pinned: false
---
Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
# Central AI Hub

This is a Hugging Face Space for the Central AI Hub.

## How to Run

1. Make sure you have the required dependencies installed:

   ```bash
   pip install -r requirements.txt
   ```

2. Start the Space:

   ```bash
   python app.py
   ```

   This launches the Gradio interface, which you can open in your browser.
3. Optionally, point the app at a local copy of the Qwen2.5-14B GGUF model by setting the `LOCAL_MODEL_PATH` environment variable:

   ```bash
   LOCAL_MODEL_PATH=/path/to/your/model.gguf python app.py
   ```

   If `LOCAL_MODEL_PATH` is not set, the model is downloaded from the Hugging Face Hub and cached in the `.cache` directory.
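The path-resolution logic described above can be sketched roughly as follows. This is a minimal illustration, not the actual code in `app.py`; the function name `resolve_model_path` and the commented-out repo id and filename are assumptions:

```python
import os

def resolve_model_path(env=None):
    """Return a local model path if LOCAL_MODEL_PATH is set, else None.

    When None is returned, the caller would fall back to downloading the
    GGUF file from the Hugging Face Hub into the .cache directory, e.g.
    via huggingface_hub.hf_hub_download (repo id and filename below are
    hypothetical placeholders):

        # from huggingface_hub import hf_hub_download
        # return hf_hub_download(repo_id="Qwen/Qwen2.5-14B-Instruct-GGUF",
        #                        filename="model.gguf", cache_dir=".cache")
    """
    env = os.environ if env is None else env
    return env.get("LOCAL_MODEL_PATH") or None
```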
## Usage

1. Select a task type from the dropdown menu.
2. Enter the task content in the text box.
3. Enter any task requirements, separated by commas.
4. Click the "Submit Task" button.
5. The task status is displayed in the output text box.
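The handler behind those steps can be sketched as a plain function that a Gradio `Interface` would wrap. This is a hedged illustration only; the function name `submit_task` and the status-message format are assumptions, not taken from `app.py`:

```python
def submit_task(task_type: str, content: str, requirements: str) -> str:
    """Hypothetical sketch of the Submit Task handler.

    - task_type: value chosen from the dropdown
    - content: text entered in the task box
    - requirements: comma-separated requirement list
    Returns a status string for the output text box.
    """
    if not content.strip():
        return "Error: task content is empty."
    # Split the comma-separated requirements, dropping blanks.
    reqs = [r.strip() for r in requirements.split(",") if r.strip()]
    return f"Task queued: type={task_type}, requirements={len(reqs)}"
```

In `app.py`, a function like this would be wired to the UI with something along the lines of `gr.Interface(fn=submit_task, inputs=[...], outputs="text")`.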