---
title: AgenticAi
emoji: 🧠
colorFrom: blue
colorTo: purple
sdk: gradio
sdk_version: 4.18.0
app_file: app.py
pinned: false
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference

# Central AI Hub

This Hugging Face Space hosts the Central AI Hub, a Gradio app for submitting tasks to a Qwen2.5-14B (GGUF) model.

## How to Run

  1. Make sure the required dependencies are installed:

     ```bash
     pip install -r requirements.txt
     ```

  2. Run the app:

     ```bash
     python app.py
     ```

     This starts the Gradio interface, which you can then open in your browser.

  3. Optionally, provide a local path to the Qwen2.5-14B GGUF model by setting the LOCAL_MODEL_PATH environment variable, for example:

     ```bash
     LOCAL_MODEL_PATH=/path/to/your/model.gguf python app.py
     ```

     If LOCAL_MODEL_PATH is not set, the model is downloaded from the Hugging Face Hub and cached in the .cache directory (see the sketch after this list).
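
The following is a minimal sketch of how the model resolution described in step 3 can work. The `repo_id` and `filename` passed to `hf_hub_download` are illustrative placeholders, not necessarily what `app.py` uses; only `LOCAL_MODEL_PATH` and the `.cache` directory come from this README.

```python
import os

from huggingface_hub import hf_hub_download


def resolve_model_path() -> str:
    """Return a path to the Qwen2.5-14B GGUF model file.

    Prefers LOCAL_MODEL_PATH if it points to an existing file; otherwise
    downloads the model from the Hugging Face Hub into the local .cache
    directory. The repo_id and filename below are placeholders.
    """
    local_path = os.environ.get("LOCAL_MODEL_PATH")
    if local_path and os.path.isfile(local_path):
        return local_path

    return hf_hub_download(
        repo_id="Qwen/Qwen2.5-14B-Instruct-GGUF",     # hypothetical repo id
        filename="qwen2.5-14b-instruct-q4_k_m.gguf",  # hypothetical file name
        cache_dir=".cache",
    )
```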

## Usage

  1. Select a task type from the dropdown menu.
  2. Enter the task content in the text box.
  3. Enter any task requirements, separated by commas.
  4. Click the "Submit Task" button.
  5. The task status will be displayed in the output text box (a minimal sketch of such an interface follows below).
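
For reference, here is a minimal Gradio sketch of an interface like the one described above. The task types, component labels, and the `handle_task` function are illustrative assumptions; the actual components and dispatch logic live in `app.py`.

```python
import gradio as gr


def handle_task(task_type: str, content: str, requirements: str) -> str:
    """Placeholder handler: app.py would dispatch the task to the model here."""
    reqs = [r.strip() for r in requirements.split(",") if r.strip()]
    return f"Submitted '{task_type}' task with {len(reqs)} requirement(s)."


with gr.Blocks(title="Central AI Hub") as demo:
    task_type = gr.Dropdown(
        choices=["analysis", "generation", "summarization"],  # illustrative task types
        label="Task Type",
    )
    content = gr.Textbox(label="Task Content", lines=5)
    requirements = gr.Textbox(label="Task Requirements (comma-separated)")
    submit = gr.Button("Submit Task")
    status = gr.Textbox(label="Task Status")

    submit.click(handle_task, inputs=[task_type, content, requirements], outputs=status)

if __name__ == "__main__":
    demo.launch()
```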