---
title: tinyllama_chat_gradio
app_file: app.py
sdk: gradio
sdk_version: 4.19.2
---

# TinyLlama Chatbot Implementation with Gradio

We offer an easy way to interact with TinyLlama. This guide explains how to set up a local Gradio demo for a chatbot using TinyLlama. (A demo is also available on the Hugging Face Space TinyLlama/tinyllama_chatbot or on Colab.)

## Requirements

- Python >= 3.8
- PyTorch >= 2.0
- Transformers >= 4.34.0
- Gradio >= 4.13.0

## Installation

pip install -r requirements.txt
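
The repository ships its own requirements.txt; for reference, a minimal file consistent with the version constraints listed above would look roughly like the following. The entries here are illustrative and may not match the versions actually pinned in the repository:

```text
# Hypothetical requirements.txt matching the constraints above;
# the shipped file may pin different or additional packages.
torch>=2.0
transformers>=4.34.0
gradio>=4.13.0
```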

## Usage

python TinyLlama/chat_gradio/app.py
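
For reference, a chatbot app along these lines can be written in a few dozen lines of Python. The sketch below is not the bundled app.py; it assumes the TinyLlama/TinyLlama-1.1B-Chat-v1.0 chat checkpoint, a Transformers text-generation pipeline, and Gradio's ChatInterface:

```python
# Minimal sketch of a TinyLlama + Gradio chatbot; the bundled app.py may differ.
import torch
import gradio as gr
from transformers import pipeline

# Assumption: the TinyLlama/TinyLlama-1.1B-Chat-v1.0 chat checkpoint is used.
pipe = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

def respond(message, history):
    # history is a list of (user, assistant) pairs provided by gr.ChatInterface.
    messages = []
    for user_msg, bot_msg in history:
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": bot_msg})
    messages.append({"role": "user", "content": message})

    # Format the conversation with the model's chat template.
    prompt = pipe.tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    outputs = pipe(
        prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95
    )
    # The pipeline returns the prompt plus the completion; keep only the new text.
    return outputs[0]["generated_text"][len(prompt):]

if __name__ == "__main__":
    gr.ChatInterface(respond, title="TinyLlama Chat").launch()
```

gr.ChatInterface handles the chat UI and history tracking, and launch() prints the local URL mentioned below.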

- After running the script, open the local URL printed in your terminal in your web browser. (If the app runs on a remote server, use SSH local port forwarding: `ssh -L [local port]:localhost:[remote port] [username]@[server address]`; see the example after this list.)
- Interact with the chatbot by typing questions or commands.
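
For example, Gradio serves on port 7860 by default, so forwarding that port from a remote server (the username and host below are placeholders) would look like:

```bash
# Forward local port 7860 to the remote machine's port 7860 (Gradio's default).
ssh -L 7860:localhost:7860 user@remote-server
# Then open http://localhost:7860 in your local browser.
```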

Note: The chatbot's performance may vary depending on your system's hardware. Ensure your system meets the requirements above for the best experience.