---
license: mit
title: Freeekyyy-chatBot
sdk: streamlit
sdk_version: 1.44.1
---

# 🤖 Freeekyyy ChatBot

Freeekyyy is an over-the-top, emotional AI chatbot that FREAKS OUT (in Markdown!) on any topic you provide.
It uses LangChain + OpenRouter to generate expressive, explosive Markdown responses, perfect for dramatic, chaotic, and wildly informative outputs.

🔥 Now powered by a RAG (Retrieval-Augmented Generation) pipeline to respond using your own PDFs and documents!

Check it out live 👉 [MKCL/Freeekyyy-chatBot](https://huggingface.co/spaces/MKCL/Freeekyyy-chatBot) on Hugging Face 🤯


🧠 How It Works

- Uses LangChain's `ChatPromptTemplate` to inject emotional few-shot prompts.
- Connects to DeepSeek-R1-Zero via OpenRouter (see the connection sketch after this list).
- Uses vector search (via ChromaDB) and Hugging Face embeddings for document retrieval (RAG).
- Outputs responses in beautiful Markdown (`.md`) format.
- Works as a Streamlit app or a FastAPI backend.
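
A minimal sketch of the OpenRouter connection using `langchain-openai`; the model id and settings below are assumptions, not the Space's documented configuration:

```python
import os
from langchain_openai import ChatOpenAI

# Point the OpenAI-compatible client at OpenRouter.
# The model id is an assumption; check OpenRouter's catalogue for the exact DeepSeek-R1-Zero id.
llm = ChatOpenAI(
    model="deepseek/deepseek-r1-zero:free",
    base_url="https://openrouter.ai/api/v1",
    api_key=os.getenv("OPENROUTER_API_KEY"),
)
```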

## 🔍 Retrieval-Augmented Generation (RAG)

The chatbot now includes a smart document-processing pipeline (sketched below):

  1. Document Ingestion: Parses your uploaded PDF files.
  2. Chunking: Splits them into overlapping text chunks.
  3. Embeddings: Generates vector embeddings using BAAI/bge-small-en.
  4. Vector Store: Stores chunks in ChromaDB.
  5. Context Injection: Relevant chunks are inserted into the LLM prompt for context-aware responses!
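
A minimal sketch of that pipeline with LangChain components. The loader, chunk sizes, and persist directory are illustrative assumptions (and `PyPDFLoader` additionally requires `pypdf`):

```python
from langchain_community.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.vectorstores import Chroma

# 1. Ingest a PDF (path is illustrative)
docs = PyPDFLoader("your_document.pdf").load()

# 2. Split into overlapping chunks (sizes are assumptions)
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=200)
chunks = splitter.split_documents(docs)

# 3. Embed each chunk with BAAI/bge-small-en
embeddings = HuggingFaceEmbeddings(model_name="BAAI/bge-small-en")

# 4. Store the chunks in ChromaDB; `db` is what the retrieval query runs against later
db = Chroma.from_documents(chunks, embeddings, persist_directory="chroma_db")
```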

## 🖥️ Streamlit Integration

To display Markdown output in Streamlit:

```python
import streamlit as st

# Assuming `md_output` contains your model's response
st.markdown(md_output, unsafe_allow_html=True)
```
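
A slightly fuller page might look like this; `generate_freakout` is a hypothetical helper standing in for whatever function runs the LangChain pipeline:

```python
import streamlit as st

st.title("🤖 Freeekyyy ChatBot")
topic = st.text_input("Give Freeekyyy a topic to freak out about")

if topic:
    # `generate_freakout` is a placeholder for the function that calls the LLM.
    md_output = generate_freakout(topic)
    st.markdown(md_output, unsafe_allow_html=True)
```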

## 🚀 Installation

**Option 1: Using uv**

```bash
uv pip install -r requirements.txt
```

**Option 2: Using regular pip**

```bash
pip install -r requirements.txt
```

## 📦 Requirements

```text
langchain
langchain-community
langchain-openai
openai
chromadb
python-dotenv
huggingface_hub
sentence-transformers
streamlit
uvicorn
fastapi
```

## 🛠️ Environment Variables

Create a .env file in the root directory:

```env
OPENROUTER_API_KEY=your_openrouter_key_here
HUGGINGFACE_API_KEY=your_huggingface_key_here
```
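
These can be loaded at startup with `python-dotenv` (already in the requirements), for example:

```python
import os
from dotenv import load_dotenv

load_dotenv()  # reads the .env file from the project root

openrouter_key = os.getenv("OPENROUTER_API_KEY")
huggingface_key = os.getenv("HUGGINGFACE_API_KEY")
```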

## 🧪 Example Prompt Structure

```python
from langchain.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages([
    ("system", "You're an extremely emotional AI. Always freak out in Markdown."),
    ("user", "Topic: Volcanoes")
])
```
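
Running that prompt through the model is then a one-liner with LCEL, assuming the OpenRouter-backed `llm` sketched earlier:

```python
# Pipe the prompt into the model; `llm` is the ChatOpenAI instance configured for OpenRouter above.
chain = prompt | llm
response = chain.invoke({})  # no template variables in this example prompt
print(response.content)      # the Markdown freak-out about volcanoes
```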

## 🔗 RAG Query with Vector Search

```python
# Sample retrieval pipeline
relevant_chunks = db.similarity_search(query, k=4)
context = "\n\n".join([doc.page_content for doc in relevant_chunks])

final_prompt = f"""
You are an emotional assistant. Respond dramatically using Markdown.

Context:
{context}

Question:
{query}
"""
```

## 🧑‍💻 Want to Use It as an API?

Run your backend like this:

```bash
uvicorn main:app --reload
```
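
The README doesn't document the FastAPI routes, so the `main.py` below is a hypothetical sketch of one way the backend could expose the chatbot; the route name and `generate_freakout` helper are illustrative:

```python
# main.py (hypothetical sketch; route name and helper are illustrative)
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Freeekyyy ChatBot API")

class FreakoutRequest(BaseModel):
    topic: str

@app.post("/freakout")
def freakout(req: FreakoutRequest):
    # Reuse the same LangChain + OpenRouter (+ RAG) pipeline as the Streamlit app.
    md_output = generate_freakout(req.topic)  # hypothetical helper
    return {"markdown": md_output}
```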

## 📎 License

MIT. Go freak out and teach some AI emotions! 🤯❤️🔥