---
title: P2p Llm
emoji: 🚀
colorFrom: green
colorTo: purple
sdk: docker
sdk_version: 5.13.0
pinned: false
license: agpl-3.0
short_description: peer to peer LLM inference
app_file: app.py
app_port: 8080
models:
  - deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B
  - unsloth/DeepSeek-R1-Distill-Qwen-1.5B-GGUF
---

# P2P LLM Inference Platform

A peer-to-peer platform for distributed LLM inference.

## Current Features

- **Trusted peers management**
  - Load peers from a configuration file
  - Load peers from the database
  - Combine both sources for peer discovery
- **Request forwarding**
  - Distribute requests across available peers
  - Handle both regular and SSE (server-sent event) responses
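The "combine both sources" behaviour above can be pictured as a URL-keyed merge of the two peer lists. A minimal sketch in Go, assuming an illustrative `Peer` type (the field names and merge rule are not the project's actual schema):

```go
package main

import "fmt"

// Peer describes a trusted peer. The fields mirror the configuration
// section below (URL plus public key) but are illustrative only.
type Peer struct {
	URL       string
	PublicKey string
}

// mergePeers combines peers loaded from the config file with peers
// loaded from the database, de-duplicating by URL (config entries win
// because they are scanned first).
func mergePeers(fromConfig, fromDB []Peer) []Peer {
	seen := make(map[string]bool)
	var out []Peer
	for _, p := range append(fromConfig, fromDB...) {
		if seen[p.URL] {
			continue
		}
		seen[p.URL] = true
		out = append(out, p)
	}
	return out
}

func main() {
	cfg := []Peer{{URL: "https://peer-a.example", PublicKey: "key-a"}}
	db := []Peer{
		{URL: "https://peer-a.example", PublicKey: "key-a"},
		{URL: "https://peer-b.example", PublicKey: "key-b"},
	}
	// The duplicate entry for peer-a is dropped; two peers remain.
	fmt.Println(len(mergePeers(cfg, db)))
}
```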

## Planned Features

- **Trust scores**
  - Track peer reliability and performance
  - Weight request distribution by score
- **Peer advertising**
  - Automatic peer discovery
  - Peer-to-peer network formation
- **Enhanced security**
  - Peer authentication
  - Request validation

## Getting Started

1. Clone the repository.
2. Configure `files/config.json` with your settings.
3. Build and run with Docker:

   ```bash
   docker-compose up --build
   ```
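The repository's own compose file is authoritative; for orientation only, a minimal compose setup for this Space might look like the following. The service name and volume mount are assumptions; only the port comes from `app_port` in the metadata above.

```yaml
# Hypothetical docker-compose.yml; service name and volume are assumptions.
services:
  p2p-llm:
    build: .
    ports:
      - "8080:8080"        # matches app_port in the Space metadata
    volumes:
      - ./files:/app/files # assumed mount point for files/config.json
```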

## Configuration

Edit `files/config.json` to specify:

- Database path
- Target URL
- Trusted peers (URLs and public keys)
- Trusted peers file path
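As a sketch, a config covering the four settings above might look like this. Every key name and value here is hypothetical; the actual keys are defined by the project's config loader.

```json
{
  "db_path": "files/peers.db",
  "target_url": "http://localhost:8081",
  "trusted_peers": [
    {
      "url": "https://peer-a.example",
      "public_key": "<base64-encoded key>"
    }
  ],
  "trusted_peers_file": "files/peers.json"
}
```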

## Development

```bash
# Run locally
go run main.go

# Run tests
go test ./...
```

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference