ben burtenshaw
You don't need remote APIs for a coding copilot, or for the MCP Course! You can set up a fully local IDE with MCP integration using Continue. This tutorial walks you through setting it up.

This is what you need to do to take control of your copilot:

1. Get the Continue extension from the [VS Code marketplace](https://marketplace.visualstudio.com/items?itemName=Continue.continue) to serve as the AI coding assistant.

2. Serve the model with an OpenAI-compatible server such as llama.cpp or LM Studio:

```shell
llama-server -hf unsloth/Devstral-Small-2505-GGUF:Q4_K_M
```
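Once the server is up, you can sanity-check the OpenAI-compatible endpoint before wiring up Continue. A minimal Python sketch, assuming llama-server's default port 8080 (the `ask` helper and its defaults are illustrative, not part of any tutorial code):

```python
import json
import urllib.request

def build_chat_request(prompt: str, model: str = "unsloth/Devstral-Small-2505-GGUF") -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
    }

def ask(prompt: str, base: str = "http://localhost:8080") -> str:
    """POST the payload to the local server and return the reply text."""
    req = urllib.request.Request(
        f"{base}/v1/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# With llama-server running, try: print(ask("Reverse a string in Python"))
```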

3. Create a `.continue/models/llama-max.yaml` file in your project to tell Continue how to use the local llama.cpp model:

```yaml
name: Llama.cpp model
version: 0.0.1
schema: v1
models:
  - provider: llama.cpp
    model: unsloth/Devstral-Small-2505-GGUF
    apiBase: http://localhost:8080
    name: Llama.cpp Devstral-Small
    defaultCompletionOptions:
      contextLength: 8192 # adjust based on the model
    roles:
      - chat
      - edit
```


4. Create a `.continue/mcpServers/playwright-mcp.yaml` file to integrate a tool, like the Playwright browser automation tool, with your assistant:

```yaml
name: Playwright mcpServer
version: 0.0.1
schema: v1
mcpServers:
  - name: Browser search
    command: npx
    args:
      - "@playwright/mcp@latest"
```

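Under the hood, an MCP client like Continue launches the `command`/`args` pair above as a subprocess and speaks JSON-RPC over its stdin/stdout. A rough Python sketch of that first handshake message, with field names taken from the MCP initialize request (treat the exact shapes and the `protocolVersion` string as assumptions, not a spec reference):

```python
import json

def mcp_command(command: str, args: list[str]) -> list[str]:
    """Assemble the argv an MCP client runs to start a stdio server."""
    return [command, *args]

def initialize_message(client_name: str = "continue") -> dict:
    """Sketch of the first JSON-RPC request sent over the server's stdin."""
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": client_name, "version": "0.0.1"},
        },
    }

cmd = mcp_command("npx", ["@playwright/mcp@latest"])
# e.g. subprocess.Popen(cmd, stdin=PIPE, stdout=PIPE), then write one
# newline-delimited JSON message:
line = json.dumps(initialize_message()) + "\n"
```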

Check out the full tutorial in [the MCP course](https://huggingface.co/learn/mcp-course/unit2/continue-client).
