AI & ML interests

None defined yet.

Recent Activity

burtenshaw updated a dataset 4 minutes ago
agents-course/certificates
Jofthomas updated a dataset 8 minutes ago
agents-course/unit4-students-scores
sergiopaniego updated a dataset about 2 hours ago
agents-course/final-certificates

sergiopaniego
posted an update about 14 hours ago
sergiopaniego
posted an update about 15 hours ago
You can already play with two of the most impressive recent models on HF via @novita-ai as Inference Provider 🚨

🌌 Kimi K2: 1T params model, MoE beast for coding, reasoning and agentic tasks
🔮 GLM-4.1V-9B-Thinking: VLM + deep reasoning model

Kimi K2: moonshotai/Kimi-K2-Instruct
GLM-4.1V-9B-Thinking: THUDM/GLM-4.1V-9B-Thinking
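
If you prefer a script to the playground, here is a minimal sketch of querying Kimi K2 through Inference Providers with the huggingface_hub client; it assumes a recent huggingface_hub release and an HF_TOKEN environment variable with inference permissions:

# Minimal sketch: querying Kimi K2 through Hugging Face Inference Providers,
# routed to Novita. Assumes huggingface_hub >= 0.28 and an HF_TOKEN env var.
import os
from huggingface_hub import InferenceClient

client = InferenceClient(provider="novita", api_key=os.environ["HF_TOKEN"])

completion = client.chat.completions.create(
    model="moonshotai/Kimi-K2-Instruct",
    messages=[{"role": "user", "content": "Write a Python function that merges two sorted lists."}],
    max_tokens=512,
)
print(completion.choices[0].message.content)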
sergiopaniego
posted an update about 21 hours ago
sergiopaniego
posted an update 6 days ago
Test SmolLM3, the newest fully open model released by @HuggingFaceTB!

It's smol (3B), multilingual (6 languages), comes with dual-mode reasoning (think/no_think) and supports long context (128k).

Try it now in the notebook below!! ⬇️

Colab notebook: https://colab.research.google.com/github/sergiopaniego/samples/blob/main/smollm3_3b_inference.ipynb
GitHub notebook: https://github.com/sergiopaniego/samples/blob/main/smollm3_3b_inference.ipynb
blog: https://huggingface.co/blog/smollm3
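
For a quick test outside Colab, a minimal sketch with transformers is below; it assumes a recent transformers release with SmolLM3 support, and the "/no_think" system flag for switching off the reasoning trace should be double-checked against the model card:

# Minimal local test of SmolLM3 with transformers (assumes recent transformers
# with SmolLM3 support and enough RAM/VRAM for a 3B model).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HuggingFaceTB/SmolLM3-3B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

messages = [
    # "/no_think" in the system prompt is meant to disable the reasoning trace
    # (an assumption based on the release notes); drop it to keep thinking mode on.
    {"role": "system", "content": "/no_think"},
    {"role": "user", "content": "Explain in two sentences why a 3B model is useful on-device."},
]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))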
sergiopaniego
posted an update 12 days ago
Updated my HF Space for vibe testing smol VLMs on object detection, visual grounding, keypoint detection & counting! 👓

🆕 Compare Qwen2.5 VL 3B vs Moondream 2B side-by-side with annotated images & text outputs.

Try examples or test your own images! 🏃

📱 Space: sergiopaniego/vlm_object_understanding
burtenshaw
posted an update 14 days ago
Inference for generative AI models looks like a minefield, but there's a simple protocol for picking the best inference setup:

🌍 95% of users >> If you're using open (large) models and need fast online inference, then use Inference Providers on auto mode, and let it choose the best provider for the model. https://huggingface.co/docs/inference-providers/index

👷 Fine-tuners / bespoke >> If you've got custom setups, use Inference Endpoints to define a configuration on AWS, Azure, or GCP. https://endpoints.huggingface.co/

🦫 Locals >> If you're trying to stretch everything you can out of a server or local machine, use Llama.cpp, Jan, LM Studio or vLLM. https://huggingface.co/settings/local-apps#local-apps

🪟 Browsers >> If you need open models running right here in the browser, use transformers.js. https://github.com/huggingface/transformers.js

Let me know what you're using, and if you think it's more complex than this.
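
For the 95% case, auto mode boils down to a single argument on the client. A minimal sketch, assuming a recent huggingface_hub release, an HF_TOKEN environment variable, and any model of your choice that is served by at least one provider:

# Sketch of the "auto mode" path: let Hugging Face pick a provider for the model.
# Assumes huggingface_hub >= 0.28 and an HF_TOKEN env var; the model is just an example.
import os
from huggingface_hub import InferenceClient

client = InferenceClient(provider="auto", api_key=os.environ["HF_TOKEN"])
response = client.chat.completions.create(
    model="meta-llama/Llama-3.3-70B-Instruct",  # any provider-served model works here
    messages=[{"role": "user", "content": "Give me one reason to use provider auto mode."}],
)
print(response.choices[0].message.content)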
sergiopaniego
posted an update 15 days ago
📣 CALL FOR CONTRIBUTORS! 📣

Following last week's full release of Gemma 3n, we launched a dedicated recipes repo to explore and share use cases. We already added some! 🧑‍🍳

Now we're inviting the community to contribute and showcase how these models shine! ✨

Let them cook.

Check it out: https://github.com/huggingface/huggingface-gemma-recipes/issues/4
sergiopaniego
posted an update 22 days ago
burtenshaw
posted an update 29 days ago
You don't need remote APIs for a coding copilot, or for the MCP Course! Set up a fully local IDE with MCP integration using Continue. This tutorial walks you through setting it up.

This is what you need to do to take control of your copilot:

1. Get the Continue extension from the [VS Code marketplace](https://marketplace.visualstudio.com/items?itemName=Continue.continue) to serve as the AI coding assistant.

2. Serve the model with an OpenAI-compatible server (Llama.cpp, LM Studio, etc.):

llama-server -hf unsloth/Devstral-Small-2505-GGUF:Q4_K_M
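
Before pointing the IDE at it, you can sanity-check that the server answers on its OpenAI-compatible endpoint. A minimal sketch, assuming llama-server's default port 8080 and the openai Python package (any other HTTP client works too):

# Quick sanity check that the local llama-server answers on its
# OpenAI-compatible endpoint (default http://localhost:8080).
# The API key value is ignored for a local server.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")
reply = client.chat.completions.create(
    model="unsloth/Devstral-Small-2505-GGUF",  # llama-server serves whatever model it loaded
    messages=[{"role": "user", "content": "Say hi in one word."}],
)
print(reply.choices[0].message.content)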

3. Create a .continue/models/llama-max.yaml file in your project to tell Continue how to use the model served by Llama.cpp.
name: Llama.cpp model
version: 0.0.1
schema: v1
models:
  - provider: llama.cpp
    model: unsloth/Devstral-Small-2505-GGUF
    apiBase: http://localhost:8080
    defaultCompletionOptions:
      contextLength: 8192  # Adjust based on the model
    name: Llama.cpp Devstral-Small
    roles:
      - chat
      - edit


4. Create a .continue/mcpServers/playwright-mcp.yaml file to integrate a tool, like the Playwright browser automation tool, with your assistant.

name: Playwright mcpServer
version: 0.0.1
schema: v1
mcpServers:
  - name: Browser search
    command: npx
    args:
      - "@playwright/mcp@latest"


Check out the full tutorial in [the MCP course](https://huggingface.co/learn/mcp-course/unit2/continue-client)
burtenshaw
posted an update about 1 month ago
Brand new MCP Course units are out, and now it's getting REAL! We've collaborated with Anthropic to dive deep into production-ready and autonomous agents using MCP.

🔗 mcp-course

This is what the new material covers:

- Use Claude Code to build an autonomous PR agent
- Integrate your agent with Slack and GitHub to bring it to your team
- Get certified on your use case and share with the community
- Build an autonomous PR cleanup agent on the Hugging Face Hub and deploy it with Spaces

The material goes deep into these problems and helps you to build applications that work. We're super excited to see what you build with it.
burtenshaw
posted an update about 1 month ago
Super excited to release Autotrain MCP. This is an MCP server for training AI models, so you can use your AI tools to train your AI models 🤯.

🔗 burtenshaw/autotrain-mcp

Use this MCP server with tools like Claude Desktop, Cursor, VSCode, or Continue to do this:

- Define an ML problem like Image Classification, LLM fine-tuning, Text Classification, etc.
- The AI can retrieve models and datasets from the hub using the hub MCP.
- Training happens on a Hugging Face Space, so no worries about hardware constraints.
- Models are pushed to the Hub to be used with inference tools like Llama.cpp, vLLM, MLX, etc.
- Built on top of the AutoTrain library, so it has full integration with transformers and other libraries.

Everything is still under active development, but I'm super excited to hear what people build, and I'm open to contributions!
burtenshaw
posted an update about 2 months ago
MCP course is now LIVE! We just dropped quizzes, videos, and live streams to make it a fully interactive course:

🔗 join in now: mcp-course

- It's still free!
- Video 1 walks you through onboarding to the course
- The first live session is next week!
- You can now get a certificate via the exam app
- We improved the written material with interactive quizzes

If you're studying MCP and want a live, interactive, visual, certified course, then join us on the Hub!
Jofthomas
posted an update about 2 months ago
Meet our new agentic model: Devstral

Devstral is an open-source LLM built for software engineering tasks, developed in collaboration between Mistral AI and All Hands AI 🙌.

Key features:
• 🤖 Agents: perfect for agentic coding
• 🍃 Lightweight: Devstral is a 24B-parameter model based on Mistral Small.
• ©️ Apache 2.0, meaning fully open-source!
• 📄 A 128k context window.

📚 Blog: https://mistral.ai/news/devstral
⚡ API: The model is also available on our API under the name devstral-small-2505
🤗 Repo: mistralai/Devstral-Small-2505

Can't wait to see what you will build with it!
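
As a rough sketch of the API route (the endpoint URL, env var and prompt below are assumptions, so check Mistral's API docs for the exact details), the model can be called through an OpenAI-compatible client:

# Rough sketch: calling devstral-small-2505 via Mistral's OpenAI-compatible API.
# Assumes the `openai` package and a MISTRAL_API_KEY env var.
import os
from openai import OpenAI

client = OpenAI(base_url="https://api.mistral.ai/v1", api_key=os.environ["MISTRAL_API_KEY"])
response = client.chat.completions.create(
    model="devstral-small-2505",
    messages=[{"role": "user", "content": "Refactor this to be iterative: def fact(n): return 1 if n == 0 else n * fact(n - 1)"}],
)
print(response.choices[0].message.content)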
burtenshaw
posted an update about 2 months ago
We're thrilled to announce the launch of our comprehensive Model Context Protocol (MCP) Course! This free program is designed to take learners from foundational understanding to practical application of MCP in AI.

Follow the course on the hub: mcp-course

In this course, you will:
📖 Study Model Context Protocol in theory, design, and practice.
🧑‍💻 Learn to use established MCP SDKs and frameworks.
💾 Share your projects and explore applications created by the community.
🏆 Participate in challenges and evaluate your MCP implementations.
🎓 Earn a certificate of completion.

At the end of this course, you'll understand how MCP works and how to build your own AI applications that leverage external data and tools using the latest MCP standards.
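
To give a flavour of what the SDK side looks like before diving in, here is a minimal sketch of an MCP server built with the official Python SDK's FastMCP helper; the "add" tool is a toy example, not course material:

# Minimal MCP server sketch with the official Python SDK (`pip install mcp`).
# The "add" tool is a toy example; real servers expose tools, resources, and prompts.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-server")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers and return the result."""
    return a + b

if __name__ == "__main__":
    # Runs over stdio by default, so an MCP client (e.g. an IDE or desktop app)
    # can launch this script as a subprocess and call the tool.
    mcp.run()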