whitebill (wb)

Organizations: medical-ai-linkdata, Hugging Face MCP Course
whitebill's activity

reacted to openfree's post with 👍 24 days ago

🔥 Creating a qwen3-30b-a3b / qwen3-235b-a22b Chatbot with Deep Research Capabilities 🚀

openfree/qwen3-30b-a3b-research
openfree/qwen3-235b-a22b-research

Hello AI researchers! 👋 Today I'm introducing a powerful chatbot implementation with real-time web search capabilities.
✨ Key Features

🧠 Chatbot based on the qwen3-30b-a3b and llama4-maverick models
🔍 LLM-based optimal keyword extraction
🌐 Real-time web search using the SerpHouse API
💬 Streaming responses for a natural conversation experience

๐Ÿ› ๏ธ Technology Stack

Gradio: intuitive web interface
Fireworks.ai API: access to high-performance LLM models
SerpHouse API: real-time search-result collection

🌟 Application Areas

Question answering systems requiring up-to-date information
Providing current information beyond training data
Delivering reliable information with accurate sources

Add real-time search capabilities to your AI applications with this project! 🎉 Leave your questions or suggestions in the comments - let's improve it together! 💪
#LLM #ArtificialIntelligence #WebSearch #Gradio #DeepResearch #OpenSource
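The flow the post describes (keyword extraction → web search → streamed answer) can be sketched roughly as below. Note that `serp_search` and `llm_stream` are hypothetical placeholders for the actual SerpHouse and Fireworks.ai API calls, and the keyword extractor is a trivial stand-in for the LLM-based one.

```python
def extract_keywords(message: str) -> str:
    # In the real app an LLM picks optimal search keywords;
    # here we just drop a few stop words as a placeholder.
    stop = {"the", "a", "an", "is", "what", "of"}
    return " ".join(w for w in message.split() if w.lower() not in stop)

def serp_search(query: str) -> list[dict]:
    # Hypothetical stand-in for a SerpHouse API request.
    return [{"title": "Example result", "url": "https://example.com",
             "snippet": "..."}]

def llm_stream(prompt: str):
    # Hypothetical stand-in for a streaming Fireworks.ai completion.
    for token in ["Based ", "on ", "the ", "search ", "results..."]:
        yield token

def respond(message: str, history: list):
    # 1) extract keywords, 2) search, 3) stream an answer grounded
    # in the search results, yielding a growing partial string.
    query = extract_keywords(message)
    results = serp_search(query)
    context = "\n".join(f"{r['title']}: {r['snippet']} ({r['url']})"
                        for r in results)
    prompt = f"Context:\n{context}\n\nQuestion: {message}"
    partial = ""
    for token in llm_stream(prompt):
        partial += token
        yield partial

chunks = list(respond("What is the latest Qwen3 release?", []))
```

In a real Space, a generator like `respond` would be wired into a Gradio chat interface, which renders each yielded partial string as a streaming reply.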
reacted to Jaward's post with 👍 4 months ago

ByteDance drops OmniHuman 🔥
This is peak SOTA performance: flawless natural gestures with perfect lip sync and facial expressions. This is the second time they've released SOTA-level talking heads, only this time with hands and body motion.
Project: https://omnihuman-lab.github.io/
reacted to mlabonne's post with 👍 4 months ago

🆕 LLM Course 2025 edition!

I updated the LLM Scientist roadmap and added a ton of new information and references. It covers training, datasets, evaluation, quantization, and new trends like test-time compute scaling.

The LLM Course has been incredibly popular (41.3k stars!) and I've been touched to receive many, many messages about how it helped people in their careers.

I know how difficult this stuff can be, so I'm super proud of the impact it had. I want to keep updating it in 2025, especially with the LLM Engineer roadmap.

Thanks everyone, hope you'll enjoy it!

💻 LLM Course: https://huggingface.co/blog/mlabonne/llm-course
upvoted an article 4 months ago
reacted to as-cle-bert's post with 🔥 5 months ago

🎉 Early New Year releases 🎉

Hi HuggingFacers 🤗, I decided to ship early this year, and here's what I came up with:

PdfItDown (https://github.com/AstraBert/PdfItDown) - If you're like me and have your whole RAG pipeline optimized for PDFs but not for other data formats, here is your solution! With PdfItDown you can convert Word documents, presentations, HTML pages, markdown sheets and (why not?) CSVs and XMLs into PDF format, for seamless integration with your RAG pipelines. Built upon MarkItDown by Microsoft.
GitHub Repo 👉 https://github.com/AstraBert/PdfItDown
PyPi Package 👉 https://pypi.org/project/pdfitdown/

๐’๐ž๐ง๐“๐ซ๐„๐ฏ ๐ฏ๐Ÿ.๐ŸŽ.๐ŸŽ (https://github.com/AstraBert/SenTrEv/tree/v1.0.0) - If you need to evaluate the ๐—ฟ๐—ฒ๐˜๐—ฟ๐—ถ๐—ฒ๐˜ƒ๐—ฎ๐—น performance of your ๐˜๐—ฒ๐˜…๐˜ ๐—ฒ๐—บ๐—ฏ๐—ฒ๐—ฑ๐—ฑ๐—ถ๐—ป๐—ด models, I have good news for you๐Ÿฅณ๐Ÿฅณ
The new release for ๐’๐ž๐ง๐“๐ซ๐„๐ฏ now supports ๐—ฑ๐—ฒ๐—ป๐˜€๐—ฒ and ๐˜€๐—ฝ๐—ฎ๐—ฟ๐˜€๐—ฒ retrieval (thanks to FastEmbed by Qdrant) with ๐˜๐—ฒ๐˜…๐˜-๐—ฏ๐—ฎ๐˜€๐—ฒ๐—ฑ ๐—ณ๐—ถ๐—น๐—ฒ ๐—ณ๐—ผ๐—ฟ๐—บ๐—ฎ๐˜๐˜€ (.docx, .pptx, .csv, .html, .xml, .md, .pdf) and new ๐—ฟ๐—ฒ๐—น๐—ฒ๐˜ƒ๐—ฎ๐—ป๐—ฐ๐—ฒ ๐—บ๐—ฒ๐˜๐—ฟ๐—ถ๐—ฐ๐˜€!
GitHub repo ๐Ÿ‘‰ https://github.com/AstraBert/SenTrEv
Release Notes ๐Ÿ‘‰ https://github.com/AstraBert/SenTrEv/releases/tag/v1.0.0
PyPi Package ๐Ÿ‘‰ https://pypi.org/project/sentrev/
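As a rough illustration of the kind of relevance metrics such a retrieval evaluation reports, two common ones are hit rate at k and mean reciprocal rank. These helpers are generic sketches, not SenTrEv's actual API:

```python
def hit_rate_at_k(ranked_ids: list[str], relevant_id: str, k: int = 5) -> float:
    """1.0 if the relevant chunk appears in the top-k results, else 0.0."""
    return 1.0 if relevant_id in ranked_ids[:k] else 0.0

def mean_reciprocal_rank(all_ranked: list[list[str]],
                         all_relevant: list[str]) -> float:
    """Average of 1/rank of the first relevant hit, over all queries."""
    total = 0.0
    for ranked_ids, relevant_id in zip(all_ranked, all_relevant):
        for rank, doc_id in enumerate(ranked_ids, start=1):
            if doc_id == relevant_id:
                total += 1.0 / rank
                break
    return total / len(all_ranked)

# Two toy queries: the relevant chunk "d1" ranks 2nd, then 3rd.
ranked = [["d3", "d1", "d2"], ["d2", "d9", "d1"]]
gold = ["d1", "d1"]
mrr = mean_reciprocal_rank(ranked, gold)  # (1/2 + 1/3) / 2
```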

Happy New Year and have fun! 🥂
reacted to awacke1's post with 👍 5 months ago
reacted to cfahlgren1's post with 👍 5 months ago
reacted to singhsidhukuldeep's post with 🚀 5 months ago
Exciting breakthrough in AI: @Meta's new Byte Latent Transformer (BLT) revolutionizes language models by eliminating tokenization!

The BLT architecture introduces a groundbreaking approach that processes raw bytes instead of tokens, achieving state-of-the-art performance while being more efficient and robust. Here's what makes it special:

>> Key Innovations
Dynamic Patching: BLT groups bytes into variable-sized patches based on entropy, allocating more compute power where the data is more complex. This results in up to 50% fewer FLOPs during inference compared to traditional token-based models.

Three-Component Architecture:
• Lightweight Local Encoder that converts bytes to patch representations
• Powerful Global Latent Transformer that processes patches
• Local Decoder that converts patches back to bytes

>> Technical Advantages
• Matches the performance of Llama 3 at 8B parameters while being more efficient
• Superior handling of non-English languages and rare character sequences
• Remarkable 99.9% accuracy on spelling tasks
• Better scaling properties than token-based models

>> Under the Hood
The system uses an entropy model to determine patch boundaries, cross-attention mechanisms for information flow, and hash n-gram embeddings for improved representation. The architecture allows simultaneous scaling of both patch and model size while maintaining fixed inference costs.
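A toy sketch of the entropy-driven patching idea: positions whose next byte is hard to predict open a new, smaller patch. A simple bigram byte model fit on the data itself stands in for BLT's small learned entropy model, purely for illustration.

```python
import math
from collections import Counter, defaultdict

def next_byte_entropy(data: bytes) -> list[float]:
    """Per-position next-byte entropy under a bigram model fit on the
    data itself -- a toy stand-in for BLT's learned entropy model."""
    follow = defaultdict(Counter)
    for prev, nxt in zip(data, data[1:]):
        follow[prev][nxt] += 1
    ents = [8.0]  # first byte has no context: assume maximum uncertainty
    for prev in data[:-1]:
        counts = follow[prev]
        total = sum(counts.values())
        ents.append(-sum((c / total) * math.log2(c / total)
                         for c in counts.values()))
    return ents

def entropy_patches(data: bytes, threshold: float = 0.5) -> list[bytes]:
    """Start a new patch wherever predicted entropy exceeds the
    threshold, so complex regions get smaller patches (more compute)."""
    if not data:
        return []
    ents = next_byte_entropy(data)
    patches, start = [], 0
    for i in range(1, len(data)):
        if ents[i] > threshold:
            patches.append(data[start:i])
            start = i
    patches.append(data[start:])
    return patches

patches = entropy_patches(b"aaaaXbbbb")
```

Predictable runs stay inside one large patch, while surprising positions split off new ones; the real model uses a trained byte-level LM for the entropy estimates rather than counts.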

This is a game-changer for multilingual AI and could reshape how we build future language models. Excited to see how this technology evolves!
reacted to clem's post with 🚀 5 months ago
Coming back to Paris Friday to open our new Hugging Face office!

We're at capacity for the party, but add your name to the waiting list as we're trying to privatize the passage du Caire for extra space for robots 🤖🦾🦿

https://t.co/enkFXjWndJ