jiayanmu (wangbuer999)
Recent Activity
reacted to zc277584121's post with 🔥 · about 17 hours ago
We've open-sourced a bilingual Semantic Highlighting model that can power multiple production scenarios:
1) RAG Answer Highlighting — Automatically highlight the exact sentences that answer user queries, improving interpretability and helping users quickly locate relevant information.
2) RAG Noise Filtering — Prune irrelevant context before it reaches the LLM, cutting token costs by 70-80% while improving answer quality by letting the model focus on what matters (a minimal sketch follows the links below).
3) Search System Highlighting — Add semantic highlighting features to recommendation systems, e-commerce search, or any retrieval system where users need to see why a result is relevant.
Try it out: https://huggingface.co/zilliz/semantic-highlight-bilingual-v1
Read our article: https://huggingface.co/blog/zilliz/zilliz-semantic-highlight-model
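Picking up on the noise-filtering scenario, here is what that step could look like in code. This is a minimal sketch, not the model's documented API: it assumes the checkpoint loads as a standard Transformers sequence-classification cross-encoder that scores query-sentence pairs with a binary "highlight" head, and the `filter_context` helper plus the 0.5 threshold are illustrative. Check the model card and blog post for the actual interface.

```python
# Hypothetical RAG noise-filtering sketch with a sentence-level relevance scorer.
# The cross-encoder framing and binary head below are assumptions; see the
# model card for zilliz/semantic-highlight-bilingual-v1 for the real API.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "zilliz/semantic-highlight-bilingual-v1"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

def filter_context(query: str, sentences: list[str], threshold: float = 0.5) -> list[str]:
    """Keep only the sentences the model scores as relevant to the query."""
    inputs = tokenizer(
        [query] * len(sentences), sentences,
        padding=True, truncation=True, return_tensors="pt",
    )
    with torch.no_grad():
        logits = model(**inputs).logits
    # Assumes a two-class head: index 1 = probability of "highlight".
    scores = logits.softmax(dim=-1)[:, 1]
    return [s for s, p in zip(sentences, scores) if p >= threshold]

context = [
    "The warranty covers manufacturing defects for two years.",
    "Our office dog is named Biscuit.",
    "Claims must be filed within 30 days of discovering the defect.",
]
print(filter_context("How long is the warranty?", context))
```

Sentence splitting is left to the caller here; in a real RAG pipeline you would score the sentences of each retrieved chunk and rebuild the pruned context before prompting the LLM.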
reacted to their post with 🚀 · 1 day ago
HY-MT1.5-1.8B: An Open-Source Game-Changer for Lightweight Translation
Tencent raised the bar for lightweight translation!
Supports bidirectional translation across 33 languages, 5 of them ethnic/minority languages and dialects.
With only 1.8B parameters (roughly a quarter the size of HY-MT1.5-7B), it delivers performance on par with its 7B counterpart and outperforms most commercial translation APIs.
✅ Quantized versions (FP8/GPTQ-Int4) available for edge device deployment, perfect for real-time translation
✅ Full support for terminology intervention, context-aware translation, and formatted output
✅ Ready-to-use prompt templates + seamless integration with Hugging Face Transformers (see the loading sketch below)
✅ Recommended transformers ≥ 4.56.0 (FP8 model requires compressed-tensors 0.11.0)
10+ Hugging Face Spaces have already integrated this model!
👉 Model Repo: https://huggingface.co/tencent/HY-MT1.5-1.8B
👉 Technical Report: https://arxiv.org/abs/2512.24092
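To try it quickly, the standard Transformers loading path sketched below should be close. The repo id and the version recommendation come from the post; the chat-template call and the translation prompt wording are assumptions on my part, and the model card's ready-to-use templates are authoritative.

```python
# Minimal sketch of running HY-MT1.5-1.8B via Hugging Face Transformers
# (>= 4.56.0, as recommended in the post, which suggests native model support).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "tencent/HY-MT1.5-1.8B"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # or grab the FP8/GPTQ-Int4 variants for edge deployment
    device_map="auto",
)

# Assumes the tokenizer ships a chat template; this instruction wording is
# illustrative, not the documented prompt template.
messages = [{
    "role": "user",
    "content": "Translate the following text into English:\n\n这款轻量级翻译模型只有 1.8B 参数。",
}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Per the post, the FP8 checkpoint additionally requires compressed-tensors 0.11.0.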
reacted to their post with 🔥 · 1 day ago