## 🚨 License Conflict: Apache-2.0 vs Qwen Research License
Hi, I'd like to report a license conflict in remyxai/SpaceQwen2.5-VL-3B-Instruct. I noticed this model was derived from Qwen/Qwen2.5-VL-3B-Instruct, which is released under the Qwen RESEARCH LICENSE AGREEMENT. From what I can see, remyxai/SpaceQwen2.5-VL-3B-Instruct is currently licensed under Apache-2.0, which may not be compatible with the original Qwen license.
⚠️ Key violations of the Qwen RESEARCH LICENSE AGREEMENT:
**Section 3 – Redistribution:**
• You must provide a copy of the Qwen license to all recipients
• You must include a "NOTICE" file with the attribution text:
"Qwen is licensed under the Qwen RESEARCH LICENSE AGREEMENT, Copyright (c) Alibaba Cloud. All Rights Reserved."
• You must not relicense Qwen or its derivatives under more permissive terms
**Section 4 – Use Restrictions:**
• Any derivative model must prominently display "Built with Qwen" or "Improved using Qwen"
• You must comply with export control laws and other applicable regulations
**Section 5.a – Ownership:**
• Even if you modify Qwen, Alibaba retains the rights to the original and derivative works
So while both models may be open and research-focused, replacing the Qwen license with Apache-2.0 might unintentionally grant broader rights than permitted, which could cause confusion or legal risk for downstream users.
🔹 Friendly Suggestions😊:
To help keep things aligned with the original license, it might be worth:
✅ Adding the full Qwen Research License to the model card or repository
✅ Including a "NOTICE" file with the required attribution
✅ Clarifying that the model is a derivative of Qwen and follows its license (see the sketch after this list)
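If it helps, here is a minimal sketch of how those steps could be applied with the huggingface_hub client. The local file names, the `license_name` value, and the commit messages are illustrative assumptions, not a prescription for how the maintainers should do it:

```python
# Sketch only: one possible way to apply the suggestions above via huggingface_hub.
from huggingface_hub import HfApi, metadata_update

repo_id = "remyxai/SpaceQwen2.5-VL-3B-Instruct"
api = HfApi()

# 1) Add the full Qwen Research License and a NOTICE file to the repository.
api.upload_file(
    path_or_fileobj="LICENSE",   # local copy of the Qwen RESEARCH LICENSE AGREEMENT (assumed filename)
    path_in_repo="LICENSE",
    repo_id=repo_id,
    commit_message="Add Qwen Research License",
)
api.upload_file(
    path_or_fileobj="NOTICE",    # contains the attribution line quoted above (assumed filename)
    path_in_repo="NOTICE",
    repo_id=repo_id,
    commit_message="Add NOTICE file with Qwen attribution",
)

# 2) Point the model card metadata at the upstream license and base model.
metadata_update(
    repo_id,
    {
        "license": "other",
        "license_name": "qwen-research",              # assumed short identifier
        "license_link": "LICENSE",
        "base_model": "Qwen/Qwen2.5-VL-3B-Instruct",  # makes the derivative relationship explicit
    },
    overwrite=True,  # needed because a license value is already set
)
```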
Really appreciate the awesome work on SpaceQwen2.5-VL-3B-Instruct — just raising this to help keep licensing clear for everyone building on top of it. 😊 Let me know if I misunderstood anything — happy to help clarify!
Thanks for your attention!
Hi @xixi126 ! Thank you for bringing this to our attention - we updated the license to reflect the one associated with the base model. 👍