Scott240866's Collections

ui-tars-7b-dpo (updated 28 days ago)
ByteDance-Seed/UI-TARS-7B-DPO — Image-Text-to-Text • Updated Jan 25 • 32.4k downloads • 205 likes