No Task Left Behind: Isotropic Model Merging with Common and Task-Specific Subspaces Paper • 2502.04959 • Published 4 days ago
SAeUron: Interpretable Concept Unlearning in Diffusion Models with Sparse Autoencoders Paper • 2501.18052 • Published 13 days ago
MagMax: Leveraging Model Merging for Seamless Continual Learning Paper • 2407.06322 • Published Jul 8, 2024
Revisiting Supervision for Continual Representation Learning Paper • 2311.13321 • Published Nov 22, 2023
Category Adaptation Meets Projected Distillation in Generalized Continual Category Discovery Paper • 2308.12112 • Published Aug 23, 2023