How should I extract attention maps? Can you provide a specific example?
#33 · opened by whopeople
@whopeople Maybe you can reuse this code: https://github.com/TinyLLaVA/TinyLLaVA_Factory/blob/main/tinyllava_visualizer/tinyllava_visualizer.py
Thanks, @Maverick17!
@Maverick17 I couldn't apply that code to this model. Are there any examples specific to Molmo 7B-D?
@whopeople Sorry, I don't know... I just knew that attention map visualization exists for LLaVA, and I was hoping you could somehow adapt their approach to Molmo.
@Maverick17 OK, thank you very much!
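In case anyone else lands here: below is a minimal sketch of the generic transformers route (`output_attentions=True`), not Molmo-specific guidance from this thread. The image path is a placeholder, and the `attn_implementation="eager"` / `output_attentions` kwargs are assumptions that Molmo's `trust_remote_code` implementation may or may not honor; the `processor.process(...)` / `unsqueeze` pattern follows the Molmo model card.

```python
# Minimal sketch, not verified against Molmo's remote code: it follows the
# generic transformers convention of requesting attention weights from a
# forward pass. Whether it works depends on the custom modeling code
# honoring `output_attentions` and materializing attention weights
# (SDPA / FlashAttention kernels usually don't).
import torch
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor

model_id = "allenai/Molmo-7B-D-0924"

processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    attn_implementation="eager",  # assumption: may not be supported by the remote code
)

# Molmo's processor API (from the model card) returns unbatched tensors.
inputs = processor.process(images=[Image.open("example.jpg")], text="Describe this image.")
inputs = {k: v.to(model.device).unsqueeze(0) for k, v in inputs.items()}

with torch.no_grad():
    outputs = model(**inputs, output_attentions=True)

attentions = getattr(outputs, "attentions", None)
if attentions:
    # Standard transformers layout: one tensor per layer,
    # each of shape (batch, num_heads, seq_len, seq_len).
    per_layer = torch.stack(attentions)      # (layers, batch, heads, seq, seq)
    head_avg = per_layer.mean(dim=2)[:, 0]   # head-averaged maps for the first sample
    print(head_avg.shape)
else:
    print("Remote code did not return attentions; "
          "consider hooking its attention modules instead.")
```

To actually visualize attention over the image, you would still need to map token positions back to the image patches produced by Molmo's processor, which is the part the TinyLLaVA visualizer linked above handles for LLaVA.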