Apply for community grant: Academic project (gpu)
Hi! This Space hosts a demo accompanying the paper "Pictures Of MIDI", an exploration of image inpainting on MIDI piano-roll images for controllable music generation. The paper has been under the ISMIR 2024 conference's double-blind review embargo for the past few months, but the embargo is being lifted today! Paper & examples page:
https://picturesofmidi.github.io/PicturesOfMIDI/
(The paper isn't on arXiv yet but will be soon.)
The Hourglass Diffusion Transformer uses an efficient O(N) attention method but still needs a GPU; on CPU only, the Space is so slow that it's hard to verify it's even working. (It does work on my local GPU, though!) In fact, on CPU the inference progress bar only reaches about 4% before the HF system times out, hence the "Runtime error" you see. :sad-emoji: On a GPU, inference should take no more than 30 seconds.
I am a solo researcher at a small college, so a couple of weeks of GPU hosting once I post the paper to arXiv would be very helpful.
In case this matters or helps: app.py consumes essentially no persistent VRAM. VRAM is used only during explicit calls to the inference sub-program via a subprocess, and is freed immediately afterward. So, in theory, this app could share 'space' with another app or instance, with only an intermittent risk of both claiming VRAM at the same time. (This probably doesn't help, but I wanted to share anyway.)
Hi @drscotthawley , we've assigned ZeroGPU to this Space. Please check the compatibility and usage sections of this page so your Space can run on ZeroGPU.
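For anyone following along, adapting a Space to ZeroGPU typically means decorating GPU-using functions with `@spaces.GPU` from the `spaces` package, so the scheduler attaches a GPU only for the duration of each call. A minimal sketch (the `generate` function and its body are placeholders, and the local fallback class is only so the snippet runs outside HF Spaces):

```python
# On ZeroGPU Spaces the `spaces` package is preinstalled; fall back to a
# no-op decorator locally so this sketch is runnable anywhere.
try:
    import spaces
except ImportError:
    class spaces:  # minimal stand-in mimicking the decorator's call forms
        @staticmethod
        def GPU(fn=None, duration=60):
            if fn is None:
                return lambda f: f  # used as @spaces.GPU(duration=...)
            return fn               # used as bare @spaces.GPU

@spaces.GPU(duration=60)  # GPU is attached only while this function runs
def generate(prompt: str) -> str:
    # Placeholder for the actual model call.
    return f"generated for {prompt}"
```

This fits the subprocess-style usage described earlier: VRAM is claimed per call rather than held persistently.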