---
license: apache-2.0
language:
- en
---
# Metal-compatible models for Frame Interpolation in PyTorch
This repository contains exported models for [Frame Interpolation in PyTorch](https://github.com/dajes/frame-interpolation-pytorch) and [ComfyUI Frame Interpolation (ComfyUI VFI)](https://github.com/Fannovel16/ComfyUI-Frame-Interpolation).
The models distributed with [v1.0.2](https://github.com/dajes/frame-interpolation-pytorch/releases/tag/v1.0.2) of [Frame Interpolation in PyTorch](https://github.com/dajes/frame-interpolation-pytorch) do not work on Apple Silicon Macs ([Runtime error: MPS Unsupported Border padding mode](https://github.com/dajes/frame-interpolation-pytorch/issues/4)).
The models in this repository have been re-exported with a [patch](https://github.com/dajes/frame-interpolation-pytorch/pull/6) that resolves the issue.
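As a quick reference, the sketch below shows one way to run the fp32 model on an Apple Silicon Mac. It is a minimal example under a few assumptions, not code taken from either upstream repository: it assumes the exported files are TorchScript modules called with two image batches and a timestep tensor, as in the upstream `inference.py`, and the image paths and preprocessing are placeholders.
```
# Minimal sketch (not from the original README): load the re-exported
# TorchScript model and interpolate between two frames on MPS.
# Assumes the model takes (img0, img1, dt) with images in [0, 1], NCHW layout;
# adjust file names, paths, and shapes as needed.
import numpy as np
import torch
from PIL import Image

device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

model = torch.jit.load("film_net_fp32.pt", map_location="cpu")
model = model.eval().to(device)

def load_frame(path: str) -> torch.Tensor:
    """Read an image file into a (1, 3, H, W) float tensor in [0, 1]."""
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0
    return torch.from_numpy(img).permute(2, 0, 1).unsqueeze(0)

x0 = load_frame("photos/one.png").to(device)
x1 = load_frame("photos/two.png").to(device)
dt = x0.new_full((1, 1), 0.5)  # timestep 0.5 = midpoint between the frames

with torch.no_grad():
    mid = model(x0, x1, dt)  # interpolated frame, same layout as the inputs

out = (mid.clamp(0, 1)[0].permute(1, 2, 0).cpu().numpy() * 255).astype(np.uint8)
Image.fromarray(out).save("midpoint.png")
```
For the fp16 model, cast the model and inputs to half precision; for batch or multi-frame interpolation, see the upstream repositories linked above.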
The models were created using the following steps:
1. Clone the original repository and install dependencies:
```
git clone https://github.com/dajes/frame-interpolation-pytorch.git
cd frame-interpolation-pytorch
uv venv
uv pip install -r requirements.txt
uv pip install tensorflow
```
2. Download the model files distributed by [FILM: Frame Interpolation for Large Motion](https://github.com/google-research/frame-interpolation) from [Google Drive](https://drive.google.com/drive/folders/153dvxVSAcsNv1cyHVJySYZ-Twchm4Jdi) and place them in the repository root:
```
tree
.
├── export.py
├── feature_extractor.py
├── fusion.py
├── inference.py
├── interpolator.py
├── LICENSE
├── photos
│   ├── one.png
│   ├── output.gif
│   └── two.png
├── pyramid_flow_estimator.py
├── README.md
├── requirements.txt
├── saved_model
│   ├── assets
│   ├── keras_metadata.pb
│   ├── saved_model.pb
│   └── variables
│       ├── variables.data-00000-of-00001
│       └── variables.index
└── util.py
```
3. Apply the [patch](https://github.com/dajes/frame-interpolation-pytorch/pull/6):
```
curl -L https://github.com/dajes/frame-interpolation-pytorch/pull/6.patch -o PR6.patch
git apply PR6.patch
```
4. Export the models:
```
uv run export.py ./saved_model film_net_fp16.pt
uv run export.py ./saved_model film_net_fp32.pt --fp32
```
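As a quick check that the re-export actually fixed the MPS problem, a forward pass with random tensors can be run on an Apple Silicon machine. This is a sketch under the same input-signature assumption as the usage example above, not a test shipped with either repository.
```
# Sketch of a smoke test (assumed input signature: img0, img1, dt).
# The original v1.0.2 export fails here with "MPS Unsupported Border
# padding mode"; the patched export should run to completion on MPS.
import torch

device = torch.device("mps")
model = torch.jit.load("film_net_fp32.pt", map_location="cpu").eval().to(device)

# 256x256 is an arbitrary test size assumed to be compatible with the
# network's internal downscaling; use your own frame size in practice.
x0 = torch.rand(1, 3, 256, 256, device=device)
x1 = torch.rand(1, 3, 256, 256, device=device)
dt = torch.full((1, 1), 0.5, device=device)

with torch.no_grad():
    out = model(x0, x1, dt)
print("MPS forward pass OK:", tuple(out.shape))
```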
## Citation
As required by the original repositories, please cite the following if you use these model files:
```
@inproceedings{reda2022film,
  title     = {FILM: Frame Interpolation for Large Motion},
  author    = {Fitsum Reda and Janne Kontkanen and Eric Tabellion and Deqing Sun and Caroline Pantofaru and Brian Curless},
  booktitle = {European Conference on Computer Vision (ECCV)},
  year      = {2022}
}
```
```
@misc{film-tf,
  title        = {Tensorflow 2 Implementation of "FILM: Frame Interpolation for Large Motion"},
  author       = {Fitsum Reda and Janne Kontkanen and Eric Tabellion and Deqing Sun and Caroline Pantofaru and Brian Curless},
  year         = {2022},
  publisher    = {GitHub},
  journal      = {GitHub repository},
  howpublished = {\url{https://github.com/google-research/frame-interpolation}}
}
```
## License
Following the original repositories, [Frame Interpolation in PyTorch](https://github.com/dajes/frame-interpolation-pytorch) and [FILM: Frame Interpolation for Large Motion](https://github.com/google-research/frame-interpolation), this project is licensed under the Apache 2.0 License.