Bilateral Reference for High-Resolution Dichotomous Image Segmentation
Peng Zheng¹⁴⁵⁶, Dehong Gao², Deng-Ping Fan¹*, Li Liu³, Jorma Laaksonen⁴, Wanli Ouyang⁵, Nicu Sebe⁶

¹Nankai University ²Northwestern Polytechnical University ³National University of Defense Technology ⁴Aalto University ⁵Shanghai AI Laboratory ⁶University of Trento
*(Example segmentation results on DIS-Sample_1 and DIS-Sample_2; images not shown.)*
For more information, check out the official repository.
Usage (Transformers.js)
If you haven't already, you can install the Transformers.js JavaScript library from NPM using:
```bash
npm i @huggingface/transformers
```
You can then use the model for image matting, as follows:
```js
import { AutoModel, AutoProcessor, RawImage } from '@huggingface/transformers';

// Load model and processor
const model_id = 'onnx-community/BiRefNet_lite';
const model = await AutoModel.from_pretrained(model_id, { dtype: 'fp32' });
const processor = await AutoProcessor.from_pretrained(model_id);

// Load image from URL
const url = 'https://images.pexels.com/photos/5965592/pexels-photo-5965592.jpeg?auto=compress&cs=tinysrgb&w=1024';
const image = await RawImage.fromURL(url);

// Pre-process image
const { pixel_values } = await processor(image);

// Predict alpha matte
const { output_image } = await model({ input_image: pixel_values });

// Save output mask
const mask = await RawImage.fromTensor(
  output_image[0].sigmoid().mul(255).to('uint8')
).resize(image.width, image.height);
mask.save('mask.png');
```
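The saved mask is a single-channel image; to actually cut out the foreground you would merge it into the image's alpha channel. Below is a minimal sketch of that per-pixel step in plain JavaScript, operating on a raw RGBA buffer (the `applyAlpha` helper is illustrative and not part of Transformers.js):

```javascript
// Apply a single-channel alpha mask (values 0-255) to an RGBA pixel buffer.
// `rgba` holds 4 bytes per pixel; `mask` holds 1 byte per pixel.
function applyAlpha(rgba, mask) {
  const out = Uint8ClampedArray.from(rgba);
  for (let i = 0; i < mask.length; i++) {
    out[i * 4 + 3] = mask[i]; // overwrite only the alpha byte
  }
  return out;
}

// Tiny example: two pixels; the first stays opaque, the second becomes transparent.
const rgba = new Uint8ClampedArray([255, 0, 0, 255, 0, 255, 0, 255]);
const alpha = new Uint8ClampedArray([255, 0]);
const result = applyAlpha(rgba, alpha);
// result[7] === 0: the second pixel is now fully transparent.
```

The color bytes are left untouched, so the cut-out can be saved as a PNG with transparency.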
Citation
```bibtex
@article{BiRefNet,
  title={Bilateral Reference for High-Resolution Dichotomous Image Segmentation},
  author={Zheng, Peng and Gao, Dehong and Fan, Deng-Ping and Liu, Li and Laaksonen, Jorma and Ouyang, Wanli and Sebe, Nicu},
  journal={CAAI Artificial Intelligence Research},
  year={2024}
}
```
Note: Having a separate repo for ONNX weights is intended to be a temporary solution until WebML gains more traction. If you would like to make your models web-ready, we recommend converting to ONNX using 🤗 Optimum and structuring your repo like this one (with ONNX weights located in a subfolder named `onnx`).
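For reference, a repo structured that way looks roughly like the following (the file names below are illustrative; the exact set of ONNX files varies per model):

```
BiRefNet_lite/
├── config.json
├── preprocessor_config.json
└── onnx/
    ├── model.onnx
    └── model_fp16.onnx
```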