
CSI-BERT

This description was generated by Grok 3.

Model Details

  • Model Name: CSI-BERT

  • Model Type: Transformer-based model for wireless sensing and data recovery

  • Version: 1.0

  • Release Date: August 2025

  • Developers: Zijian Zhao

  • Organization: SRIBD, SYSU

  • License: Apache License 2.0

  • Paper: Finding the Missing Data: A BERT-Inspired Approach Against Package Loss in Wireless Sensing, IEEE INFOCOM DeepWireless Workshop 2024

  • Arxiv: https://arxiv.org/abs/2403.12400

  • Citation:

    @INPROCEEDINGS{10620769,
      author={Zhao, Zijian and Chen, Tingwei and Meng, Fanyi and Li, Hang and Li, Xiaoyang and Zhu, Guangxu},
      booktitle={IEEE INFOCOM 2024 - IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS)},
      title={Finding the Missing Data: A BERT-Inspired Approach Against Package Loss in Wireless Sensing},
      year={2024},
      volume={},
      number={},
      pages={1-6},
      doi={10.1109/INFOCOMWKSHPS61880.2024.10620769}
    }
    
  • Contact: [email protected]

  • Repository: https://github.com/RS2002/CSI-BERT

  • Updated Version: CSI-BERT2

Model Description

CSI-BERT is a BERT-inspired transformer model designed for wireless sensing, specifically to address packet loss in Channel State Information (CSI) data. It takes amplitude and timestamp sequences as input, recovers the information lost to dropped packets, and supports downstream tasks such as action and people classification. The model can operate with or without time and position embeddings, making it flexible for various wireless sensing applications. Note that phase information is not used in the current version because it degraded downstream task performance, though the model can recover phase data if needed.

  • Architecture: BERT-based transformer
  • Input Format: CSI amplitude (batch_size, length, receiver_num * carrier_dim), timestamp (batch_size, length), attention mask (batch_size, length)
  • Output Format: Hidden states of shape (batch_size, length, 64)
  • Hidden Size: 64
  • Training Objective: MLM-style pre-training for data recovery (sketched after this list), followed by task-specific fine-tuning
  • Tasks Supported: CSI data recovery, CSI classification
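
To make the pre-training objective concrete, here is a minimal sketch of the MLM-style recovery setup. It assumes that a mask value of 1 flags a lost/masked frame (the usage example below passes an all-zero mask for a fully observed sequence) and that a simple linear head maps the 64-dimensional hidden states back to the amplitude space; check the repository for the exact masking convention and recovery head.

import torch
from model import CSI_BERT  # model.py from the repository

model = CSI_BERT.from_pretrained("RS2002/CSI-BERT")
recovery_head = torch.nn.Linear(64, 52)  # hypothetical reconstruction head

csi = torch.rand((2, 100, 52))     # amplitude: (batch, length, receiver_num * carrier_dim)
time_stamp = torch.rand((2, 100))  # timestamps: (batch, length)

# Randomly flag ~15% of frames as "lost" and zero them out.
attention_mask = (torch.rand(2, 100) < 0.15).float()
corrupted = csi.clone()
corrupted[attention_mask.bool()] = 0.0

hidden = model(corrupted, attention_mask, time_stamp)  # (2, 100, 64)
recovered = recovery_head(hidden)                      # (2, 100, 52)

# Reconstruction loss only on the masked (lost) frames.
loss = ((recovered - csi) ** 2)[attention_mask.bool()].mean()
loss.backward()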

Training Data

The model was trained on the dynamic part of the WiGesture Dataset:

  • Dataset Source: WiGesture Dataset
  • Data Structure:
    • Amplitude: (batch_size, length, receiver_num * carrier_dim)
    • Timestamp: (batch_size, length)
    • Label: (batch_size)
  • Note: Phase information is not used in the current version but can be concatenated to amplitude data if needed. Custom dataloaders are required for user-specific tasks.
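
As a starting point for such a custom dataloader, here is a minimal PyTorch Dataset sketch matching the shapes listed above. The field names, the optional phase concatenation, and the random example data are illustrative assumptions, not the repository's actual loader.

import torch
from torch.utils.data import Dataset, DataLoader

class CSIDataset(Dataset):
    def __init__(self, amplitude, timestamp, label, phase=None):
        self.amplitude = torch.as_tensor(amplitude, dtype=torch.float32)  # (N, length, receiver_num * carrier_dim)
        self.timestamp = torch.as_tensor(timestamp, dtype=torch.float32)  # (N, length)
        self.label = torch.as_tensor(label, dtype=torch.long)             # (N,)
        if phase is not None:
            # Phase can be concatenated to amplitude along the feature axis, as noted above.
            self.amplitude = torch.cat(
                [self.amplitude, torch.as_tensor(phase, dtype=torch.float32)], dim=-1
            )

    def __len__(self):
        return len(self.label)

    def __getitem__(self, idx):
        return self.amplitude[idx], self.timestamp[idx], self.label[idx]

# Random data matching the documented shapes:
amp = torch.rand(8, 100, 52)
ts = torch.rand(8, 100)
labels = torch.randint(0, 6, (8,))
loader = DataLoader(CSIDataset(amp, ts, labels), batch_size=4, shuffle=True)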

Usage

Installation

git clone https://huggingface.co/RS2002/CSI-BERT
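
Model weights on the Hugging Face Hub are typically tracked with Git LFS; if the clone only fetches small pointer files, install Git LFS first and clone again:

git lfs install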

Example Code

import torch
from model import CSI_BERT  # model.py from the cloned repository

# Load the pre-trained model
model = CSI_BERT.from_pretrained("RS2002/CSI-BERT")

# Example input: a batch of 2 sequences, each 100 frames long, with
# receiver_num * carrier_dim = 52 amplitude features per frame
csi = torch.rand((2, 100, 52))
time_stamp = torch.rand((2, 100))
attention_mask = torch.zeros((2, 100))  # per-frame mask; all zeros here

# Forward pass
y = model(csi, attention_mask, time_stamp)
print(y.shape)  # torch.Size([2, 100, 64])
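
For downstream classification, one simple approach is to pool the hidden states over time and attach a linear head. This is only an illustration continuing from the example above; the class count and the pooling/head choice are assumptions, and the repository's actual fine-tuning code may differ.

# Mean-pool the (2, 100, 64) hidden states over the time axis and classify.
num_classes = 6  # dataset-dependent, e.g. number of action classes
classifier = torch.nn.Linear(64, num_classes)

pooled = y.mean(dim=1)       # (2, 64)
logits = classifier(pooled)  # (2, num_classes)
print(logits.shape)          # torch.Size([2, 6])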