CSI-BERT2

This description was generated by Grok 3.

Model Details

  • Model Name: CSI-BERT2

  • Model Type: BERT-inspired transformer for CSI prediction and classification

  • Version: 2.0

  • Release Date: August 2025

  • Developers: Zijian Zhao

  • Organization: SRIBD, SYSU

  • License: Apache License 2.0

  • Paper: CSI-BERT2: A BERT-inspired Framework for Efficient CSI Prediction and Classification in Wireless Communication and Sensing, IEEE Transactions on Mobile Computing (TMC), 2025

  • Citation:

    @ARTICLE{11278110,
      author={Zhao, Zijian and Meng, Fanyi and Lyu, Zhonghao and Li, Hang and Li, Xiaoyang and Zhu, Guangxu},
      journal={IEEE Transactions on Mobile Computing}, 
      title={CSI-BERT2: A BERT-inspired Framework for Efficient CSI Prediction and Classification in Wireless Communication and Sensing}, 
      year={2025},
      volume={},
      number={},
      pages={1-17},
      keywords={Wireless communication;Sensors;Wireless sensor networks;Predictive models;Wireless fidelity;Training;Adaptation models;Packet loss;Data models;OFDM;Channel statement information (CSI);CSI prediction;CSI classification;wireless communication;wireless sensing},
      doi={10.1109/TMC.2025.3640420}}
    
  • Contact: [email protected]

  • Repository: https://github.com/RS2002/CSI-BERT2

  • Previous Version: CSI-BERT

Model Description

CSI-BERT2 is an upgraded, BERT-inspired transformer model for Channel State Information (CSI) prediction and classification in wireless communication and sensing. It improves on CSI-BERT with an optimized model architecture and code structure, and supports tasks such as CSI recovery, CSI prediction, gesture recognition, fall detection, people identification, and people number estimation. The model processes CSI amplitude data and supports adversarial training with a GAN-based discriminator.

  • Architecture: BERT-based transformer with optional GAN discriminator
  • Input Format: CSI amplitude (batch_size, length, receiver_num * carrier_dim), attention mask (batch_size, length), optional timestamp (batch_size, length)
  • Output Format: Hidden states of dimension [batch_size, length, hidden_dim]
  • Hidden Size: 128
  • Training Objective: MLM pre-training with GAN (optional) and task-specific fine-tuning
  • Tasks Supported: CSI recovery, CSI prediction, CSI classification
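The MLM-style pre-training objective listed above can be sketched as follows. This is a minimal illustration, not the repository's implementation: the zero-masking strategy, the 15% masking ratio, and the stand-in encoder/reconstruction head are all assumptions.

```python
import torch

batch_size, length, carrier_dim, hidden_dim = 2, 100, 52, 128
csi = torch.rand(batch_size, length, carrier_dim)

# Randomly hide a fraction of time steps (BERT-style ratio; assumed here).
mask = torch.rand(batch_size, length) < 0.15   # True = masked position
mask[0, 0] = True                              # ensure at least one masked position
masked_csi = csi.clone()
masked_csi[mask] = 0.0                         # zero out masked frames

# Stand-in for CSI-BERT2: any encoder mapping (B, L, C) -> (B, L, H),
# followed by a reconstruction head back to the carrier dimension.
encoder = torch.nn.Linear(carrier_dim, hidden_dim)
recon_head = torch.nn.Linear(hidden_dim, carrier_dim)

hidden = encoder(masked_csi)                   # (B, L, H)
recon = recon_head(hidden)                     # (B, L, C)

# Reconstruction loss is computed only on the masked positions.
loss = torch.nn.functional.mse_loss(recon[mask], csi[mask])
```

In the full framework, an optional GAN discriminator would additionally score the reconstructed frames; that adversarial term is omitted in this sketch.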

Training Data

The model was trained on the following datasets:

  • Public Datasets:
    • WiGesture: Gesture recognition, people identification
    • WiFall: Action recognition, fall detection, people identification
  • Proposed Dataset:
    • WiCount: People number estimation
  • Data Structure:
    • Amplitude: (batch_size, length, receiver_num * carrier_dim)
    • Timestamp: (batch_size, length) (optional)
    • Label: (batch_size)
  • Note: Refer to CSI-BERT for data preparation details. Custom dataloaders may be needed for specific tasks.
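A custom dataloader matching the documented tensor shapes might look like the following sketch. The class name, field names, and toy dimensions are illustrative assumptions, not code from the repository.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class CSIDataset(Dataset):
    """Hypothetical dataset yielding (amplitude, timestamp, label) samples
    in the shapes documented above."""
    def __init__(self, amplitude, timestamp, label):
        self.amplitude = amplitude   # (num_samples, length, receiver_num * carrier_dim)
        self.timestamp = timestamp   # (num_samples, length)
        self.label = label           # (num_samples,)

    def __len__(self):
        return len(self.label)

    def __getitem__(self, idx):
        return self.amplitude[idx], self.timestamp[idx], self.label[idx]

# Toy data: 8 samples, 100 time steps, 52 sub-carriers, 6 classes (all assumed).
dataset = CSIDataset(torch.rand(8, 100, 52),
                     torch.rand(8, 100),
                     torch.randint(0, 6, (8,)))
loader = DataLoader(dataset, batch_size=2, shuffle=True)
amplitude, timestamp, label = next(iter(loader))
# amplitude: (2, 100, 52), timestamp: (2, 100), label: (2,)
```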

Usage

Installation

git clone https://huggingface.co/RS2002/CSI-BERT2

Example Code

import torch
from model import CSI_BERT2

model = CSI_BERT2.from_pretrained("RS2002/CSI-BERT2")
csi = torch.rand((2, 100, 52))          # (batch_size, length, receiver_num * carrier_dim)
time_stamp = torch.rand((2, 100))       # (batch_size, length)
attention_mask = torch.zeros((2, 100))  # (batch_size, length)
y = model(csi, time_stamp, attention_mask)
print(y.shape)  # [2, 100, 128] (batch_size, length, hidden_dim)
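For the classification tasks listed above, the hidden states can be pooled over time and fed to a task head. The mean pooling and single linear layer below are illustrative choices, not the repository's exact fine-tuning head; the class count is an assumption.

```python
import torch

hidden_dim, num_classes = 128, 6
y = torch.rand(2, 100, hidden_dim)   # stand-in for the model's hidden states

# Hypothetical fine-tuning head: pool (batch, length, hidden_dim) over time,
# then map the pooled vector to class logits.
classifier = torch.nn.Linear(hidden_dim, num_classes)
pooled = y.mean(dim=1)               # (batch_size, hidden_dim)
logits = classifier(pooled)          # (batch_size, num_classes)
pred = logits.argmax(dim=-1)         # predicted class per sample
```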