

---
language:
- en
tags:
- event-based-vision
- eye-dataset
- microsaccade
- fixation
- fixational-eye-movements
size_categories:
- 100K<n<1M
license: cc-by-nc-4.0
---

# C3I-SynMicrosaccade: Microsaccade-benchmark Dataset

The Microsaccade-benchmark Dataset is a high-resolution, event-based dataset designed for microsaccade detection, classification, and analysis, developed by researchers at the University of Galway. Microsaccades are small, involuntary eye movements that occur during visual fixation; they play a critical role in vision research, driver monitoring systems (DMS), neuromorphic vision, and eye-tracking applications. This dataset provides both raw rendered data (RGB images with per-frame yaw, pitch, and roll annotations) and preprocessed, training-ready .npy event streams to support the development of event-based neural networks and other algorithms for modeling fine-grained, high-temporal-resolution eye-movement patterns.

This dataset was introduced in a paper accepted at BMVC 2025.

Microsaccade Event Example


## Dataset Structure

The dataset is organized into two sections: RGB Microsaccade Sequences and Event Microsaccade Sequences.

### 1. RGB Microsaccade Sequences

Each microsaccade class contains 500 sequences, with each sequence consisting of a small series of rendered eye frames and their corresponding annotations (yaw, pitch, roll).

**Directory Layout**

```
RGB/
├── 0.5_left/
│   ├── saccade_0.npz
│   ├── ...
│   └── saccade_499.npz
├── ...
├── 2.0_left/
│   ├── saccade_0.npz
│   ├── ...
│   └── saccade_499.npz
├── 0.5_right/
│   ├── saccade_0.npz
│   ├── ...
│   └── saccade_499.npz
├── ...
└── 2.0_right/
    ├── saccade_0.npz
    ├── ...
    └── saccade_499.npz
```

See Figure \ref{fig:dir_rgb_microsaccades} for a schematic.
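A minimal sketch of inspecting one of these .npz sequences with NumPy. The array names stored inside the real archives are not specified above, so `frames` and `angles` below are assumptions; the point is that `.files` lets you discover whatever names the archive actually uses. The demo builds a toy archive in place of a real `RGB/0.5_left/saccade_0.npz`:

```python
import numpy as np

# Toy archive standing in for RGB/0.5_left/saccade_0.npz. The array names
# "frames" and "angles" are assumptions for illustration; the real archives
# may use different names, which .files reveals at load time.
np.savez("saccade_demo.npz",
         frames=np.zeros((4, 600, 800, 3), dtype=np.uint8),  # rendered eye frames
         angles=np.zeros((4, 3)))                            # per-frame (yaw, pitch, roll)

with np.load("saccade_demo.npz") as seq:
    names = sorted(seq.files)                 # discover the stored array names
    shapes = {k: seq[k].shape for k in names}
```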


### 2. Event Microsaccade Sequences

Preprocessed .npy files organized into Train/Validation and Test sets for the left and right eyes. Each split contains seven classes corresponding to microsaccade amplitudes from 0.5° to 2.0°, with 17,500 (Train/Validation) and 300 (Test) samples per class.

**Directory Layout**

```
Events/
├── train_and_validate/
│   ├── 0.5_left/
│   │   ├── saccade_0_0_0.npy
│   │   ├── ...
│   │   └── saccade_499_4_4.npy
│   ├── ...
│   ├── 2.0_right/
│   │   ├── saccade_0_0_0.npy
│   │   ├── ...
│   │   └── saccade_499_4_4.npy
│   ├── left_eye_splits.txt    # train-validation splits
│   └── right_eye_splits.txt   # train-validation splits
│
└── Test/
    ├── 0.5_left/
    │   ├── saccade_0_0_0.npy
    │   ├── ...
    │   └── saccade_59_4_4.npy
    ├── ...
    └── 2.0_right/
        ├── saccade_0_0_0.npy
        ├── ...
        └── saccade_59_4_4.npy
```

See Figure \ref{fig:dir_event_microsaccades} for a schematic.
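Since each class folder name encodes the amplitude and eye (e.g. `0.5_left`, `2.0_right`), labels can be derived directly from the directory layout. A minimal sketch, with illustrative paths; the helper names are our own, not part of the dataset:

```python
from pathlib import Path

# Map a class folder name such as "0.5_left" or "2.0_right" to an
# (amplitude_degrees, eye) label pair.
def label_from_folder(folder_name: str):
    amplitude, eye = folder_name.split("_")
    return float(amplitude), eye

# Collect (file, amplitude, eye) triples for one split; `root` would be
# e.g. "Events/Test" (path illustrative).
def index_split(root: str):
    files = sorted(Path(root).glob("*_*/saccade_*.npy"))
    return [(str(f), *label_from_folder(f.parent.name)) for f in files]
```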


### .npy File Format

Each .npy file stores a single array of shape `[N, 4]`, with one event per row in `[T, X, Y, P]` order:

- `T`: Timestamp (seconds)
- `X`: Pixel x-coordinate
- `Y`: Pixel y-coordinate
- `P`: Event polarity (1 = ON, 0 or -1 = OFF)
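A minimal sketch of working with this layout. The inline array below stands in for a real file loaded with `np.load("Events/Test/0.5_left/saccade_0_0_0.npy")` (path illustrative), so the code stays self-contained:

```python
import numpy as np

# Stand-in for events = np.load("Events/Test/0.5_left/saccade_0_0_0.npy");
# each row follows the [T, X, Y, P] convention above.
events = np.array([
    [0.0000, 210.0, 148.0, 1.0],
    [0.0004, 211.0, 148.0, 0.0],
    [0.0009, 212.0, 149.0, 1.0],
])

t, x, y, p = events.T            # unpack the four columns
duration = t.max() - t.min()     # sequence length in seconds
n_on = int((p > 0).sum())        # ON events
n_off = int((p <= 0).sum())      # OFF events (polarity 0 or -1)
```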

## Dataset Overview

| Property | Value |
|---|---|
| Original resolution | 800 × 600 |
| ROI resolution | 440 × 300 (center) |
| Total sequences | 175,000 |
| Left eye sequences | 87,500 |
| Right eye sequences | 87,500 |
| Number of classes | 7 |
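The 440 × 300 ROI is described above as the center of the 800 × 600 sensor. A minimal sketch of how such a center crop could be applied to raw events, under the assumption that the ROI is axis-aligned and exactly centered (the `crop_to_roi` helper is our own, not part of the dataset):

```python
import numpy as np

# Center 440x300 ROI inside the original 800x600 sensor: assuming an
# exactly centered crop, events with x in [180, 620) and y in [150, 450)
# are kept and shifted to ROI-local coordinates.
W, H, ROI_W, ROI_H = 800, 600, 440, 300
x0, y0 = (W - ROI_W) // 2, (H - ROI_H) // 2   # (180, 150)

def crop_to_roi(events: np.ndarray) -> np.ndarray:
    """Keep [T, X, Y, P] events inside the ROI; shift X, Y to ROI origin."""
    x, y = events[:, 1], events[:, 2]
    mask = (x >= x0) & (x < x0 + ROI_W) & (y >= y0) & (y < y0 + ROI_H)
    kept = events[mask].copy()
    kept[:, 1] -= x0
    kept[:, 2] -= y0
    return kept
```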

## License: CC BY-NC 4.0

This dataset is licensed under the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license. This license allows for non-commercial use of the dataset, provided proper attribution is given to the authors. Adaptations and modifications are permitted for non-commercial purposes. For full details, please review the CC BY-NC 4.0 license and the license file included in the dataset. As the dataset is gated, you must accept the access conditions on Hugging Face to use it.

## Citation

If you use this dataset in your research, please cite the following:

```bibtex
@dataset{microsaccade_dataset_2025,
  title     = {Microsaccade Recognition with Event Cameras: A Novel Dataset},
  author    = {Waseem Shariff and Timothy Hanley and Maciej Stec and Hossein Javidnia and Peter Corcoran},
  year      = {2025},
  doi       = {10.57967/hf/6965},
  publisher = {Hugging Face},
  note      = {Presented at BMVC 2025}
}

@inproceedings{Shariff_2025_BMVC,
  author    = {Waseem Shariff and Timothy Hanley and Maciej Stec and Hossein Javidnia and Peter Corcoran},
  title     = {Benchmarking Microsaccade Recognition with Event Cameras: A Novel Dataset and Evaluation},
  booktitle = {36th British Machine Vision Conference 2025, {BMVC} 2025, Sheffield, UK, November 24-27, 2025},
  publisher = {BMVA},
  year      = {2025},
  url       = {https://bmva-archive.org.uk/bmvc/2025/assets/papers/Paper_288/paper.pdf}
}

@article{microsaccade_benchmarking_2025,
  title     = {Benchmarking Microsaccade Recognition with Event Cameras: A Novel Dataset and Evaluation},
  author    = {Shariff, Waseem and Hanley, Timothy and Stec, Maciej and Javidnia, Hossein and Corcoran, Peter},
  journal   = {arXiv preprint arXiv:2510.24231},
  year      = {2025}
}
```

## Contact

For questions, updates, or issues, open a discussion or pull request in the Community tab.
