Benchmarking Microsaccade Recognition with Event Cameras: A Novel Dataset and Evaluation


Waseem Shariff (University of Galway), Timothy Hanley (University of Galway), Maciej Stec (University of Galway), Hossein Javidnia (University of Dublin), Peter Corcoran (University of Galway)
The 36th British Machine Vision Conference

Abstract

Microsaccades are small, involuntary eye movements essential for visual perception and neural processing. Traditional microsaccade research relies on eye trackers and frame-based video analysis. While eye trackers offer high precision, they can be expensive and limited in scalability and temporal resolution compared to alternative methods. In contrast, event-based sensing offers a more efficient and precise alternative, capturing high-resolution spatial and temporal information with minimal latency. This work introduces a pioneering event-based microsaccade dataset designed to support the study of small eye movements in cognitive computing. Using Blender, we generate high-fidelity renderings of eye movement scenarios and simulate microsaccades with angular displacements ranging from 0.5° to 2.0°, divided into seven distinct classes. These sequences are converted into event streams using v2e, preserving the temporal dynamics of real microsaccades. The resulting streams, with durations ranging from 0.25 to 2.25 milliseconds, offer temporal resolution fine enough to motivate our central question: can spiking neural networks (SNNs) effectively detect and classify these movements? We evaluate the proposed dataset using Spiking-VGG11, Spiking-VGG13, and Spiking-VGG16, and further introduce Spiking-VGG16Flow, an optical-flow-enhanced variant, all implemented with SpikingJelly. Across experiments, these models achieve an average accuracy of approximately 90%, classifying microsaccades by angular displacement irrespective of event count or duration. These results highlight the suitability of SNNs for fine motion classification and establish a benchmark for future research in event-based vision. To facilitate further work in neuromorphic computing and visual neuroscience, both the dataset and trained models will be released publicly.
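For readers unfamiliar with how an SNN consumes an event stream, the sketch below shows a single leaky integrate-and-fire (LIF) neuron, the basic unit underlying spiking architectures such as the Spiking-VGG models above. This is a stdlib-only illustration of the neuron dynamics, not the paper's SpikingJelly implementation; the time constant, threshold, and input values are arbitrary choices for demonstration.

```python
def lif_run(inputs, tau=0.9, threshold=1.0, v_reset=0.0):
    """Run a single leaky integrate-and-fire neuron over input currents.

    At each timestep the membrane potential leaks (v <- tau * v) and
    integrates the input (v <- v + x); when v crosses the threshold the
    neuron emits a spike and resets. This event-driven behavior is why
    SNNs pair naturally with the sparse, high-rate output of event cameras.
    """
    v = v_reset
    spikes = []
    for x in inputs:
        v = tau * v + x          # leak, then integrate the input event
        if v >= threshold:
            spikes.append(1)     # fire
            v = v_reset          # hard reset after the spike
        else:
            spikes.append(0)
    return spikes

# A brief burst of events drives the neuron past threshold,
# while isolated weak inputs decay without firing:
print(lif_run([0.6, 0.6, 0.0, 0.0, 0.6]))  # → [0, 1, 0, 0, 0]
```

In SpikingJelly, layers of such neurons replace the activations of a conventional VGG, and classification is read out from spike counts accumulated over the event stream's timesteps.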

Citation

@inproceedings{Shariff_2025_BMVC,
author    = {Waseem Shariff and Timothy Hanley and Maciej Stec and Hossein Javidnia and Peter Corcoran},
title     = {Benchmarking Microsaccade Recognition with Event Cameras: A Novel Dataset and Evaluation},
booktitle = {36th British Machine Vision Conference 2025, {BMVC} 2025, Sheffield, UK, November 24-27, 2025},
publisher = {BMVA},
year      = {2025},
url       = {https://bmva-archive.org.uk/bmvc/2025/assets/papers/Paper_288/paper.pdf}
}


Copyright © 2025 The British Machine Vision Association and Society for Pattern Recognition
The British Machine Vision Conference is organised by The British Machine Vision Association and Society for Pattern Recognition. The Association is a Company limited by guarantee, No.2543446, and a non-profit-making body, registered in England and Wales as Charity No.1002307 (Registered Office: Dept. of Computer Science, Durham University, South Road, Durham, DH1 3LE, UK).
