
EventDrop: data augmentation for event-based learning

About

The advantages of event-sensing over conventional sensors (e.g., higher dynamic range, lower time latency, and lower power consumption) have spurred research into machine learning for event data. Unsurprisingly, deep learning has emerged as a competitive methodology for learning with event sensors; in typical setups, discrete and asynchronous events are first converted into frame-like tensors on which standard deep networks can be applied. However, over-fitting remains a challenge, particularly since event datasets remain small relative to conventional datasets (e.g., ImageNet). In this paper, we introduce EventDrop, a new method for augmenting asynchronous event data to improve the generalization of deep models. By dropping events selected with various strategies, we are able to increase the diversity of training data (e.g., to simulate various levels of occlusion). From a practical perspective, EventDrop is simple to implement and computationally low-cost. Experiments on two event datasets (N-Caltech101 and N-Cars) demonstrate that EventDrop can significantly improve the generalization performance across a variety of deep networks.
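The core idea — dropping events selected with different strategies — can be sketched in a few lines. Below is a minimal illustrative sketch, not the authors' reference implementation: the array layout (an `(N, 4)` array of `(t, x, y, p)` tuples), the function names, and the `drop_ratio` parameter are all assumptions made for this example.

```python
import numpy as np

def random_drop(events, drop_ratio=0.2, rng=None):
    """Drop a random fraction of events (one EventDrop-style strategy).

    `events` is assumed to be an (N, 4) float array of (t, x, y, p) rows.
    """
    rng = np.random.default_rng(rng)
    keep = rng.random(len(events)) >= drop_ratio
    return events[keep]

def drop_by_time(events, drop_ratio=0.2, rng=None):
    """Drop all events inside a random time window spanning `drop_ratio`
    of the recording, simulating a transient occlusion in time."""
    rng = np.random.default_rng(rng)
    t = events[:, 0]
    t0, t1 = t.min(), t.max()
    span = (t1 - t0) * drop_ratio
    start = t0 + rng.random() * ((t1 - t0) - span)
    keep = (t < start) | (t > start + span)
    return events[keep]
```

Because both operations act directly on the raw event stream, they can be applied before converting events into frame-like tensors, adding negligible cost to the training pipeline.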

Fuqiang Gu, Weicong Sng, Xuke Hu, Fangwen Yu • 2021

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Classification | CIFAR10-DVS | Accuracy | 77.73 | 133 |
| Object recognition | N-Caltech101 | Accuracy | 74.04 | 51 |
| Action recognition | SL-Animals 4Sets | Accuracy | 86.33 | 15 |
| Action recognition | SL-Animals 3Sets | Accuracy | 88.99 | 13 |
| Action recognition | DVSGesture (full) | Accuracy | 92.33 | 11 |
| Object recognition | N-Cars | Accuracy | 95.46 | 7 |
| Classification | N-ImageNet mini neuromorphic adaptation (test) | Top-1 Acc | 34.18 | 4 |
