Tracking a large number of small, similar, high-speed, high-agility targets is a challenging problem for current tracking systems that rely on traditional visual sensors. Such targets necessitate very high tracker update rates to keep up with meandering and mutually occluding target
paths. Event-based vision sensors may offer a solution to this problem: they report only pixelwise changes in intensity ("events") as they occur, with a time resolution approaching 1 μs [1], providing data that is much sparser and of much higher temporal resolution than that of traditional
vision systems. However, this class of sensor presents unique challenges; for example, a single object in the sensor's field of view may produce multiple synchronous or nearly synchronous events. In addition, performing direct measurement-to-track association for event data on sub-millisecond to millisecond timescales introduces a problematic computational burden for scenarios involving large numbers of targets.
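For concreteness, the following minimal sketch illustrates the kind of data an event-based sensor produces and why per-event measurement-to-track association scales poorly; the record fields, event rate, and function names are illustrative assumptions rather than the interface of any particular sensor or the method of this paper.

from dataclasses import dataclass

# Illustrative event record (an assumption, not a specific sensor's API):
# event cameras typically report pixel coordinates, a microsecond-resolution
# timestamp, and the polarity (sign) of the intensity change.
@dataclass
class Event:
    x: int         # pixel column
    y: int         # pixel row
    t_us: int      # timestamp in microseconds
    polarity: int  # +1 for an intensity increase, -1 for a decrease

# Treating every event as a measurement makes gating alone cost on the order
# of (events per second) x (number of tracks) distance checks per second.
def gate_checks_per_second(events_per_sec: int, num_tracks: int) -> int:
    return events_per_sec * num_tracks

# e.g. one million events/s against 100 tracks: ~1e8 gate checks per second
print(gate_checks_per_second(1_000_000, 100))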
The contribution of this paper is twofold. First, we define and apply an event-clustering procedure to raw events to reduce the amount of data passed to the tracker. This
transformation from events to event-clusters provides a) discrimination between event-clusters that correspond to true targets and those that do not, and b) a reduction in tracking computation time. Second, we define and apply a partial-update Gaussian mixture probability hypothesis density (GMPHD)
filter [2] for tracking with event-cluster data. We demonstrate improved computational performance over the standard GMPHD filter while achieving comparable tracking performance, as measured by the optimal sub-pattern assignment (OSPA) metric [3].
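As an illustration of the data reduction that event-clustering can provide, the sketch below groups events into small spatiotemporal cells and forwards one centroid per cell to the tracker; the grid-based grouping and the cell sizes are assumptions made for illustration, not the clustering procedure defined in this paper.

from collections import defaultdict

def cluster_events(events, cell_px=4, window_us=1000):
    """Collapse raw events into spatiotemporal cluster centroids.

    events: iterable of (x, y, t_us) tuples.
    cell_px, window_us: assumed cell sizes, for illustration only.
    Returns a list of (cx, cy, ct_us, count) cluster measurements.
    """
    cells = defaultdict(list)
    for x, y, t in events:
        cells[(x // cell_px, y // cell_px, t // window_us)].append((x, y, t))

    clusters = []
    for members in cells.values():
        n = len(members)
        cx = sum(x for x, _, _ in members) / n
        cy = sum(y for _, y, _ in members) / n
        ct = sum(t for _, _, t in members) / n
        # The cluster size n gives a handle for discriminating target-like
        # clusters from isolated noise events before they reach the tracker.
        clusters.append((cx, cy, ct, n))
    return clusters

# Three nearby, nearly synchronous events collapse into a single measurement.
print(cluster_events([(10, 12, 100), (11, 13, 150), (9, 12, 180)]))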