Dynamic vision sensors (DVS) are growing in popularity for computer vision on moving scenes: their output is a stream of events reflecting temporal lighting changes rather than absolute intensity values. One of their advantages is fast event detection, with events read out asynchronously as spikes. However, high event throughput implies an increasing workload for the read-out stage, which can lead to data loss or to prohibitively large power consumption on constrained devices. This work presents a scheme that reduces data throughput through near-pixel pre-processing: fewer events are generated, each encoding the temporal change and the intensity slope magnitude. In our simulated example, the most aggressive version of the approach reduces data throughput to 14 % of the original.
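As a rough illustration of the idea (a sketch, not the paper's actual circuit or algorithm), the Python snippet below simulates a single pixel: a baseline DVS emits one event per threshold crossing of the log intensity, while a hypothetical near-pixel stage aggregates crossings over a short window and emits at most one event per window encoding the change direction and slope magnitude. The threshold `THETA`, window length `WINDOW`, and the toy stimulus are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative parameters (not from the paper): contrast threshold per event
# and the number of samples aggregated by the near-pixel stage.
THETA = 0.1
WINDOW = 10

def dvs_events(log_intensity):
    """Baseline DVS: one event per threshold crossing of the log intensity."""
    events = []
    ref = log_intensity[0]
    for t, v in enumerate(log_intensity[1:], start=1):
        # Emit one polarity event per crossed threshold, stepping the reference.
        while abs(v - ref) >= THETA:
            pol = 1 if v > ref else -1
            events.append((t, pol))
            ref += pol * THETA
    return events

def preprocessed_events(log_intensity):
    """Near-pixel pre-processing sketch: per window, emit at most one event
    encoding the sign of the change and a slope magnitude (crossings/window)."""
    raw = dvs_events(log_intensity)
    out = []
    for w in range(0, len(log_intensity), WINDOW):
        in_win = [p for t, p in raw if w <= t < w + WINDOW]
        net = sum(in_win)
        if net != 0:
            # Slope magnitude in threshold units over the window.
            out.append((w, 1 if net > 0 else -1, abs(net)))
    return out

# Toy stimulus: a pixel observing a noisy intensity ramp.
rng = np.random.default_rng(0)
signal = np.cumsum(rng.normal(0.02, 0.05, 200))
base, pre = dvs_events(signal), preprocessed_events(signal)
print(f"baseline events: {len(base)}, pre-processed: {len(pre)} "
      f"({100 * len(pre) / max(len(base), 1):.0f} % of baseline)")
```

The throughput gain in this toy model comes from replacing bursts of single-threshold events with one richer event per window; the actual 14 % figure quoted above comes from the paper's own simulation, not from this sketch.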