Unmanned Aerial Vehicles (UAVs) are gaining popularity in a wide range of civilian and military applications. This growing interest is driving the development of effective collision avoidance systems, which are especially crucial in crowded airspace. Because of the cost and weight limitations of UAV payloads, optical sensors, typically digital cameras, are widely used in UAV collision avoidance systems. This requires moving-object detection and tracking algorithms that can run efficiently on board from video input. In this paper, we present a new approach to detecting and tracking UAVs from a single camera mounted on a different UAV. We first estimate background motion via a perspective transformation model and then identify moving-object candidates in the background-subtracted image with a deep learning classifier trained on manually labeled datasets. For each moving-object candidate, we extract spatio-temporal traits through optical flow matching and then prune the candidates by comparing their motion patterns against the background's. A Kalman filter is applied to the pruned moving objects to improve temporal consistency among the candidate detections. The algorithm was validated on video datasets taken from a UAV. Results demonstrate that our algorithm can effectively detect and track small UAVs with limited computing resources.
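The temporal-consistency step described above can be illustrated with a constant-velocity Kalman filter. The sketch below is an assumption for illustration only, not the authors' implementation: it tracks a single image coordinate of a detected object (a second instance would track the other axis), using a 2-state model (position, velocity) with hypothetical noise parameters `q` and `r`.

```python
class Kalman1D:
    """Constant-velocity Kalman filter for one image coordinate.

    State: [position, velocity]. Measurement: noisy position only.
    q and r are illustrative process/measurement noise variances.
    """

    def __init__(self, q=1e-3, r=0.25):
        self.x = [0.0, 0.0]                # state estimate [p, v]
        self.P = [[1.0, 0.0], [0.0, 1.0]]  # state covariance
        self.q, self.r = q, r

    def step(self, z, dt=1.0):
        # Predict: x = F x with F = [[1, dt], [0, 1]]
        x0 = self.x[0] + dt * self.x[1]
        x1 = self.x[1]
        P = self.P
        # P = F P F^T + Q, with Q = diag(q, q)
        p00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + self.q
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + self.q
        # Update with measurement z, H = [1, 0]
        S = p00 + self.r              # innovation covariance
        k0, k1 = p00 / S, p10 / S     # Kalman gain
        innov = z - x0                # measurement residual
        self.x = [x0 + k0 * innov, x1 + k1 * innov]
        # P = (I - K H) P
        self.P = [[(1 - k0) * p00, (1 - k0) * p01],
                  [p10 - k1 * p00, p11 - k1 * p01]]
        return self.x[0]
```

In a detection pipeline such a filter smooths the per-frame candidate positions and lets the tracker predict a location for frames where the detector momentarily misses the target.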
ERL Emergency is an outdoor multi-domain robotics competition inspired by the 2011 Fukushima accident. The ERL Emergency Challenge requires teams of land, underwater, and flying robots to work together to survey the scene, collect environmental data, and identify critical hazards. To prepare teams for this multidisciplinary task, a series of summer schools and workshops has been arranged. In this paper, the challenges and hands-on results of bringing students and researchers to collaborate successfully in unknown environments and in new research areas are described. As a case study, results from the 2015 euRathlon/SHERPA workshop in Oulu are presented.