The usefulness of mobile devices has increased greatly in recent years, allowing users to perform more tasks in daily life. Mobile devices and applications provide many benefits for users, perhaps the most significant being increased access to point-of-use tools, navigation, and alert systems. This paper presents a prototype of a cross-platform mobile augmented reality (AR) system whose core purpose is to keep the campus community secure and connected. The mobile AR system consists of four core functionalities: an events system, a policing system, a directory system, and a notification system. The events system keeps the community up to date on current and upcoming campus events. The policing system keeps campus safety resources within arm's reach of the community. The directory system serves as a one-stop shop for campus resources, giving staff, faculty, and students a convenient and efficient means of accessing pertinent information on campus departments. The system also includes an integrated guided navigation feature that directs users to destinations on campus, such as buildings and departments. The application helps students and visitors navigate the campus efficiently and sends alerts and notifications in case of emergencies, allowing campus police to respond quickly. The mobile AR system was built with the Unity game engine and the Vuforia Engine for object detection and classification, and the Google Maps API was integrated to provide GPS-based location services. Our contribution lies in a user-specific, customizable navigation and alert system that improves the safety of users at their workplace.
Specifically, the paper describes the design and implementation of the proposed mobile AR system and reports the results of a pilot study conducted to evaluate its perceived ease of use and usability.
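As an illustration of the kind of location-based computation the navigation component above depends on, the following is a minimal sketch of great-circle distance and nearest-destination selection. The function names and data layout are illustrative assumptions for this sketch, not the actual system's API (which relies on the Google Maps API):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates (haversine formula)."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_destination(user, destinations):
    """Pick the campus destination closest to the user's (lat, lon) position."""
    return min(destinations, key=lambda d: haversine_m(*user, d["lat"], d["lon"]))
```

For example, given a list of building records with `lat`/`lon` fields, `nearest_destination((lat, lon), buildings)` returns the closest building, which a guidance layer could then route toward.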
This research explores a fresh approach to the selection and weighting of classical image features for infrared object detection and rejection of target-like clutter. Traditional statistical techniques are used to compute the individual features, while modern supervised machine learning techniques rank-order the predictive value of each feature. This paper describes the use of decision trees to determine which features have the highest value in predicting the correct binary target/non-target class. The work is unique in that it focuses on infrared imagery and exploits interpretable machine learning techniques to select hand-crafted features for integration into a pre-screening algorithm.
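As a hedged illustration of the feature-ranking idea, a decision tree's impurity-based importances can rank candidate features by predictive value. The feature names and synthetic data below are stand-ins invented for this sketch, not the paper's actual hand-crafted features or infrared dataset:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 400
# Synthetic stand-ins for hand-crafted statistical features (names are illustrative).
contrast = rng.normal(size=n)
edge_density = rng.normal(size=n)
noise_feature = rng.normal(size=n)  # irrelevant feature, should rank last
# Binary target/non-target label driven mostly by contrast, weakly by edge density.
y = (2.0 * contrast + 0.5 * edge_density > 0).astype(int)
X = np.column_stack([contrast, edge_density, noise_feature])

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
ranking = sorted(zip(["contrast", "edge_density", "noise"], tree.feature_importances_),
                 key=lambda t: t[1], reverse=True)
```

Here `ranking` lists the features from most to least predictive; in a pre-screening setting, only the top-ranked features would be retained and weighted.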
In this paper, we propose a video analytics system to identify turkey behavior. Turkey behavior provides evidence for assessing turkey welfare, which can be negatively impacted by uncomfortable ambient temperatures and various diseases. In particular, healthy and sick turkeys behave differently in the duration and frequency of activities such as eating, drinking, preening, and aggressive interactions. Our system incorporates recent advances in object detection and tracking to automate the identification and analysis of turkey behavior captured by commercial-grade cameras. We combine deep learning and traditional image processing methods to address the challenges of this practical agricultural problem. The system also includes a web-based user interface that visualizes the automated analysis results. Together, these components give turkey researchers an improved tool for assessing welfare without time-consuming and labor-intensive manual inspection.
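A minimal sketch of the tracking step, assuming a simple greedy intersection-over-union (IoU) association between per-frame detections and existing tracks; the actual system's tracker and detector are more sophisticated, and the bounding boxes below are synthetic:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def update_tracks(tracks, detections, threshold=0.3):
    """Greedily match new detections to existing tracks by IoU.

    tracks: dict mapping track id -> last box; unmatched detections start new tracks.
    Returns the updated dict of track id -> box for this frame.
    """
    next_id = max(tracks, default=-1) + 1
    new_tracks = {}
    unmatched = list(detections)
    for tid, box in tracks.items():
        if not unmatched:
            break
        best = max(unmatched, key=lambda d: iou(box, d))
        if iou(box, best) >= threshold:
            new_tracks[tid] = best
            unmatched.remove(best)
    for det in unmatched:  # detections with no matching track become new turkeys
        new_tracks[next_id] = det
        next_id += 1
    return new_tracks
```

Maintaining a stable track ID per bird across frames is what makes per-turkey activity statistics (e.g., time spent at a feeder) possible downstream.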