Current techniques for identifying the presence of cyanobacteria in a water sample are cumbersome. This project attempts to simplify the process by using image capture with smartphones. Experiments were designed to ascertain whether cyanobacteria present in a water sample can be detected based on measurements of color and transmission spectra. Four types of organisms were used in the experiment: a colonial and a filamentous variant of both cyanobacteria and green algae were measured and compared. In these tests, the results from the four smartphones followed the same trends. All four smartphones displayed a linear relationship between the C* values measured by the spectrophotometer and those captured by the smartphone cameras for both types of cyanobacteria and for the colonial green algae. The filamentous green algae behaved differently from the other organisms, presenting an S-shaped curve when the C* values from the spectrophotometer and camera were compared. Each smartphone was able to capture this atypical behavior, suggesting that, with further work, smartphone cameras may be usable for detection. Only four smartphones were tested, so a larger sample would be needed before broader generalizations about this technique can be made.
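As background for the comparison above, C* denotes chroma in the CIELAB color space. The following is a minimal sketch of how C* could be derived from a camera RGB pixel, assuming an sRGB encoding and a D65 white point; the study's actual calibration pipeline may differ:

```python
import math

def srgb_chroma(r8, g8, b8):
    """Compute CIELAB chroma C* from an 8-bit sRGB pixel (D65 white point)."""
    # Linearize the sRGB components (inverse gamma).
    def lin(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = lin(r8), lin(g8), lin(b8)

    # sRGB -> CIE XYZ (D65 reference white).
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b

    # XYZ -> CIELAB helper function.
    def f(t):
        d = 6.0 / 29.0
        return t ** (1.0 / 3.0) if t > d ** 3 else t / (3 * d * d) + 4.0 / 29.0
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)

    a_star = 500.0 * (fx - fy)
    b_star = 200.0 * (fy - fz)
    return math.hypot(a_star, b_star)  # C* = sqrt(a*^2 + b*^2)
```

A neutral gray pixel yields C* near zero, while a saturated green (as a crude proxy for an algal sample) yields a large C*, which is why chroma is a useful single-number measure of color saturation here.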
Various image editing tools make our pictures more attractive and, at the same time, evoke different emotional responses. With powerful and easy-to-use imaging applications, capturing, editing, and sharing pictures has become part of daily life for many. This paper investigates the influence of several image manipulations on evoked emotions for different types of images. To do so, images of various types, clustered into categories, were collected from Instagram, and subjective evaluations were conducted via crowdsourcing to gather subjects' emotional responses to the different manipulations. Evaluation results show that certain image manipulations can change the emotions evoked by transformed pictures relative to the originals. However, such changes in image emotions due to manipulation are highly content dependent. We then conducted a machine learning experiment in an attempt to predict the emotions of a manipulated image given its original version and the desired manipulation method. Experimental results show promising performance for such a prediction model, which could pave the way to automatic selection or recommendation of image editing tools that efficiently transform or emphasize desired emotions in pictures.