As depth imaging is integrated into an increasing number of consumer devices, manufacturers have to tackle new challenges. Applications such as computational bokeh and augmented reality require dense and precisely segmented depth maps to achieve good results. Modern devices use a multitude of different technologies to estimate depth maps, such as time-of-flight sensors, stereoscopic cameras, structured light sensors, phase-detect pixels, or a combination thereof. Therefore, there is a need to evaluate the quality of the depth maps regardless of the technology used to produce them. The aim of our work is to propose an end-result evaluation method based on a single scene, using a specifically designed chart. We consider the depth maps embedded in photographs: they are not visible to the user, but specialized software uses them in association with the RGB pictures. Some of the aspects considered are spatial alignment between RGB and depth, depth consistency, and robustness to texture variations. This work also provides a comparison of perceptual and automatic evaluations.
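As a purely illustrative sketch (the chart design and the paper's actual metrics are not detailed in the abstract), one generic way to quantify RGB-to-depth spatial alignment is to compare edge maps of the two images and search for the translation that best registers them; all function names below are hypothetical.

```python
# Hypothetical illustration, not the authors' method: measure RGB-depth
# spatial alignment as the shift that best registers their edge maps.
import numpy as np

def edge_map(img):
    """Gradient-magnitude edge map of a single-channel float image."""
    gy, gx = np.gradient(img.astype(np.float64))
    return np.hypot(gx, gy)

def rgb_depth_misalignment(rgb_luma, depth, max_shift=8):
    """Return the (dy, dx) shift that best aligns depth edges to RGB edges.

    A large best-fit shift suggests poor spatial alignment between the
    RGB picture and its embedded depth map.
    """
    e_rgb = edge_map(rgb_luma)
    e_depth = edge_map(depth)
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(e_depth, dy, axis=0), dx, axis=1)
            score = np.sum(e_rgb * shifted)  # unnormalized cross-correlation
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best
```

Under this sketch, a perfectly aligned pair returns (0, 0); per-axis offsets in pixels grow with misregistration between the depth map and the RGB image.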
An indirect time-of-flight (ToF) CMOS image sensor has been designed with a 4-tap 7 μm global-shutter pixel in a back-side illumination process. A high full-well capacity (FWC) of 15,000 e- per 3.5 μm-pitch tap and a read noise of 3.6 e- have been realized by employing a true correlated double sampling (CDS) structure with storage gates (SGs). Notable characteristics have been achieved, such as a demodulation contrast (DC) of 86% at 100 MHz operation, a higher quantum efficiency (QE) of 37%, and lower parasitic light sensitivity (PLS) at 940 nm. As a result, the proposed ToF sensor shows depth noise of less than 0.3% with a 940 nm illuminator, even at long distances.
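For context, the textbook four-phase depth recovery used by 4-tap indirect ToF pixels can be written down directly. This is the standard formulation, not necessarily this sensor's exact on-chip pipeline, and the 100 MHz default simply mirrors the operating point quoted in the abstract.

```python
# Standard four-phase (4-tap) indirect ToF depth recovery. Tap values
# a0..a3 correspond to 0, 90, 180, and 270 degree demodulation phases.
import math

C = 299_792_458.0  # speed of light, m/s

def itof_depth(a0, a1, a2, a3, f_mod=100e6):
    """Depth from four demodulation taps via the correlation phase."""
    phase = math.atan2(a3 - a1, a0 - a2) % (2 * math.pi)  # wrapped to [0, 2*pi)
    # Unambiguous range at f_mod is c / (2 * f_mod), i.e. ~1.5 m at 100 MHz.
    return C * phase / (4 * math.pi * f_mod)

# Example: taps (100, 50, 60, 110) give a phase of ~0.98 rad,
# i.e. a depth of roughly 0.23 m within the 1.5 m unambiguous range.
print(itof_depth(100, 50, 60, 110))
```

The higher the demodulation contrast, the larger the (a0 - a2, a3 - a1) signal swing for a given photon budget, which is why the reported 86% DC at 100 MHz translates directly into low depth noise.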