In this paper, a low-cost, single-camera, double-mirror system that can be built into a desktop nail printer is described. The system captures an image of a fingernail and generates the 3D shape of the nail; the nail's depth map is then estimated from this rendered 3D nail shape. The paper describes the camera calibration process and explains the calibration theory behind the proposed system, and then introduces a 3D reconstruction method. Experimental results are presented that illustrate the accuracy with which the system handles the rendering task.
Many test charts and software tools exist for the intrinsic geometric calibration of a camera, including its distortion, but all of these setups share a few problems: they are limited to finite object distances and require large test charts, combined with powerful and uniform illumination, for calibration at greater distances. On production lines the common workaround is a relay lens, which itself introduces geometric distortions and therefore inaccuracies that must be compensated for. A solution that overcomes these problems and limitations was originally developed for space applications and has already become a common method for the calibration of satellite cameras. We have now turned the lab setup on an optical bench into a commercially available product that can be used to calibrate a wide variety of cameras for different applications. The solution is based on a diffractive optical element (DOE) illuminated by a plane wave generated from an expanded laser-diode beam. Unlike conventional methods, the proposed one also provides the extrinsic orientation of the camera and therefore allows cameras to be adjusted relative to each other.
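Because each DOE diffraction order arrives as a plane wave at a known angle, the target is effectively at infinity and each spot constrains the intrinsics directly. A minimal sketch of the idea for a 1D pinhole model, where a beam at angle θ maps to pixel u = c + f·tan(θ); the angles and spot positions below are invented for illustration, not data from the paper:

```python
import numpy as np

# Hypothetical DOE spot measurements: known diffraction angles (deg)
# and the observed spot centers (pixels) along one image axis.
angles = np.deg2rad(np.array([-10.0, -5.0, 0.0, 5.0, 10.0]))
pixels = np.array([143.7, 232.5, 320.0, 407.5, 496.3])

# Pinhole model u = f * tan(theta) + c, solved by linear least squares
# for focal length f (pixels) and principal point c.
A = np.column_stack([np.tan(angles), np.ones_like(angles)])
(f, c), *_ = np.linalg.lstsq(A, pixels, rcond=None)
```

With the synthetic values above the fit recovers roughly f ≈ 1000 px and c ≈ 320 px; a real system would fit a full 2D grid of orders and include distortion terms.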
A practical approach is proposed for calibrating a fish-eye camera using horizontal and vertical laser planes projected from a laser level. The approach requires no camera parameters, scene information, or calibration objects in the scene; only an off-the-shelf laser level is used to cast laser planes toward the scene, generating image features for calibration based on the principles of projective geometry. With proper alignment of the laser level, smooth laser curves are obtained in the fish-eye image, and the principal center of the camera can be found by intersecting the two straight laser lines in the fish-eye image. The remaining curved laser lines can then be used to measure calibration data for correcting radial distortion. Experimental results demonstrate that satisfactory calibration is achieved with the proposed method.
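The principal-center step above reduces to intersecting two image lines, which is convenient in homogeneous coordinates: the line through two points, and the intersection of two lines, are both cross products. A sketch with made-up pixel coordinates for the two straight laser traces (the point values are assumptions, not data from the paper):

```python
import numpy as np

def line_through(p, q):
    # Homogeneous line through two image points (cross product).
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def intersect(l1, l2):
    # Intersection of two homogeneous lines, dehomogenized to pixels.
    x = np.cross(l1, l2)
    return x[0] / x[2], x[1] / x[2]

# Hypothetical straight laser traces seen in the fish-eye image:
h_line = line_through((100.0, 240.0), (540.0, 240.0))  # horizontal laser
v_line = line_through((320.0, 50.0), (320.0, 430.0))   # vertical laser
cx, cy = intersect(h_line, v_line)  # estimated principal center
```

In practice each trace would be fitted to many detected laser pixels rather than just two points, but the intersection step is the same.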
Naturalistic driving studies typically employ a variety of sensors, including radar, kinematic sensors, and video cameras. While the main objective of such sensors is typically safety-focused, with the goal of recording accidents and near-accidents for later review, the instrumentation provides a valuable resource for a variety of transportation research. Some applications, however, require additional processing to improve the utility of the data. In this work, we describe a computer vision procedure for calibrating front-view cameras for the Second Strategic Highway Research Project. A longitudinal stability study of the estimated parameters across a small sample of cameras is presented, along with a proposed procedure for calibrating a larger number of cameras from the study. A simple use case is presented as one example of the utility of this work. Finally, we discuss plans for calibrating the complete set of approximately 3000 cameras from the study.
A dual-camera setup is proposed, consisting of a fixed (stationary) camera and a pan-tilt-zoom (PTZ) camera, employed in an automatic video surveillance system. The PTZ camera is zoomed in on a selected point in the fixed-camera view and can automatically track a moving object. For this purpose, two camera spatial calibration procedures are proposed. The PTZ camera is calibrated in relation to the fixed-camera image using interpolated look-up tables for pan and tilt values. For the calibration of the fixed camera, an extension of the Tsai algorithm is proposed, based only on measurements of distances between calibration points; this procedure reduces the time needed to obtain the calibration set and improves calibration accuracy. An algorithm for calculating the PTZ values required to track a moving object with the PTZ camera is also presented. The performance of the proposed algorithms is evaluated using the measured data.
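The interpolated look-up-table idea can be sketched as bilinear interpolation over a coarse grid of fixed-camera pixel positions at which pan and tilt were measured during calibration. The grid spacing and angle values below are hypothetical, chosen only to make the sketch self-contained; the paper's actual tables and interpolation scheme may differ:

```python
import numpy as np

# Hypothetical calibration grid: pan/tilt angles (degrees) recorded at a
# coarse grid of fixed-camera pixel coordinates (the look-up tables).
xs = np.array([0.0, 320.0, 640.0])                    # grid columns (x)
ys = np.array([0.0, 240.0, 480.0])                    # grid rows (y)
pan_lut = np.array([[-30.0, 0.0, 30.0]] * 3)          # pan varies with x
tilt_lut = np.array([[20.0] * 3, [0.0] * 3, [-20.0] * 3])  # tilt varies with y

def bilerp(lut, xs, ys, x, y):
    # Bilinear interpolation of a look-up table at pixel (x, y).
    i = int(np.clip(np.searchsorted(xs, x) - 1, 0, len(xs) - 2))
    j = int(np.clip(np.searchsorted(ys, y) - 1, 0, len(ys) - 2))
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])
    ty = (y - ys[j]) / (ys[j + 1] - ys[j])
    f00, f10 = lut[j, i], lut[j, i + 1]
    f01, f11 = lut[j + 1, i], lut[j + 1, i + 1]
    return (1 - ty) * ((1 - tx) * f00 + tx * f10) + ty * ((1 - tx) * f01 + tx * f11)

# Pan/tilt needed to aim the PTZ camera at fixed-camera pixel (160, 120):
pan = bilerp(pan_lut, xs, ys, 160.0, 120.0)
tilt = bilerp(tilt_lut, xs, ys, 160.0, 120.0)
```

A denser grid trades calibration effort for aiming accuracy; the interpolation step itself is cheap enough to run per frame while tracking.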