With the release of the Apple iPhone 12 Pro in 2020, several features were introduced that make it attractive as a recording device for scene-referred computer graphics pipelines. The captured Apple RAW images have a much higher dynamic range than standard 8-bit images. Since a scene-referred workflow naturally has an extended dynamic range (HDR), the Apple RAW recordings integrate well. Another feature is Dolby Vision HDR recording, which is primarily adapted to the display of the source device. However, these recordings can also be used in the CG workflow, since at least the basic HLG transfer function is included. The iPhone 12 Pro's two laser scanners can produce complex 3D models and textures for the CG pipeline. On the back, there is a scanner primarily intended for capturing the surroundings for AR purposes; on the front, there is another scanner for facial recognition. In addition, external software can read out the scanning data for integration into 3D applications. To correctly integrate the iPhone 12 Pro Apple RAW data into a scene-referred workflow, two command-line-based software solutions can be used, among others: dcraw and rawtoaces. Dcraw offers the possibility to export RAW images directly to ACES2065-1. Unfortunately, the modifiers for the four RAW color channels needed to address different white points are unavailable. Experimental test series are performed under controlled studio conditions to retrieve these modifier values. Subsequently, these RAW-derived images are imported into the computer graphics pipelines of various CG applications (SideFX Houdini, The Foundry Nuke, Autodesk Maya) with the help of OpenColorIO (OCIO) and ACES. Finally, it is determined whether they can improve the overall color quality. Dolby Vision content can be captured using the native Camera app on an iPhone 12.
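Since the per-channel white-point modifiers are unavailable in dcraw's ACES export path, the modifier values have to be derived experimentally. As a rough illustration of how such modifiers act, the following Python sketch scales the four Bayer channels by hypothetical multipliers; the channel names and values are assumptions for illustration, not the measured results of the test series.

```python
# Hypothetical white-point multipliers for the four RAW Bayer channels
# (R, G1, B, G2). Real values must be derived experimentally, as done
# in the controlled studio test series.
WB_MULTIPLIERS = {"R": 2.2, "G1": 1.0, "B": 1.6, "G2": 1.0}

def apply_wb(raw, multipliers):
    """Scale linear sensor values (normalized to [0, 1]) per channel,
    clipping at 1.0 to mimic sensor saturation."""
    return {
        ch: [min(v * multipliers[ch], 1.0) for v in values]
        for ch, values in raw.items()
    }
```

In a real pipeline, such multipliers would be applied to the mosaiced sensor data before demosaicing and the conversion to ACES2065-1; the sketch only shows the per-channel scaling itself.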
The native Camera app captures HDR video using Dolby Vision Profile 8.4, which contains a cross-compatible HLG Rec.2020 base layer and Dolby Vision dynamic metadata. When exporting Dolby Vision iPhone video without the corresponding metadata, only the HLG base layer is passed on. It is investigated whether iPhone 12 videos transferred this way can increase the quality of the computer graphics pipeline. The 3D Scanner App software controls the two integrated laser scanners and provides a large number of export formats. Integrating the exported OBJ 3D data into industry-standard software like Maya and Houdini is therefore straightforward. Unfortunately, the generated models and their corresponding UV maps are essentially only machine-readable, so manually improving the 3D geometry (filling holes, refining the geometry, setting up a new topology) is cumbersome and time-consuming. It is investigated whether standard techniques such as the ZRemesher in ZBrush, texture and UV projection in Maya, and VEX snippets in Houdini can prepare these models and textures for manual editing.
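Recovering scene-linear light from the exported HLG base layer amounts to inverting the HLG OETF. A minimal Python sketch of the ITU-R BT.2100 inverse OETF on a normalized signal, with the display-side OOTF/system gamma omitted:

```python
import math

# ITU-R BT.2100 HLG constants
A = 0.17883277
B = 0.28466892
C = 0.55991073

def inverse_hlg_oetf(e_prime):
    """Map a non-linear HLG signal value E' in [0, 1] back to
    normalized scene-linear light E in [0, 1]."""
    if e_prime <= 0.5:
        return (e_prime ** 2) / 3.0
    return (math.exp((e_prime - C) / A) + B) / 12.0
```

Applied per channel to the Rec.2020 RGB signal, this yields normalized scene-linear values suitable for a scene-referred composite.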
Eberhard Hasche, Oliver Karaschewski, Reiner Creutzburg, "iPhone12 imagery in scene-referred computer graphics pipelines," in Electronic Imaging, 2023, pp. 350-1 to 350-14, https://doi.org/10.2352/EI.2023.35.3.MOBMU-350