This research explored the potential of ink-jet printing to replicate the coloration and finishing techniques of traditional denim fabric and standardized the reproduction and evaluation procedure. Although denim fabric is widely consumed and very popular, its manufacturing and finishing processes are energy and water intensive and generate environmental hazards and wastewater pollution, particularly at the finishing stage. Textile ink-jet printing has the potential to replicate some of the coloration and finishing techniques of traditional denim fabric without these negative environmental impacts. A two-phase research project was conducted. In Phase I, an optimal standard production workflow for digital denim reproduction (including color and finishing effects) was established, and six different denim samples were reproduced based on the workflow. In Phase II, an expert visual assessment protocol was developed to evaluate the acceptance of the replicated digital denim, and twelve experts from the ink-jet printing, color science, and denim industries completed the assessment.
ICC.2:2017 is a revision of the next-generation colour management specification iccMAX that introduces new support for colour appearance processing. iccMAX includes a built-in colour appearance model, IccCAM, together with a rich programming environment and support for spectral data, material channel connections, BRDFs and processing elements, which together make it possible to functionally encode any appearance model. ICC.2:2017 also introduces many new capabilities, including the ability to provide environment variables, which allow parameters such as image statistics or viewing conditions to be passed to the transform at run-time. As a result, ICC.2:2017 supports a wide range of colour appearance computations within the colour management workflow.
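To illustrate what passing viewing conditions at run-time can mean in practice, the Python sketch below (a generic illustration, not the iccMAX/IccCAM encoding itself) uses the CIECAM02 degree-of-adaptation formula to turn surround and adapting-luminance parameters into a blend factor for a von Kries-style chromatic adaptation:

```python
import numpy as np

# CAT02 matrix used by CIECAM02-style models to convert XYZ to sharpened cone responses.
CAT02 = np.array([[ 0.7328, 0.4296, -0.1624],
                  [-0.7036, 1.6975,  0.0061],
                  [ 0.0030, 0.0136,  0.9834]])

def degree_of_adaptation(F, L_A):
    """CIECAM02 degree-of-adaptation factor from the surround factor F and the
    adapting luminance L_A -- exactly the kind of viewing-condition parameter an
    environment variable could supply to the transform at run-time."""
    return F * (1.0 - (1.0 / 3.6) * np.exp((-L_A - 42.0) / 92.0))

def adapt_xyz(xyz, white_src, white_dst, F=1.0, L_A=64.0):
    """Von Kries-style chromatic adaptation of an XYZ colour, partially
    discounted according to the viewing conditions (illustrative sketch)."""
    D = degree_of_adaptation(F, L_A)
    rgb, rgb_ws, rgb_wd = (CAT02 @ np.asarray(v, float) for v in (xyz, white_src, white_dst))
    gain = D * (rgb_wd / rgb_ws) + (1.0 - D)   # blend full and no adaptation
    return np.linalg.solve(CAT02, gain * rgb)

# Example: adapt a mid-grey from a D50 white to a D65 white under an average surround.
print(adapt_xyz([48.2, 50.0, 41.2],
                white_src=[96.42, 100.0, 82.49],
                white_dst=[95.047, 100.0, 108.883]))
```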
Although a color aperiodic clustered-dot halftoning (NPACMS-MP-CLU-DBS) algorithm can overcome the visible moiré and rosette artifacts of conventional color halftoning methods, it still has some disadvantages, such as the color mismatch caused by the initial-stage color management method and texture artifacts caused by the concentric-ring cluster structure. In this paper, first, a new image-dependent color gamut mapping method is used during the color management process; it makes the most of the printer color gamut in order to reduce the color mismatch between the continuous-tone original and the printed halftone image. Second, a new color clustered-DBS halftoning algorithm with a separated-cluster structure is developed. As a color halftoning method based on the clustered-DBS algorithm, it not only overcomes the visible moiré and rosette artifacts, but its separated-cluster structure is also more stable than the concentric-ring cluster structure and reduces texture artifacts significantly.
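For context, the core of a DBS-style halftoner can be sketched in a few lines of Python; this is a plain grayscale illustration under assumed parameters, not the paper's color, clustered, separated-cluster algorithm:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dbs_halftone(cont, iterations=3, sigma=1.3, seed=0):
    """Minimal grayscale direct-binary-search (DBS) sketch: start from a random
    binary pattern and accept a single-pixel toggle whenever it lowers the error
    between the HVS-filtered continuous-tone image and the HVS-filtered halftone.
    Brute force for clarity; real DBS updates the filtered error incrementally."""
    cont = np.asarray(cont, dtype=float)
    rng = np.random.default_rng(seed)
    half = (rng.random(cont.shape) < cont).astype(float)   # initial binary pattern
    blur = lambda img: gaussian_filter(img, sigma)          # crude stand-in for an HVS filter
    target = blur(cont)
    for _ in range(iterations):
        changed = False
        for idx in np.ndindex(cont.shape):
            err_now = np.sum((blur(half) - target) ** 2)
            half[idx] = 1.0 - half[idx]                     # trial toggle
            if np.sum((blur(half) - target) ** 2) < err_now:
                changed = True                              # keep the improvement
            else:
                half[idx] = 1.0 - half[idx]                 # revert
        if not changed:
            break
    return half

# Tiny usage example on an 8x8 gradient.
print(dbs_halftone(np.tile(np.linspace(0.1, 0.9, 8), (8, 1))).astype(int))
```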
Color is an important aspect of camera quality. Above all in a visual effects (VFX) pipeline it is necessary to maintain a linear relationship between the pixel color in the recorded image and the original light of the scene throughout every step of the production pipeline. This means that the plate recorded by the camera must not be altered in any way ("do no harm to the plate"). Unfortunately, most camera vendors apply certain functions to the recorded RAW material during the input step, mostly to meet the needs of the display devices at the end of the pipeline. They also add functions to establish a certain look that the camera company is associated with. Maintaining a linear relationship to the light of the scene enables compositing artists and editors to combine imagery from varying sources (mostly cameras of different vendors). The Academy of Motion Picture Arts and Sciences (AMPAS) established the Academy Color Encoding System (ACES). To achieve a linear relationship to the light of the scene, Input Device Transforms (IDTs) for most digital film cameras have been provided recently. Unfortunately, such IDTs are not available for most consumer and DSLR cameras; to add their imagery to the film production pipeline it is desirable to create suitable IDTs for such devices as well. The goal of this paper is to record the spectral distribution of a GretagMacbeth ColorChecker using a spectrometer and also to photograph it with a Canon EOS 5D Mark III camera under the same lighting conditions. The RAW image is then converted to ACES color spaces (ACES2065-1 or ACEScg) using industry-approved RAW converters. The positions of the ColorChecker patches in CIE Yxy color space are then compared to the positions of the patches captured by the spectral device. As a result, an indication is obtained of whether the camera can be used inside the AMPAS ACES workflow.
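As an illustration of the comparison step, the Python sketch below (an assumed helper, not code from the paper) converts a linear ACES2065-1 patch value to CIE xy chromaticity using the commonly published AP0 RGB-to-XYZ matrix, so that camera-derived and spectrometer-derived patch positions can be plotted in the same diagram:

```python
import numpy as np

# ACES2065-1 (AP0) linear RGB -> CIE XYZ matrix (D60 white), as commonly published
# in the ACES documentation.
AP0_TO_XYZ = np.array([[0.9525523959, 0.0000000000,  0.0000936786],
                       [0.3439664498, 0.7281660966, -0.0721325464],
                       [0.0000000000, 0.0000000000,  1.0088251844]])

def aces_to_xy(rgb_aces):
    """Convert a linear ACES2065-1 RGB patch value to CIE xy chromaticity."""
    X, Y, Z = AP0_TO_XYZ @ np.asarray(rgb_aces, dtype=float)
    s = X + Y + Z
    return X / s, Y / s

# Example: an assumed neutral patch value read from the converted RAW file;
# equal ACES RGB should land near the D60 white point (about x=0.3217, y=0.3377).
print(aces_to_xy([0.18, 0.18, 0.18]))
```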
RED film cameras are important for professional film productions. Therefore, the Academy of Motion Picture Arts and Sciences (AMPAS) included, among others, several conversions from RED color spaces in the Input Device Transforms (IDTs) of ACES 1.0.3. These transforms are intended to establish a linear relationship between the recorded pixel color and the original light of the scene. To achieve this goal, the IDTs are applied to the different RED color spaces provided by the camera manufacturer, which results in a linearization of the recorded image colors. Following this concept, the conversion should render comparable results for each color space, with an acceptable deviation from the original light of the scene. In color science, color space conversions require documentation of the three main components of the individual color spaces: primaries, white point, and transfer function. For the RED color spaces in question these parameters are only available for the REDWideGamutRGB color space, whereas the other color spaces are not documented. For this reason the behavior of the color conversion can only be tested, not calculated. The goal of this paper is to compare the results of the color conversions of the main RED color spaces: REDWideGamutRGB, REDcolor4, and REDdragoncolor2. Additionally, a conversion to Rec. 2020 (ITU-R BT.2020), the color space important for television use, is included. The test setup contains a GretagMacbeth ColorChecker, which is recorded by a RED Scarlet-X camera and a mobile spectrometer (RGB Photonics Qmini); the latter captures the spectral distribution of the individual ColorChecker patches under the same lighting conditions. To compare the results, the recordings of both devices were converted to the CIE xy chromaticity diagram using The Foundry Nuke11X. Additionally, a comparison to reference data provided by the ACES document TB-2014-004 is included. Finally, it is reviewed whether the ACES IDT concept works for the RED Scarlet-X camera.
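The requirement for primaries, white point, and transfer function can be made concrete with a short Python sketch: given documented chromaticities, the linear RGB-to-XYZ matrix can be derived directly. The values shown are for Rec. 2020, which are public; the undocumented RED spaces cannot be reconstructed this way, which is exactly why the paper can only test those conversions empirically. The sketch is illustrative and not taken from the paper.

```python
import numpy as np

def rgb_to_xyz_matrix(prim_xy, white_xy):
    """Derive a linear RGB -> CIE XYZ matrix from the chromaticities of the three
    primaries and the white point (the 'primaries + white point' part of a
    colour-space definition; the transfer function is handled separately)."""
    def xy_to_XYZ(x, y, Y=1.0):
        return np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])
    P = np.column_stack([xy_to_XYZ(x, y) for x, y in prim_xy])  # unscaled primaries
    S = np.linalg.solve(P, xy_to_XYZ(*white_xy))                # scale so R=G=B=1 maps to white
    return P * S

# Documented Rec. 2020 (ITU-R BT.2020) primaries and D65 white point.
REC2020 = rgb_to_xyz_matrix([(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
                            (0.3127, 0.3290))
print(REC2020)
```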
Colour gamuts can be described as a list of vertices and a list of triangular faces connecting these vertices. This method of encoding a colour gamut is convenient for both gamut mapping and gamut volume calculation. However, particularly where the vertices describe a non-convex surface, as in most print processes, it can be difficult to obtain a face list that produces a connected and non-overlapping surface. Methods for obtaining a face list from characterization data were evaluated using data from a wide range of printing processes, and it was found that defining a mesh and corresponding triangulation in CMYK space gave consistent results across all the data sets.
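For the gamut volume calculation mentioned above, a connected, consistently oriented face list is exactly what makes a standard signed-tetrahedron sum applicable; the Python sketch below illustrates that calculation (illustrative only, not the authors' evaluation code):

```python
import numpy as np

def gamut_volume(vertices, faces):
    """Volume enclosed by a closed, consistently oriented triangle mesh
    (e.g. gamut-surface vertices in CIELAB with a face list): sum of the signed
    tetrahedra formed by each face and the origin, divided by six."""
    V = np.asarray(vertices, dtype=float)
    v0, v1, v2 = (V[np.asarray(faces)[:, i]] for i in range(3))
    signed = np.einsum('ij,ij->i', v0, np.cross(v1, v2))
    return abs(signed.sum()) / 6.0

# Example: a unit cube triangulated into 12 outward-facing triangles gives volume 1.0.
verts = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
faces = [(0,1,3),(0,3,2),(4,6,7),(4,7,5),(0,4,5),(0,5,1),
         (2,3,7),(2,7,6),(0,2,6),(0,6,4),(1,5,7),(1,7,3)]
print(gamut_volume(verts, faces))  # ~1.0
```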
Accurate characterization (profiling) of a capture system is essential if the system is to reproduce the colors in a scene accurately. ISO 17321 [1][2] describes two methods for achieving this calibration: one based on standard reflective targets (the chart-based method) and the other based on accurate measurements of the camera's responsivity functions and the spectral power distribution of the deployed illuminant (spectral characterization). The more prominent of the two is the chart-based method, because it involves a simple capture of an inexpensive, standard color chart (e.g., the Macbeth/X-Rite ColorChecker). However, the results obtained from this method are illuminant specific and are very sensitive to the technique used in the capture process: lighting non-uniformity on the chart, incorrect framing, and flare can all erroneously affect the results. ISO also recommends a more robust technique, involving measurement of the camera's responsivity and the spectral power distribution of the capture illuminant. Measuring these can require expensive and sophisticated instruments such as monochromators and spectroradiometers. Both methods involve tradeoffs in cost, ease of use, and, most importantly, the accuracy of the final capture system characterization, and the results are very sensitive to the capture technique and the precision with which the various parameters are measured. The end-user is often left confused, asking questions such as: What accuracy is needed in the individual measurements? What are the tradeoffs (particularly in color accuracy) of the chart-based method versus the spectral characterization method? How sensitive is the system to the various parameters? In this study, both ISO-recommended techniques were used to calibrate a broad range of professional cameras under a range of illuminants. The characterization was conducted by approximately ten different users so as to capture the variability of the deployed capture technique. The collected data were used to calculate and quantify the characterization accuracy of the system, using the color inconstancy index for a set of evaluation colors as the metric. Sensitivity analysis techniques were used to attempt to answer the question "How much of an advantage, if any, does the spectral characterization of the camera offer over the chart-based approach?" In answering this question, the parameters (and their sensitivities) that most influence the results were identified.
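To make the chart-based method concrete, a minimal least-squares fit is sketched below in Python (an illustration under assumed linear camera data, not the exact procedure of ISO 17321 or of this study):

```python
import numpy as np

def fit_camera_matrix(camera_rgb, target_xyz):
    """Chart-based characterization sketch: least-squares fit of a 3x3 matrix M
    such that XYZ ~= M @ RGB for the captured chart patches. The camera RGB is
    assumed linear (dark-frame subtracted, flare-corrected, exposure-normalized);
    the target XYZ are measured patch values under the capture illuminant."""
    RGB = np.asarray(camera_rgb, dtype=float)    # shape (n_patches, 3)
    XYZ = np.asarray(target_xyz, dtype=float)    # shape (n_patches, 3)
    M, *_ = np.linalg.lstsq(RGB, XYZ, rcond=None)
    return M.T                                   # so that xyz = M @ rgb

# Usage with hypothetical patch data: the residuals indicate how well a single
# linear matrix can characterize the camera for this chart and illuminant.
rgb = np.random.default_rng(0).random((24, 3))
xyz = rgb @ np.array([[0.6, 0.3, 0.1], [0.2, 0.7, 0.1], [0.0, 0.1, 0.9]]).T
M = fit_camera_matrix(rgb, xyz)
print(np.abs(xyz - rgb @ M.T).max())             # ~0 for this synthetic example
```

The spectral characterization route instead predicts the camera's response by integrating the measured responsivities against the illuminant power distribution and patch reflectances, trading chart-capture variability for measurement-instrument cost.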
Color is an important aspect of camera quality. Above all in a visual effects (VFX) pipeline it is necessary to maintain a linear relationship between the pixel color in the recorded image and the original light of the scene throughout every step of the production pipeline. This means that the plate recorded by the camera must not be altered in any way ("do no harm to the plate"). Unfortunately, most camera vendors apply certain functions to the recorded RAW material during the input step, mostly to meet the needs of the display devices at the end of the pipeline. They also add functions to establish a certain look that the camera company is associated with. Maintaining a linear relationship to the light of the scene enables compositing artists and editors to combine imagery from varying sources (mostly cameras of different vendors). If, for example, an action scene is filmed with an ARRI film camera to capture the performance of the principal actors, additional imagery is often derived from action cameras like the GoPro Hero. It is also often desirable to have a less expensive camera at hand that can be moved around easily to capture textures and imagery, for example to create clean plates. A critical aspect of the production workflow is that all the imagery from the different sources can be combined easily in editing and compositing without additional color correction. The goal of this paper is to calculate the positions of the patches of the GretagMacbeth color checker chart [1] (now X-Rite color chart) using an image recorded by the Blackmagic Production Camera and to compare them to reference data sets based on those provided by the manufacturer and on spectral data measured under the same lighting conditions. As a result, an indication is obtained of whether the camera can be used inside the AMPAS ACES workflow.
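One simple way to turn such a comparison into a numerical tendency (a hypothetical summary metric, not necessarily the one used in the paper) is the per-patch distance between camera-derived and reference chromaticities in the xy diagram:

```python
import numpy as np

def xy_deviation(camera_xy, reference_xy):
    """Per-patch Euclidean distance in the CIE xy chromaticity diagram between
    camera-derived and reference patch positions; returns the mean and the
    worst-case deviation. All patch values here are hypothetical."""
    d = np.linalg.norm(np.asarray(camera_xy, float) - np.asarray(reference_xy, float), axis=1)
    return d.mean(), d.max()

# Hypothetical example: two patches, with the camera slightly off the reference.
cam = [(0.322, 0.339), (0.410, 0.355)]
ref = [(0.3217, 0.3377), (0.4050, 0.3510)]
print(xy_deviation(cam, ref))   # (mean deviation, worst-case deviation)
```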