Heritage science can seem opaque from the outside, reinforced by the perception that science is only for "scientists". The relatively recent adoption of the term "heritage science" is better known in Europe, but there is still some confusion about what it encompasses. Heritage science refers to the multidisciplinary fields that contribute to the discovery, security, and preservation of a diverse range of cultural heritage materials. Many heritage collection items are complex composites with convoluted preservation needs and degradation pathways. One area that remains less well understood and appreciated is the capacity of heritage science to add new layers of knowledge to collection items, and to enable their re-interpretation in light of this new information. Further ways in which heritage science supports the humanities include confirming provenance, linking and reconnecting separated collections, and applying new technology to provide levels of security, support that is greatly needed given the current extent of heritage trafficking.
The usability and accessibility of digitised archival data can be improved using deep learning solutions. In this paper, the authors present their work on developing a named entity recognition (NER) model for digitised archival data, specifically state authority documents. The entities for the model were chosen based on surveying different user groups. In addition to common entities, two new entities were created to identify businesses (FIBC) and archival documents (JON). The NER model was trained by fine-tuning an existing Finnish BERT model. The training data also included modern born-digital texts to achieve good performance across various types of input. The finished model performs fairly well on OCR-processed data, achieving an overall F1 score of 0.868, and particularly well on the new entities (F1 scores of 0.89 and 0.97 for JON and FIBC, respectively).
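As a rough illustration of this kind of fine-tuning, the sketch below trains a token-classification head on a Finnish BERT checkpoint with a BIO label set that includes the new FIBC and JON entities. The checkpoint name, label list, and toy training example are assumptions for illustration only, not the authors' actual configuration or data; in practice the training set would contain token-aligned OCR output and born-digital documents, and evaluation would report per-entity F1 scores.

    # Minimal sketch, not the authors' code: fine-tuning a Finnish BERT checkpoint
    # for NER with custom entity types such as FIBC and JON.
    from datasets import Dataset
    from transformers import (AutoTokenizer, AutoModelForTokenClassification,
                              Trainer, TrainingArguments)

    labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG",
              "B-FIBC", "I-FIBC", "B-JON", "I-JON"]
    checkpoint = "TurkuNLP/bert-base-finnish-cased-v1"   # assumed base model

    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForTokenClassification.from_pretrained(
        checkpoint,
        num_labels=len(labels),
        id2label=dict(enumerate(labels)),
        label2id={l: i for i, l in enumerate(labels)},
    )

    # Toy BIO-tagged example standing in for the OCR and born-digital training corpus.
    words = [["Kansallisarkisto", "julkaisi", "asiakirjan", "."]]
    tags  = [[3, 0, 7, 0]]   # B-ORG, O, B-JON, O

    def encode(batch):
        # Tokenize pre-split words and propagate word-level tags to subword tokens.
        enc = tokenizer(batch["words"], is_split_into_words=True,
                        truncation=True, padding="max_length", max_length=32)
        enc["labels"] = [
            [-100 if w is None else batch["tags"][i][w]
             for w in enc.word_ids(batch_index=i)]
            for i in range(len(batch["words"]))
        ]
        return enc

    train = Dataset.from_dict({"words": words, "tags": tags}).map(encode, batched=True)

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="ner-archival", num_train_epochs=1,
                               per_device_train_batch_size=1),
        train_dataset=train,
    )
    trainer.train()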
The aim of this work is to provide the cultural heritage community with a comprehensive hyperspectral image database of handwritten laboratory samples, including various writing inks commonly found in historical documents. The database contains 195 samples acquired in the VNIR (400-1000 nm) and SWIR (900-1700 nm) spectral ranges, along with complete information about the ink recipes (the components and concentrations used for each ink and mixture) and their corresponding ground-truth images. The database is now publicly available as part of a larger database associated with the Hyperdoc project and can be used for a variety of tasks. We present one example here: the classification of iron gall versus non-iron gall inks.
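As one possible way to approach such a task, the sketch below trains a per-pixel spectral classifier that separates iron gall from non-iron gall ink pixels using the ground-truth mask for labels. The file names, array layout, and the choice of an SVM are illustrative assumptions, not the Hyperdoc processing pipeline.

    # Illustrative sketch only: per-pixel classification of iron gall vs non-iron gall ink.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    cube = np.load("sample_vnir_cube.npy")      # hypothetical file, shape (H, W, bands)
    mask = np.load("sample_ground_truth.npy")   # hypothetical file: 0 = background,
                                                # 1 = iron gall ink, 2 = non-iron gall ink

    ink = mask > 0
    X = cube[ink]                        # one reflectance spectrum per ink pixel
    y = (mask[ink] == 1).astype(int)     # 1 = iron gall, 0 = non-iron gall

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X_train, y_train)
    print("pixel-level accuracy:", clf.score(X_test, y_test))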
Displaying the past appearance of artworks by reversing degradation phenomena holds significant value for art historians, conservators, museum curators, educators, and the wider public, as it seeks to approximate the artist's original intention. In this work, we aim to restore the past colors of a painting from documentary records made on reversal film photographs. The challenge with these photographs is that, owing to film-specific chromogenic processes, their colors are inaccurate with respect to the captured object. For this reason, we test the performance of four color correction methods in compensating for the color distortions inherent to each film type, using a dataset of reversal films of two color targets, the X-Rite ColorChecker Digital SG and the Coloraid IT-8. Furthermore, we apply the same method to detect changes due to aging and/or conservation treatments in the painting Junger Proletarier (1919) by Paul Klee, by comparing a color-corrected film record from 1995 with a more recent digital capture of the painting from 2005. Our results indicate that the method which best accounts for the film's chromogenic processes, and thus best reveals the actual colors of the photographed object, is based on non-linear optimization using a neural network.
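For readers interested in the general approach, the sketch below fits a small neural network that maps RGB values digitized from film to the known reference values of the color-target patches, then applies the learned mapping to a full scan. The file names, patch counts, and network size are placeholders; the paper's actual optimization setup may differ.

    # Minimal sketch of neural-network-based color correction (placeholder data).
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    film_patches = np.load("film_patch_rgb.npy")             # hypothetical, shape (N, 3): patch colors as recorded on film
    reference_patches = np.load("target_patch_values.npy")   # hypothetical, shape (N, 3): measured reference values

    net = MLPRegressor(hidden_layer_sizes=(32, 32), activation="tanh",
                       max_iter=5000, random_state=0)
    net.fit(film_patches, reference_patches)    # learn the film -> reference mapping

    film_scan = np.load("film_scan_rgb.npy")    # hypothetical, shape (H, W, 3)
    h, w, _ = film_scan.shape
    corrected = net.predict(film_scan.reshape(-1, 3)).reshape(h, w, 3)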
Since the advent of the Digital Intermediate (DI) and the Cineon system, motion picture film preservation and restoration practices have undergone an enormous change, driven by the possibility of digitizing and digitally restoring film materials. Today, film materials are scanned mostly with commercial film scanners, which process the frames into the Academy Color Encoding Specification (ACES) and apply proprietary LUTs for negative-to-positive conversion, image enhancement, and color correction. The processing performed by scanner systems is not always openly documented. The variety of digitization hardware and software can lead to different approaches and workflows in motion picture film preservation and restoration, resulting in inconsistency among archives and laboratories. This work presents an overview of the main approaches and systems used to digitize and encode motion picture film frames, in order to explain the potential and limits of these systems.
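As a small illustration of one step such systems perform, the sketch below applies a per-channel 1D LUT to a scanned frame to produce a positive image. The LUT here is a simple inversion placeholder, not any vendor's actual curve or an ACES transform.

    # Illustrative sketch: applying a per-channel 1D LUT for negative-to-positive conversion.
    import numpy as np

    def apply_1d_lut(frame, lut):
        """frame: float image in [0, 1], shape (H, W, 3); lut: shape (N, 3)."""
        x = np.linspace(0.0, 1.0, lut.shape[0])
        out = np.empty_like(frame)
        for c in range(3):
            out[..., c] = np.interp(frame[..., c], x, lut[:, c])
        return out

    # Placeholder LUT: invert each channel (a crude negative-to-positive curve).
    lut = np.linspace(1.0, 0.0, 1024)[:, None].repeat(3, axis=1)
    scan = np.random.rand(1080, 1920, 3)      # stand-in for a scanned negative frame
    positive = apply_1d_lut(scan, lut)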
Multispectral imaging has become an indispensable tool for analyzing, documenting, and visualizing cultural heritage materials and objects. This study applies the technique to an 18th-century illuminated manuscript held at the Minas Gerais Public Archive, Brazil. The manuscript, currently undergoing restoration at the Federal University of Minas Gerais, exhibits writing faded by moisture, and multispectral imaging (UV, visible, and IR) at seven different wavelengths proves highly effective in recovering the lost information. Illumination was provided by LED and halogen lamps, and after digital processing with ImageJ and principal component analysis (PCA), the text on the manuscript is clearly legible. The Minas Gerais Public Archive, established in 1895, plays a vital role in safeguarding the state's documentary and historical heritage. This was the first time multispectral imaging has been applied to cultural objects in Minas Gerais.
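To give a sense of the processing involved, the sketch below runs principal component analysis over a registered stack of multispectral band images; individual components can then be inspected for improved legibility of faded writing. The stack shape and file name are placeholders, not the archive's actual data or the ImageJ workflow used in the study.

    # Minimal PCA sketch for a multispectral image stack (placeholder data).
    import numpy as np
    from sklearn.decomposition import PCA

    stack = np.load("manuscript_bands.npy")      # hypothetical, shape (bands, H, W), co-registered bands
    bands, h, w = stack.shape

    pixels = stack.reshape(bands, -1).T          # one spectral vector per pixel
    pca = PCA(n_components=bands)
    components = pca.fit_transform(pixels)       # shape (H*W, bands)

    # Reshape each component back to image form for visual inspection.
    component_images = components.T.reshape(bands, h, w)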
This paper will present an overview of a project to digitize the Library of Congress Hebrew Manuscripts collection, which spanned from 2021 through spring 2023. It will describe the historical/cultural importance and breadth of the collection, as well as the workflow and processes used to digitize and display the manuscripts.
The United States National Archives and Records Administration (NARA) has issued new regulations that establish standards for the digitization of US government records. The regulations are part of an effort to transition to a fully electronic government, and they grant US federal agencies the authority to digitize source records, destroy the originals, and treat the electronic version as the recordkeeping copy. The specifications draw upon established international digitization standards such as ISO 19264, Metamorfoze, and the FADGI guidelines. By adopting the image quality specifications found in ISO 19264 and the image analysis method described by FADGI, NARA has effectively defined the minimum requirements for a digital surrogate to serve the same legal and evidentiary purpose as the source record. This paper presents the records management context of digitization and discusses the quality management, documentation, image and metadata specifications, and validation requirements.
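As a generic illustration of automated validation against aim values, in the spirit of the ISO 19264/FADGI-style analysis the regulations reference, the sketch below checks a set of measured image-quality metrics against tolerance limits. The metric names and figures are illustrative placeholders, not NARA's actual requirements.

    # Generic sketch: checking measured image-quality metrics against (assumed) aim tolerances.
    measured = {"mean_delta_e": 3.1, "sampling_efficiency": 92.0, "white_balance_error": 2.4}
    tolerances = {"mean_delta_e": ("<=", 4.0),          # placeholder limits, not NARA's
                  "sampling_efficiency": (">=", 90.0),
                  "white_balance_error": ("<=", 3.0)}

    def validate(measured, tolerances):
        results = {}
        for metric, (op, limit) in tolerances.items():
            value = measured[metric]
            results[metric] = value <= limit if op == "<=" else value >= limit
        return results

    print(validate(measured, tolerances))   # True where the metric is within its tolerance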
This paper will present the story of a collaborative project between the Imaging Department and the Paintings Conservation Department of the Metropolitan Museum of Art to use 3D imaging technology to restore missing and broken elements of an intricately carved giltwood frame from the late 18th century.