ARCH2023 - Cultural Heritage Archiving Special Issue FastTrack
Volume: 68 | Issue: 4 | Article ID: 040407
Toward Solving a Puzzle of Fragmented Archeological Textiles
DOI: 10.2352/J.ImagingSci.Technol.2024.68.4.040407
Abstract

Archeological textiles can provide invaluable insight into the past. However, they are often highly fragmented, and a puzzle has to be solved to re-assemble the object and recover the original motifs. Unlike common jigsaw puzzles, archeological fragments are highly damaged, and no correct solution to the puzzle is known. Although automatic puzzle solving has fascinated computer scientists for a long time, this work is one of the first attempts to apply modern machine learning to archeological textile re-assembly. First and foremost, it is important to know which fragments belong to the same object. Therefore, features are extracted from digital images of textile fragments using color statistics, classical texture descriptors, and deep learning methods, and these features are used to cluster the fragments and identify similar ones. Four case studies of increasing complexity are discussed in this article, ranging from well-preserved textiles with available ground truth to the actual open problem of the Oseberg archeological tapestry, whose correct solution is unknown. This work reveals significant knowledge gaps in current machine learning, helping to outline a path toward more specialized, application-specific models.
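The pipeline summarized above, per-fragment feature extraction followed by clustering, can be sketched in a few lines. The snippet below is a minimal illustration only, assuming scikit-image and scikit-learn; the specific descriptor (uniform local binary patterns) and clustering algorithm (k-means) are stand-ins chosen for brevity, not necessarily the methods used in the paper.

```python
# Minimal sketch: per-fragment features (color statistics + a classical
# texture descriptor), then clustering to group candidate fragments of the
# same object. LBP and k-means are illustrative assumptions, not the
# authors' exact method.
import numpy as np
from skimage.color import rgb2gray
from skimage.feature import local_binary_pattern
from sklearn.cluster import KMeans

def fragment_features(image: np.ndarray) -> np.ndarray:
    """Concatenate simple color statistics with an LBP texture histogram."""
    # Color statistics: per-channel mean and standard deviation (RGB input).
    color_stats = np.concatenate([image.mean(axis=(0, 1)),
                                  image.std(axis=(0, 1))])
    # Classical texture descriptor: uniform local binary patterns,
    # summarized as a normalized histogram (P + 2 = 10 uniform codes).
    gray = rgb2gray(image)
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    return np.concatenate([color_stats, hist])

def cluster_fragments(images: list[np.ndarray], n_clusters: int) -> np.ndarray:
    """Assign each fragment a cluster label; similar fragments share a label."""
    features = np.stack([fragment_features(img) for img in images])
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(features)
```

Grouping fragments this way narrows the search space before any pairwise matching or re-assembly is attempted; deep features could be substituted for, or concatenated with, the hand-crafted descriptor in the same framework.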

Cite this article

Davit Gigilashvili, Casper Fabian Gulbrandsen, Ha Thu Nguyen, Margrethe Havgar, Marianne Vedeler, Jon Yngve Hardeberg, "Toward Solving a Puzzle of Fragmented Archeological Textiles," Journal of Imaging Science and Technology, vol. 68, no. 4, 2024, pp. 1-16, https://doi.org/10.2352/J.ImagingSci.Technol.2024.68.4.040407

Copyright statement
Copyright © Society for Imaging Science and Technology 2024
Article timeline
  • Received: October 2023
  • Accepted: May 2024
