Regular Article
Volume: 70 | Article ID: 020501
Concrete Crack Image Segmentation based on UC-Net
DOI: 10.2352/J.ImagingSci.Technol.2026.70.2.020501 | Published Online: March 2026
Abstract

The maintenance of critical transportation infrastructure such as roads, tunnels, and bridges depends heavily on timely and accurate detection of structural damage. Among various types of surface defects, concrete cracks are the most common and dangerous. However, traditional manual inspection methods are inefficient, labor-intensive, and pose safety risks. In this paper, the authors propose UC-Net, a novel semantic segmentation network specifically designed for concrete crack detection. UC-Net builds upon the U-Net architecture and introduces a Coordinate Attention mechanism into the skip connections, enabling the model to better capture long, narrow, and spatially scattered crack features while suppressing irrelevant background noise. This design addresses two key challenges: the small spatial proportion of cracks in high-resolution images and the interference from complex textures and lighting conditions. To validate the effectiveness of this approach, the authors conducted extensive experiments on a publicly available crack dataset and on real-world images. Compared with existing CNN- and attention-based networks, UC-Net achieves superior performance, with improvements in IoU (up to 91.78%) and accuracy (89.33%). The results confirm that UC-Net provides a lightweight yet accurate solution for fine-grained crack segmentation, and it can serve as a practical tool in real-world infrastructure monitoring scenarios.
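The abstract does not give implementation details, but the general Coordinate Attention mechanism it references (Hou et al., CVPR 2021) can be sketched to show why it suits long, narrow cracks: the block pools the feature map separately along height and width, so a thin vertical or horizontal crack produces a strong response in one of the two direction-aware descriptors. Below is a minimal NumPy sketch of that idea; the function name, random placeholder weights (standing in for learned 1x1 convolutions), and reduction ratio are all illustrative assumptions, not the paper's actual code.

```python
import numpy as np

def coordinate_attention(x, reduction=8, rng=None):
    """Illustrative sketch of a Coordinate Attention block applied to a
    feature map x of shape (C, H, W). Random matrices stand in for the
    learned 1x1 convolutions of the real module."""
    C, H, W = x.shape
    rng = rng or np.random.default_rng(0)
    Cr = max(C // reduction, 1)

    # Direction-aware pooling: average over width and over height.
    pool_h = x.mean(axis=2)            # (C, H) -- one descriptor per row
    pool_w = x.mean(axis=1)            # (C, W) -- one descriptor per column

    # Shared transform (placeholder for a 1x1 conv) on the concatenation.
    w1 = rng.standard_normal((Cr, C)) * 0.1
    f = np.concatenate([pool_h, pool_w], axis=1)   # (C, H + W)
    f = np.maximum(w1 @ f, 0.0)                    # ReLU, (Cr, H + W)

    # Split back into per-axis branches and gate with a sigmoid.
    f_h, f_w = f[:, :H], f[:, H:]
    w_h = rng.standard_normal((C, Cr)) * 0.1
    w_w = rng.standard_normal((C, Cr)) * 0.1
    a_h = 1.0 / (1.0 + np.exp(-(w_h @ f_h)))       # (C, H) attention
    a_w = 1.0 / (1.0 + np.exp(-(w_w @ f_w)))       # (C, W) attention

    # Reweight the input along both spatial axes.
    return x * a_h[:, :, None] * a_w[:, None, :]

# Shape is preserved, so the block can sit inside a skip connection.
features = np.random.default_rng(1).standard_normal((16, 8, 8))
out = coordinate_attention(features)
```

Because the output has the same shape as the input, such a block can be dropped into a U-Net skip connection without altering the encoder or decoder, which is consistent with the lightweight design the abstract describes.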

  Cite this article 

Yuhao Zhang, Beibei Li, "Concrete Crack Image Segmentation based on UC-Net," Journal of Imaging Science and Technology, 2026, pp. 1-8, https://doi.org/10.2352/J.ImagingSci.Technol.2026.70.2.020501

  Copyright statement 
Copyright © Society for Imaging Science and Technology 2026
  Article timeline 
  • Received: February 2025
  • Accepted: June 2025
  • Published: March 2026
