Regular Articles
Volume: 64 | Article ID: jist0959
Limitations of CNNs for Approximating the Ideal Observer Despite Quantity of Training Data or Depth of Network
DOI: 10.2352/J.ImagingSci.Technol.2020.64.6.060408 | Published Online: November 2020
Abstract

The performance of a convolutional neural network (CNN) on an image texture detection task is investigated as a function of linear image processing and the number of training images. Performance is quantified by the area under the receiver operating characteristic (ROC) curve (AUC). The Ideal Observer (IO) maximizes AUC but depends on high-dimensional image likelihoods. In many cases, CNN performance can approximate IO performance. This work demonstrates counterexamples in which a full-rank linear transform degrades CNN performance below the IO, even in the limit of large quantities of training data and many network layers. A subsequent linear transform changes the images' correlation structure and improves the AUC, again demonstrating the CNN's dependence on linear processing. Compression strictly decreases or maintains IO detection performance, whereas it can increase CNN performance, especially for small quantities of training data. Results indicate an optimal compression ratio for the CNN that depends on task difficulty, compression method, and the number of training images.
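As a rough illustration of the figure of merit discussed above, the sketch below is not the paper's texture model: the covariance, signal profile, and amplitude are assumed purely for demonstration. It computes the AUC of the Ideal Observer test statistic for a simple signal-known-exactly detection task in correlated Gaussian noise, where the IO reduces to a prewhitened matched filter.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Illustrative sketch (assumed toy model, not the paper's setup):
# signal-known-exactly detection in correlated Gaussian noise.
rng = np.random.default_rng(0)
n_pix, n_img = 64, 5000                      # vectorized image length, images per class

# Assumed covariance: exponential correlation between neighboring pixels.
idx = np.arange(n_pix)
cov = 0.9 ** np.abs(idx[:, None] - idx[None, :])
L = np.linalg.cholesky(cov)

# Assumed signal: a small Gaussian bump added to the noise background.
signal = 0.3 * np.exp(-0.5 * ((idx - n_pix / 2) / 5.0) ** 2)

noise_only  = (L @ rng.standard_normal((n_pix, n_img))).T
signal_plus = (L @ rng.standard_normal((n_pix, n_img))).T + signal

# IO test statistic for this Gaussian model: t(g) = s^T K^{-1} g
# (a prewhitened matched filter).
w = np.linalg.solve(cov, signal)
scores = np.concatenate([noise_only @ w, signal_plus @ w])
labels = np.concatenate([np.zeros(n_img), np.ones(n_img)])

print(f"IO AUC on this toy task: {roc_auc_score(labels, scores):.3f}")
```

In this simplified Gaussian setting the IO is available in closed form; the paper's point is that for realistic texture models it is not, and a trained CNN is used as a surrogate whose AUC can then be compared against the IO bound.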

  Cite this article 

Khalid Omer, Luca Caucci, Meredith Kupinski, "Limitations of CNNs for Approximating the Ideal Observer Despite Quantity of Training Data or Depth of Network," in Journal of Imaging Science and Technology, 2020, pp. 060408-1 - 060408-11, https://doi.org/10.2352/J.ImagingSci.Technol.2020.64.6.060408

  Copyright statement 
Copyright © Society for Imaging Science and Technology 2020
  Article timeline 
  • Received: July 2020
  • Accepted: November 2020
  • Published: November 2020
