SACNN: Spatial Adversarial Convolutional Neural Network for Textile Defect Detection
Research and development
Authors:
- Hou Wei
  School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, P. R. China
- Tao Xian
  Research Center of Precision Sensing and Control, Institute of Automation, Chinese Academy of Sciences, Beijing, P. R. China
- Ma Wenzhi
  Research Center of Precision Sensing and Control, Institute of Automation, Chinese Academy of Sciences, Beijing, P. R. China
- Xu De
  School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, P. R. China
DOI: 10.5604/01.3001.0014.3808
Abstract: Constructing textile defect detection systems is significant for quality control in industrial production, but labelling a sufficient number of samples in detail is costly and laborious. This paper proposes a model called the ‘spatial adversarial convolutional neural network’, which addresses this problem using only image-level labels. It consists of two parts: feature extraction and feature competition. First, a series of convolutional blocks is used as the feature extractor. After feature extraction, a maximum greedy feature competition takes place among the features in the feature layer. This feature competition mechanism leads the network to converge on the defect location. To evaluate the mechanism, experiments were carried out on two datasets. As training time increases, the model spontaneously focuses on the actual defect location and is robust to unbalanced samples. The classification accuracy on both datasets exceeds 98%, which is comparable with methods that label samples in detail. Detection results show that defect localisation by the model is more compact and accurate than with the Grad-CAM method. The experiments show that our model has potential for defect detection in industrial environments.
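The ‘maximum greedy feature competition’ summarised in the abstract can be read as scoring each class channel of the final feature map by its single strongest spatial activation, so the winning position doubles as a weak localisation of the defect. A minimal sketch of that reading follows; it is not the authors' code, and the feature-map shape, function name, and toy input are illustrative assumptions:

```python
import numpy as np

def spatial_max_competition(feature_map):
    """Greedy spatial competition over a (classes, H, W) feature map.

    Each class channel is scored by its maximum spatial response; the
    argmax position of the winning channel acts as a coarse defect
    location, learned from image-level labels only.
    """
    c, h, w = feature_map.shape
    flat = feature_map.reshape(c, -1)
    scores = flat.max(axis=1)                 # per-class max response
    winner = int(scores.argmax())             # predicted class
    loc = tuple(int(i) for i in np.unravel_index(int(flat[winner].argmax()), (h, w)))
    return winner, scores, loc

# Toy feature map: 2 class channels over an 8x8 spatial grid,
# with a single strong "defect" response at position (5, 3).
fm = np.zeros((2, 8, 8))
fm[1, 5, 3] = 2.0
cls, scores, loc = spatial_max_competition(fm)  # cls = 1, loc = (5, 3)
```

Because only the strongest response per channel contributes to the class score, gradient signal concentrates on the most discriminative spatial position, which is consistent with the abstract's claim that the network converges toward the actual defect location without pixel-level labels.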
Tags:
textile defect detection, feature extraction, feature competition, CNN.
Citation:
Hou W, Tao X, Ma W, Xu D. SACNN: Spatial Adversarial Convolutional Neural Network for Textile Defect Detection. FIBRES & TEXTILES in Eastern Europe 2020; 28, 6(144): 127-133. DOI: 10.5604/01.3001.0014.3808
Published in issue no 6 (144) / 2020, pages 127–133.