A Multi-Resolution Method for Detecting Defects in Fabric Images

Abstract: This study proposes a novel technique for detecting defects in fabric images based on features extracted with a new multi-resolution analysis tool, the Digital Curvelet Transform. Direction features of the curvelet coefficients and texture features based on the GLCM of the curvelet coefficients form the feature sets for a k-nearest-neighbor classifier. The developed algorithms were validated on images from the TILDA Textile Texture Database, and a comparative study between the GLCM-based, wavelet-based and curvelet-based techniques is included. The high accuracy achieved by the proposed method suggests an efficient solution for fabric defect detection; furthermore, the algorithm is robust to white noise. To the best of our knowledge, this study is the first documented attempt to explore the possibilities of the Digital Curvelet Transform for the problem of fabric defect detection.


INTRODUCTION
Textile fabrics constitute a large proportion of the total cost of production in garment manufacturing. A garment with a textile defect usually sells at a massive discount of 45-65% (Kumar, 2003), so quality control of fabrics before garment manufacturing is essential to ensure the quality of finished products and to increase the efficiency of the manufacturing process. Currently, the quality inspection process for textile fabrics (Shady et al., 2006) is performed mainly by hand. However, the reliability of manual inspection is limited by boredom and inattentiveness; indeed, Sari-Sarraf and Goddard (1999) found that even the most highly trained inspectors detect only about 70% of fabric defects.
Numerous approaches have been proposed to detect defects in woven fabrics (Zhang and Bresee, 1995); they can be broadly categorized into three classes: statistical, spectral and model-based (Meihong et al., 2009). These include wavelet neural networks (Jianli and Baoqi, 2007), morphological filters (Mak et al., 2009), the Fourier Transform, Gabor filters (Mak and Peng, 2006, 2008) and the Wavelet Transform (WT) (Serdaroglu et al., 2006, 2007). Because the wavelet transform has optimal localization in both the spatial and spatial-frequency domains (Ngan et al., 2005), several fabric defect detection techniques based on wavelet and sub-band decompositions have been proposed in recent years. However, the usual wavelet transforms have wavelets with primarily vertical, primarily horizontal and primarily diagonal orientations, so they are inefficient at representing the curved edges of fabric defects, which limits the effectiveness of defect detection. Curvelets, as proposed by Candes and Donoho (2004), constitute a relatively new family of frames designed to represent edges and other singularities along curves much more efficiently than traditional wavelet-based transforms (Wong et al., 2009).
In this paper, a novel defect detection scheme is proposed to facilitate automated inspection of woven fabrics. The proposed scheme combines the Curvelet Transform (CT), Gray-Level Co-occurrence Matrices (GLCM), texture analysis and a k-nearest-neighbor classifier. This study is organized as follows: first, we use the direct multiplication of curvelet transform data at adjacent scales to distinguish important edges from noise and to remove noise from the signal.
Then the proposed algorithm using CT and GLCM for fabric defect detection is introduced and experimental results are presented. Finally, conclusions are drawn.
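The final stage of the scheme is a k-nearest-neighbor classifier over the extracted feature vectors. A minimal NumPy sketch of such a classifier follows; the toy feature values, labels and k = 3 are illustrative only, not the paper's data:

```python
import numpy as np

def knn_classify(train_X, train_y, query, k=3):
    """Classify a feature vector by majority vote among its k nearest
    training vectors under Euclidean distance."""
    dists = np.linalg.norm(train_X - query, axis=1)   # distance to each training sample
    nearest = np.argsort(dists)[:k]                   # indices of the k closest samples
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]                  # majority vote

# Toy example: "defect-free" = 0, "defect" = 1, two well-separated clusters.
train_X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
train_y = np.array([0, 0, 1, 1])
print(knn_classify(train_X, train_y, np.array([0.85, 0.85])))  # → 1
```

In practice the training vectors would be the 57-component feature vectors described later, computed from defect-free and defective reference images.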
Fabric defect image enhancement: Curvelet transforms are multi-resolution decompositions that can be used to analyze signals and images. They describe a signal by its power at each scale, direction and position, and edges can be located very effectively in the curvelet transform domain. Candes and Donoho (2004) proposed two Fast Discrete Curvelet Transforms (FDCT): the first is based on Unequally-Spaced Fast Fourier Transforms (USFFT); the other is based on the wrapping of specially selected Fourier samples (FDCT-WARPING). In this study we adopted FDCT-WARPING, as it is currently the fastest implementation of the curvelet transform. The curvelet transform of a function f is:

c(j, l, k) = ⟨f, φ_{j,l,k}⟩

where φ_{j,l,k} is the curvelet and j, l, k are the scale, direction and position parameters, respectively. For an input f(t₁, t₂), 0 ≤ t₁, t₂ < n, in a Cartesian coordinate system, the discrete form of the curvelet transform is:

C^D(j, l, k) = Σ_{0 ≤ t₁, t₂ < n} f[t₁, t₂] φ̄^D_{j,l,k}[t₁, t₂]

FDCT-WARPING is implemented in the frequency domain as follows. At each pair of scale and direction (j, l), the Fourier samples of the image are multiplied by the corresponding frequency window; n_{1,0} and n_{2,0} are the two initial positions of the window function on [n₁, n₂], and L_{1,j} and L_{2,j}, proportional to 2^j and 2^{j/2} respectively, are the length and width components of the window's support interval. The windowed data is wrapped around the origin and a 2-D inverse FFT then yields the discrete curvelet coefficient set C^D(j, l, k).
In this section, fabric images, which are grey-scale images of 256×256 pixels, are decomposed into five scales of curvelet coefficients using FDCT-WARPING (Candes and Donoho, 2004). The structure of these coefficients is shown in Table 1.
Figure 1 shows five fabric images containing different defects together with their curvelet coefficients at different scales. As the (b) panels of Fig. 1 show, there are strong orientations in the curvelet coefficient images: the white parts represent partial edges of the defects of the original image in different orientations, i.e., the significant curvelet coefficients. The low-frequency (coarse-scale) coefficients are stored at the center of the display, and the Cartesian concentric coronae show the coefficients at different scales, with the outer coronae corresponding to higher frequencies.
There are four strips associated with each corona, corresponding to the four cardinal points; these are further subdivided into angular panels. Each panel represents the coefficients at a specified scale and along the orientation suggested by the position of the panel.
Noise inevitably arises during the acquisition of fabric images; it obscures defect details and thus hampers accurate feature extraction and recognition. In this paper, we apply our noise filtering technique (Luo and Ni, 2010) before feature extraction.
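The filtering idea, as summarized in the introduction, multiplies curvelet data at adjacent scales: defect edges stay correlated across scales while noise decays quickly, so the product separates the two. A hedged NumPy sketch on generic co-located coefficient maps follows; the function name, toy values and threshold are illustrative, whereas the paper tunes its thresholds T_s and c per scale:

```python
import numpy as np

def correlation_mask(coarse, fine, threshold):
    """Keep fine-scale coefficients whose product with the co-located
    coarser-scale coefficient is large (edge-like); zero the rest (noise-like)."""
    product = np.abs(coarse * fine)            # adjacent-scale correlation
    return np.where(product >= threshold, fine, 0.0)

# Toy maps: the edge at index 1 persists across scales; the noise
# spike at index 3 appears only at the fine scale and is suppressed.
coarse = np.array([0.0, 5.0, 0.1, 0.2])
fine   = np.array([0.1, 4.0, 0.0, 3.0])
print(correlation_mask(coarse, fine, threshold=1.0))  # → [0. 4. 0. 0.]
```

Real curvelet subbands at adjacent scales have different sizes and would first need to be interpolated to a common grid before this element-wise product.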
Feature extraction: Haralick et al. (1973) first proposed the GLCM for texture description, and it remains popular today because of its good performance. The GLCM is a second-order statistical method that describes the spatial interrelationships of the grey tones in an image. Its elements count the number of pixel pairs separated by a certain distance along a given angular direction. Typically, the GLCM is calculated in a small window that scans the whole image, so that a texture feature is associated with each pixel.
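The pair-counting just described can be sketched in a few lines of Python; this is a minimal, illustrative implementation for a single displacement (the function name is ours, and window scanning and the derived Haralick features are omitted):

```python
import numpy as np

def glcm(img, dx, dy, levels=8):
    """Symmetric, non-normalized GLCM: counts pairs (i, j) of grey levels
    separated by the displacement (dx, dy), in both directions."""
    P = np.zeros((levels, levels), dtype=int)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                i, j = img[y, x], img[y2, x2]
                P[i, j] += 1   # forward count
                P[j, i] += 1   # backward count -> diagonal symmetry
    return P

img = np.array([[0, 1],
                [1, 1]])
P = glcm(img, dx=1, dy=0, levels=2)   # horizontal pairs: d = 1, theta = 0°
print(P)                              # → [[0 1], [1 2]]
```

The four directions used in the paper correspond to displacements such as (1, 0), (1, 1), (0, 1) and (-1, 1) for d = 1.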
In our study, the GLCM is computed from two parameters: the distance d between the pixel pair and their angular relation θ, with θ quantized in four directions (0, 45, 90 and 135°). For an image I, a square window of N×N pixels and brightness levels i and j, the non-normalized GLCM is defined by:

p_{ij} = Σ_{x,y} ( C{I(x, y) = i ∧ I(x + Δx, y + Δy) = j} + C{I(x, y) = j ∧ I(x + Δx, y + Δy) = i} )

where C{·} = 1 if the argument is true and C{·} = 0 otherwise, and (Δx, Δy) is the displacement determined by d and θ. Each pixel pair is thus counted twice in Eq. (6), once forward and once backward, which makes the GLCM diagonally symmetric. The displacements θ₀ and θ₁ for each direction are shown in Table 2. The feature extraction procedure is as follows:

Step 1: Scale the grey values of the curvelet transform coefficients to 8 levels, compute the GLCMs of the curvelet coefficients at the first scale c{1}, and calculate 16 texture features based on the GLCM
Step 2: Calculate the averaged l1-norm of the curvelet coefficients in 8 interval directions at the second scale c{2}, yielding 8 texture features according to Eq. (11)
Step 3: Calculate the averaged l1-norm of the curvelet coefficients in 16 interval directions at the third scale c{3}, yielding 16 texture features according to Eq. (11)
Step 4: Calculate the averaged l1-norm of the curvelet coefficients in 16 interval directions at the fourth scale c{4}, yielding 16 texture features according to Eq. (11)
Step 5: Calculate the averaged l1-norm of the curvelet coefficients at the fifth scale c{5}, yielding 1 texture feature
A feature vector containing 57 components (16 + 8 + 16 + 16 + 1) can thus be extracted for each image.
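The averaged l1-norm of a coefficient block (Eq. 11) is simply the mean of the absolute coefficient values. The assembly of the 57-component vector can then be sketched as follows; the random stand-in blocks and their shapes are illustrative only and do not reproduce the actual curvelet tiling:

```python
import numpy as np

def avg_l1_norm(coeff):
    """Averaged l1-norm of a coefficient block: mean of absolute values."""
    return np.mean(np.abs(coeff))

rng = np.random.default_rng(0)
glcm_features = rng.random(16)                             # scale 1: 16 GLCM features
scale2 = [rng.standard_normal((8, 8)) for _ in range(8)]   # scale 2: 8 directional blocks
scale3 = [rng.standard_normal((8, 8)) for _ in range(16)]  # scale 3: 16 directional blocks
scale4 = [rng.standard_normal((8, 8)) for _ in range(16)]  # scale 4: 16 directional blocks
scale5 = rng.standard_normal((16, 16))                     # scale 5: single coarse block

features = np.concatenate([
    glcm_features,
    [avg_l1_norm(c) for c in scale2],
    [avg_l1_norm(c) for c in scale3],
    [avg_l1_norm(c) for c in scale4],
    [avg_l1_norm(scale5)],
])
print(features.shape)  # → (57,)
```

The 16 + 8 + 16 + 16 + 1 structure mirrors the five steps above.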

EXPERIMENTAL RESULTS AND ANALYSIS
Fabric images were chosen from the TILDA database (Workgroup on Texture Analysis of DFG, 1996) and the image analysis was implemented on the Matlab platform. The threshold T_s in the noise filtering technique is obtained from the statistics of the difference between correlation coefficients in adjacent directions at the same scale, and c is obtained from the statistics of the ratio of cor(S, 2N-1) to B₂(S, N)(i, j) over many experiments; finally, T_s = 120 and c = 15 at the fourth scale. Figure 2 shows the denoising results of Wavelet Soft Thresholding (WT-ST), Wavelet Hard Thresholding (WT-HT) and the curvelet transform. As can be seen from Fig. 2, the proposed algorithm reconstructs the edges of defects better than WT-ST and WT-HT, and it also preserves image contrast much better. The proposed algorithm can therefore distinguish the edge of a defect from the noise near it, which establishes a good basis for the subsequent defect detection.
Experiment 1: The proposed method is compared with the GLCM-based and wavelet-based methods. For the latter, each sub-block is decomposed into three scales of wavelet coefficients using 'db4' and the averaged l1-norm of the wavelet coefficients at each scale is calculated, giving a WT feature vector of dimension 10. The detection accuracies are shown in Table 3.
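The wavelet baseline can be sketched as a three-level 2-D decomposition with one averaged l1-norm per subband (1 approximation + 3 × 3 details = 10 features). To keep the sketch dependency-free we use a hand-rolled Haar transform as a simplified stand-in for the 'db4' filter used in the paper; the structure of the feature vector is the same:

```python
import numpy as np

def haar2(img):
    """One level of a 2-D Haar transform: approximation + 3 detail subbands.
    (A simplified stand-in for the 'db4' decomposition used in the paper.)"""
    a = (img[0::2, :] + img[1::2, :]) / 2.0
    d = (img[0::2, :] - img[1::2, :]) / 2.0
    aa = (a[:, 0::2] + a[:, 1::2]) / 2.0   # approximation
    ah = (a[:, 0::2] - a[:, 1::2]) / 2.0   # horizontal detail
    da = (d[:, 0::2] + d[:, 1::2]) / 2.0   # vertical detail
    dd = (d[:, 0::2] - d[:, 1::2]) / 2.0   # diagonal detail
    return aa, (ah, da, dd)

def wavelet_features(img, levels=3):
    """Averaged l1-norm per subband: 3 details per level + final approximation."""
    feats = []
    a = img.astype(float)
    for _ in range(levels):
        a, details = haar2(a)
        feats.extend(np.mean(np.abs(d)) for d in details)
    feats.append(np.mean(np.abs(a)))       # coarsest approximation
    return np.array(feats)

img = np.arange(64 * 64, dtype=float).reshape(64, 64)
print(wavelet_features(img).shape)  # → (10,)
```

With a wavelet library, the same feature vector would be built from the 'db4' subbands instead of the Haar ones.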
When there are no distortions in the background of the images and the defective images are visually distinctive (blocks 1 and 4), all the methods detect defects well, and the results of the proposed algorithm are finer and smoother than those of GLCM or WT. When the defects are less sharp and the image backgrounds are distorted (blocks 2 and 3), the proposed method is robust enough to find them reliably and to suppress the background noise, and the extracted defects are complete and clean, whereas GLCM and WT tend to underestimate the defects and overestimate the noise in some defective images.
Experiment 2: To test the robustness of the method to image noise, white noise with σ of 10, 20, 30 and 40, respectively, is added to the images and the scratch defect is detected. The detection results are shown in Table 4: the proposed algorithm achieves an accuracy of over 90% when σ is less than 30.
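The noise injection used in this experiment can be sketched as follows, treating σ as the standard deviation of zero-mean Gaussian white noise (the function name and the uniform test image are ours):

```python
import numpy as np

def add_white_noise(img, sigma, seed=0):
    """Add zero-mean Gaussian white noise with standard deviation sigma,
    clipping back to the 8-bit range of the grey-level test images."""
    rng = np.random.default_rng(seed)
    noisy = img.astype(float) + rng.normal(0.0, sigma, img.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)

img = np.full((256, 256), 128, dtype=np.uint8)   # flat 256x256 grey image
for sigma in (10, 20, 30, 40):                   # the sigma values tested above
    noisy = add_white_noise(img, sigma)
    print(sigma, round(float(np.std(noisy.astype(float) - img)), 1))
```

For mid-grey images and moderate σ, clipping is rare, so the measured noise level stays close to the requested σ.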

CONCLUSION
In this study, addressing the variety of shapes, positions and textured backgrounds of fabric images, we proposed a novel technique for detecting fabric defects that combines the Gray-Level Co-occurrence Matrix (GLCM) and the Curvelet Transform (CT). The method rests on the observation that curvelet transform coefficients at the same position and direction are correlated across scales. In a fabric defect image, both defect edges and white Gaussian noise correspond to large curvelet coefficients, but the former persist across scales while the latter decay quickly. We therefore use the correlation between adjacent scales to distinguish defects from noise and to detect the important edges of defects.
To obtain more representative defect features, features from both the spatial domain and the CT domain are extracted: GLCM and CT are applied to the image to form the feature sequence, and their combination makes the defect features more distinct. From the GLCM and CT the eigenvector of the defect is extracted, which reflects features of both the spatial and the frequency domains.

Table 1: Structure of curvelet transform coefficients