Introduction to Sparse Representation and Some Applications on Texture Classification
Xie Jin

Outline
- Sparse representation of signals
- Texture classification via sparse texton learning
- Texton encoding induced statistical features for texture classification
- Open problems

Sparse representation of signals
- Classical signal processing tools:
  - FT (Fourier transform)
  - STFT (short-time Fourier transform)
  - Wavelet transform: optimal for 1D signals, but not optimal for 2D signals (images), since it is not adaptive to the image content.
  - Curvelet, contourlet and bandlet transforms: construct better transform bases to represent images.
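As a quick illustration of the tools listed above, the following sketch (an addition, assuming NumPy, SciPy and PyWavelets are installed) computes the Fourier transform, a short-time Fourier transform and a wavelet decomposition of the same toy 1D signal.

```python
import numpy as np
from scipy.signal import stft
import pywt

# Toy 1D signal: two sinusoids plus noise, sampled at 1 kHz.
fs = 1000
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
x += 0.1 * np.random.randn(x.size)

# Fourier transform: global frequency content, no time localization.
X = np.fft.rfft(x)

# Short-time Fourier transform: fixed-size time-frequency tiles.
f, tt, Zxx = stft(x, fs=fs, nperseg=128)

# Wavelet decomposition: multi-resolution analysis, well suited to 1D signals.
coeffs = pywt.wavedec(x, 'db4', level=4)

print(X.shape, Zxx.shape, [c.size for c in coeffs])
```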
Sparse representation of signals
[Several slides formulate the sparse representation model and the L0 sparse coding problem.]
- Solving the L0 optimization problem is NP-hard.
  - Greedy methods: Matching Pursuit, Orthogonal Matching Pursuit (Mallat, 1993).
  - Relaxation methods: in some cases, solving the L0 problem is equivalent to solving the L1 problem (Candès, Tao, Donoho, 2004).
    - Algorithms: l1-magic, l1-ls, homotopy, and the augmented Lagrangian method.
    - Allen Yang et al., "A Review of Fast l1-Minimization Algorithms for Robust Face Recognition" (SIAM Review).
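As a concrete illustration of the greedy route, here is a minimal sketch of sparse coding with Orthogonal Matching Pursuit via scikit-learn; the random dictionary and synthetic signal are assumptions for illustration only, not material from the slides.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)

# Synthetic over-complete dictionary D (64-dim atoms, 256 atoms) with unit-norm columns.
n_features, n_atoms = 64, 256
D = rng.standard_normal((n_features, n_atoms))
D /= np.linalg.norm(D, axis=0)

# Ground-truth code with 5 non-zeros, and the observed signal y = D @ x0.
x0 = np.zeros(n_atoms)
x0[rng.choice(n_atoms, 5, replace=False)] = rng.standard_normal(5)
y = D @ x0

# Greedy approximation of the L0 problem: select atoms one at a time
# until 5 coefficients are used.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=5, fit_intercept=False)
omp.fit(D, y)
x_hat = omp.coef_

print("non-zeros recovered:", np.flatnonzero(x_hat))
print("reconstruction error:", np.linalg.norm(y - D @ x_hat))
```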
Compressive sensing
[Slide illustrating compressive sensing.]

Texture classification with dictionary learning
[Several slides review texture classification with dictionary learning.]

Texture classification via sparse texton learning
- Motivation:
  - The K-means clustering algorithm is based on the l2-norm (Euclidean) distance, so the elements of a cluster have a ball-like distribution. The learned K ball-like clusters, however, may not characterize the intrinsic feature space of the texture images well.
  - The K-means clustering method can be viewed as a special case of sparse coding (SC). SC can achieve a much lower reconstruction error because its constraint is less restrictive: each sample is approximated by a sparse combination of a few atoms rather than by a single cluster center. A sketch contrasting the two is given below.
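A minimal sketch of that comparison, assuming scikit-learn and synthetic stand-in patch vectors (not actual texture data): K textons are learned once with k-means and once with sparse coding (dictionary learning), and the relative reconstruction errors are compared.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)

# Stand-in for contrast-normalized texture patches: 2000 patches of size 7x7 = 49.
X = rng.standard_normal((2000, 49))

K = 64  # number of textons / dictionary atoms

# K-means textons: each patch is represented by its single nearest cluster center.
km = KMeans(n_clusters=K, n_init=10, random_state=0).fit(X)
X_km = km.cluster_centers_[km.labels_]
err_km = np.linalg.norm(X - X_km) / np.linalg.norm(X)

# Sparse-coding textons: each patch is a sparse combination of a few atoms.
dl = MiniBatchDictionaryLearning(n_components=K, transform_algorithm='omp',
                                 transform_n_nonzero_coefs=3, random_state=0)
codes = dl.fit_transform(X)
X_sc = codes @ dl.components_
err_sc = np.linalg.norm(X - X_sc) / np.linalg.norm(X)

print(f"relative reconstruction error  k-means: {err_km:.3f}  sparse coding: {err_sc:.3f}")
```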
Texture classification via sparse texton learning
- Idea: use sparse coding in place of k-means to learn a texture dictionary, and extract texture features from the learned dictionary.
- Pre-processing steps (see the patch-extraction sketch below):
  - All texture images are converted to grey-level images and normalized to have zero mean and unit standard deviation.
  - A square neighborhood around each pixel is taken and unrolled row by row into a vector.
  - Patch vectors are contrast normalized using Weber's law.

Texture classification via sparse texton learning
- Use textons to represent texture.
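A sketch of the pre-processing steps listed above. The Weber's-law normalization factor log(1 + L/0.03)/L, with L the patch's l2 norm, follows the common Varma-Zisserman texton convention and is an assumption here, not a value stated on the slides.

```python
import numpy as np

def extract_patch_vectors(image, patch_size=7, eps=1e-8):
    """Grey-level normalization, patch extraction, and Weber's-law
    contrast normalization of the patch vectors."""
    img = image.astype(np.float64)

    # Normalize the whole image to zero mean and unit standard deviation.
    img = (img - img.mean()) / (img.std() + eps)

    # Take the square neighborhood around each interior pixel and
    # unroll it row by row into a vector.
    r = patch_size // 2
    H, W = img.shape
    patches = [img[i - r:i + r + 1, j - r:j + r + 1].ravel()
               for i in range(r, H - r) for j in range(r, W - r)]
    X = np.asarray(patches)

    # Weber's-law contrast normalization: scale each patch vector by
    # log(1 + L / 0.03) / L, where L is its l2 norm (the 0.03 constant
    # is an assumption borrowed from the Varma-Zisserman texton work).
    L = np.linalg.norm(X, axis=1, keepdims=True) + eps
    X = X * np.log1p(L / 0.03) / L
    return X

# Usage on a random stand-in image (not an actual CUReT texture).
X = extract_patch_vectors(np.random.rand(64, 64), patch_size=7)
print(X.shape)  # (num_patches, patch_size**2)
```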
Texture classification via sparse texton learning
- Experimental results on the CUReT texture database:
  - The CUReT texture database contains 61 classes, each consisting of 92 images.
  - It has both large inter-class confusion and large intra-class variation.
  - The images of a class are captured under unknown viewpoint and illumination changes, and some different classes look similar in appearance. Figures (a) and (b) show two different kinds of textures whose images nevertheless appear similar.
[Further slides report the classification results on CUReT.]
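To make the texton pipeline concrete, here is a minimal sketch of one possible classification scheme: each image is described by the histogram of dictionary-atom activations of its patches, and a test image is assigned to the nearest training histogram under the chi-square distance. The histogram encoding, the chi-square nearest-neighbor rule, and all data here are assumptions for illustration; the slides do not spell out the exact feature or classifier.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

def texton_histogram(patches, dico):
    """Histogram of texton (atom) activations over an image's patch vectors."""
    codes = dico.transform(patches)                       # sparse codes, (n_patches, K)
    hist = np.count_nonzero(codes, axis=0).astype(float)  # how often each atom is used
    return hist / (hist.sum() + 1e-8)

rng = np.random.default_rng(0)

# Stand-in pre-processed patch vectors (7x7 patches) for a few training images
# and one test image; in the real pipeline these come from the steps above.
train_patches = [rng.standard_normal((1500, 49)) for _ in range(5)]
test_patches = rng.standard_normal((1500, 49))

# Learn the texton dictionary by sparse coding over all training patches.
dico = MiniBatchDictionaryLearning(n_components=64, transform_algorithm='omp',
                                   transform_n_nonzero_coefs=3,
                                   random_state=0).fit(np.vstack(train_patches))

# Represent every image by its texton histogram and classify the test image
# by the nearest training histogram under the chi-square distance.
train_hists = np.array([texton_histogram(p, dico) for p in train_patches])
h = texton_histogram(test_patches, dico)
chi2 = 0.5 * np.sum((train_hists - h) ** 2 / (train_hists + h + 1e-8), axis=1)
print("nearest training image:", int(np.argmin(chi2)))
```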
Open problems
1. Multi-scale estimation using dictionary learning: current dictionary learning methods lack the multi-scale property.
   - Boaz Ophir, Michael Lustig and Michael Elad, "Multi-Scale Dictionary Learning Using Wavelets," IEEE Journal of Selected Topics in Signal Processing.
- Ongoing topics:
  - Optical flow using the sparse learned model.
  - Dynamic texture modeling based on dictionary learning.

Thank you for your attention