IJMTES – IMPROVING THE TEXTURE REGULARITY METRIC BASED ON CONTRAST ADJUSTMENT METHOD

Journal Title : International Journal of Modern Trends in Engineering and Science

Paper Title : IMPROVING THE TEXTURE REGULARITY METRIC BASED ON CONTRAST ADJUSTMENT METHOD

Author’s Name : J Vinothini | V Bakyalakshmi

Volume 03 Issue 07 2016

ISSN no:  2348-3121

Page no: 217-219

Abstract – This paper detects defects in textures and improves image quality. Three algorithms are used to identify texture defects: neighbor embedding, super resolution, and edge detection with a Sobel filter. Neighbor embedding compares pixels to determine which are defective. Super resolution improves the quality of an image. Edge detection finds meaningful discontinuities in gray level; first- and second-order digital derivatives are implemented to detect the edges in an image.
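The Sobel edge-detection step mentioned above can be sketched as follows. This is a minimal illustration of the general technique, not the paper's implementation; the function name `sobel_magnitude` and the sample image are illustrative. The gradient magnitude is computed as sqrt(Gx^2 + Gy^2) using the two standard 3x3 Sobel kernels.

```python
import math

# Standard 3x3 Sobel kernels for first-order derivatives
KX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal derivative
KY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical derivative

def sobel_magnitude(img):
    """Return the gradient-magnitude map of a 2D grayscale image (list of lists)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):            # skip the 1-pixel border
        for x in range(1, w - 1):
            gx = sum(KX[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(KY[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = math.hypot(gx, gy)
    return out

# A vertical step edge: the magnitude peaks along the boundary columns.
img = [[0, 0, 255, 255]] * 4
edges = sobel_magnitude(img)
```

In a texture-inspection pipeline, thresholding this magnitude map would mark edge pixels, and discontinuities that do not match the expected texture pattern flag candidate defects.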

Keywords— Ceramic Tiles, Textiles, Resolution, Quality, Accuracy of Texture Image  

References

  1. M. N. Do and M. Vetterli, “Wavelet-based texture retrieval using generalized Gaussian density and Kullback–Leibler distance,” IEEE Trans. Image Process., vol. 11, no. 2, pp. 146–158, Feb. 2002.
  2. H. Y. T. Ngan and G. K. H. Pang, “Regularity analysis for patterned texture inspection,” IEEE Trans. Autom. Sci. Eng., vol. 6, no. 1, pp. 131–144, Jan. 2009.
  3. S. Varadarajan and L. J. Karam, “Adaptive texture synthesis based on perceived texture regularity,” in Proc. 6th Int. Workshop Quality Multimedia Exper., Sep. 2014, pp. 76–80.
  4. S. Varadarajan and L. J. Karam, “Effect of texture regularity on perceptual quality of compressed textures,” in Proc. Int. Workshop Video Process. Quality Metrics Consum. Electron., Jan. 2014. [Online].
  5. S. Varadarajan and L. J. Karam, “A reduced-reference perceptual quality metric for texture synthesis,” in Proc. IEEE Int. Conf. Image Process., Oct. 2014, pp. 531–535.
  6. R. M. Haralick, “Statistical and structural approaches to texture,” Proc. IEEE, vol. 67, no. 5, pp. 786–804, May 1979.
  7. V. V. Starovoitov, S.-Y. Jeong, and R.-H. Park, “Texture periodicity detection: Features, properties, and comparisons,” IEEE Trans. Syst., Man, Cybern. A, Syst., Humans, vol. 28, no. 6, pp. 839–849, Nov. 1998.
  8. F. Liu and R. W. Picard, “Periodicity, directionality, and randomness: Wold features for image modeling and retrieval,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 18, no. 7, pp. 722–733, Jul. 1996.
  9. D. Chetverikov, “Pattern regularity as a visual key,” in Proc. BMVC, 1998, pp. 3.1–3.10.
  10. D. Chetverikov, “Pattern regularity as a visual key,” Image Vis. Comput., vol. 18, no. 2, pp. 975–985, 2000.
  11. A. M. Atto and Y. Berthoumieu, “How to perform texture recognition from stochastic modeling in the wavelet domain,” in Proc. IEEE Int. Conf. Acoust., Speech, Signal Process., May 2011, pp. 4320–4323.
  12. X. Liu and D. Wang, “Texture classification using spectral histograms,” IEEE Trans. Image Process., vol. 12, no. 6, pp. 661–670, Jun. 2003.
  13. J. Zujovic, T. N. Pappas, and D. L. Neuhoff, “Structural similarity metrics for texture analysis and retrieval,” in Proc. 16th IEEE Int. Conf. Image Process., Nov. 2009, pp. 2225–2228.
  14. M. Mancas, C. Mancas-Thillou, B. Gosselin, and B. Macq, “A rarity-based visual attention map—Application to texture description,” in Proc. IEEE Int. Conf. Image Process., Oct. 2006, pp. 445–448.
  15. A. Borji, D. N. Sihite, and L. Itti, “Quantitative analysis of human-model agreement in visual saliency modeling: A comparative study,” IEEE Trans. Image Process., vol. 22, no. 1, pp. 55–69, Jan. 2013.
  16. S. Varadarajan and L. J. Karam, “A no-reference perceptual texture regularity metric,” in Proc. IEEE Int. Conf. Acoust., Speech, Signal Process., May 2013, pp. 1894–1898.
  17. Methodology for the Subjective Assessment of the Quality of Television Pictures, document ITU Rec. BT.500-11, 2002.
  18. D. D. Salvucci and J. H. Goldberg, “Identifying fixations and saccades in eye-tracking protocols,” in Proc. Symp. Eye Tracking Res. Appl., 2000, pp. 71–78.
  19. O. V. Komogortsev and A. Karpov, “Automated classification and scoring of smooth pursuit eye movements in the presence of fixations and saccades,” Behavior Res. Methods, vol. 45, no. 1, pp. 203–215, 2013.
  20. L. Zhang, M. H. Tong, T. K. Marks, H. Shan, and G. W. Cottrell, “SUN: A Bayesian framework for saliency using natural statistics,” J. Vis., vol. 8, no. 7, 2008, Art. ID 32.
  21. J. Harel, C. Koch, and P. Perona, “Graph-based visual saliency,” in Advances in Neural Information Processing Systems, vol. 19. Cambridge, MA, USA: MIT Press, 2007, pp. 545–552.
  22. U. Rajashekar, I. van der Linde, A. C. Bovik, and L. K. Cormack, “GAFFE: A gaze-attentive fixation finding engine,” IEEE Trans. Image Process., vol. 17, no. 4, pp. 564–573, Apr. 2008.
  23. A. Garcia-Diaz, X. R. Fdez-Vidal, X. M. Pardo, and R. Dosil, “Saliency from hierarchical adaptation through decorrelation and variance normalization,” Image Vis. Comput., vol. 30, no. 1, pp. 51–64, 2012.
  24. N. D. B. Bruce and J. K. Tsotsos, “Saliency, attention, and visual search: An information theoretic approach,” J. Vis., vol. 9, no. 3, 2009, Art. ID 5.