IJMTES – VHDL IMPLEMENTATION OF AMERICAN HAND SIGN RECOGNITION SYSTEM USING RADIAL BASIS FUNCTION NEURAL NETWORK

Journal Title : International Journal of Modern Trends in Engineering and Science

Authors’ Names : Bagyalakshmi J | Dr R Brindha

Volume 03 Issue 09 2016

ISSN no: 2348-3121

Page no: 36-41

Abstract – This paper presents hand sign recognition using a radial basis function algorithm implemented in VHDL. The Radial Basis Function Neural Network has a simple structure and a fast training process. The hand sign recognition system has three steps: 1) image pre-processing, 2) feature extraction, and 3) classification. An input image is pre-processed and converted to a feature vector, which is then compared with the feature vectors of the training-set images. The system was designed to recognize 24 American Sign Language (ASL) hand signs. The proposed ANN achieves a 100% recognition rate on the training set. The hand posture recognition system is coded in VHDL and Matlab, and simulated using ModelSim 10.0b and Matlab R2012.
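The classification step described above — comparing an input feature vector against stored training-set feature vectors through radial basis functions — can be sketched as follows. This is a Python illustration only, not the paper's VHDL/Matlab implementation; the Gaussian kernel, the width parameter `sigma`, and all names and dimensions here are illustrative assumptions:

```python
import numpy as np

def rbf_classify(x, centers, labels, sigma=1.0):
    """Classify feature vector x by Gaussian RBF activations
    against stored training-set feature vectors (centers).

    sigma is an assumed kernel width; the winning class is the
    label of the center with the largest activation."""
    d2 = np.sum((centers - x) ** 2, axis=1)    # squared Euclidean distances
    phi = np.exp(-d2 / (2.0 * sigma ** 2))     # Gaussian RBF activations
    return labels[int(np.argmax(phi))]         # label of best-matching center

# Toy usage with two 2-D training feature vectors (real ASL features
# would be higher-dimensional):
centers = np.array([[0.0, 0.0], [5.0, 5.0]])
labels = np.array(["A", "B"])
print(rbf_classify(np.array([4.8, 5.1]), centers, labels))
```

In a full RBFNN the activations would feed a trained output layer rather than a simple nearest-center vote, but the kernel evaluation above is the core of the classification stage.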

Keywords— Hand posture recognition, Radial Basis Function Neural Network (RBFNN), Self-Organizing Map (SOM), American Sign Language (ASL)

References

  1. H. Hikawa and K. Kaida, “Novel FPGA implementation of hand sign recognition system with SOM-Hebb classifier,” IEEE Trans. Circuits Syst. Video Technol., vol. 25, no. 1, pp. 153–166, Jan. 2015.
  2. H. Hikawa, H. Fujimura, and D. Sato, “Hand sign recognition algorithm for hardware implementation,” IEICE Trans. Inform. Syst., vol. J92-D, no. 3, pp. 405–416, 2009.
  3. D. C. Dhubkarya, D. Nagariya, and R. Kapoor, “Implementation of a radial basis function using VHDL,” GJCST, vol. 10, pp. 16–19, Sep. 2010.
  4. T. Kohonen, Self-Organizing Maps (Information Sciences), vol. 30, 3rd ed. New York, NY, USA: Springer-Verlag, 2001.
  5. S. Oniga, A. Tisan, D. Mic, A. Buchman, and A. Vida-Ratiu, “Hand postures recognition system using artificial neural networks implemented in FPGA,” in Proc. 30th Int. Spring Seminar Electron. Technol., 2007, pp. 507–512.
  6. N. Gamage, K. Y. Chow, and R. Akmeliawati, “Static hand sign recognition using linear projection method,” in Proc. 4th Int. Conf. Auto. Robots Agents (ICARA), Feb. 2009, pp. 403–407.
  7. V. I. Pavlovic, R. Sharma, and T. S. Huang, “Visual interpretation of hand gestures for human-computer interaction: A review,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 19, no. 7, pp. 677–695, Jul. 1997.
  8. P. Garg, N. Aggarwal, and S. Sofat, “Vision based hand gesture recognition,” World Acad. Sci., Eng. Technol., vol. 49, no. 1, pp. 972–977, 2009.
  9. V. Bonato, A. K. Sanches, M. M. Fernandes, J. M. P. Cardoso, E. D. V. Simoes, and E. Marques, “A real time gesture recognition system for mobile robots,” in Proc. Int. Conf. Informat. Control, Autom., Robot., 2004, pp. 207–214.
  10. F. C. Huang, S. Y. Huang, J. W. Ker, and Y. C. Chen, “High-performance SIFT hardware accelerator for real-time image feature extraction,” IEEE Trans. Circuits Syst. Video Technol., vol. 22, no. 3, pp. 340–351, Mar. 2012.
  11. A. J. Heap and D. C. Hogg, “Towards 3D hand tracking using a deformable model,” in Proc. 2nd Int. Conf. Autom. Face Gesture Recognit., Oct. 1996, pp. 140–145.
  12. Y. Wu, J. Y. Lin, and T. S. Huang, “Capturing natural hand articulation,” in Proc. 8th IEEE Int. Conf. Comput. Vis., vol. 2, Jul. 2001, pp. 426–432.
  13. B. Stenger, P. R. S. Mendonca, and R. Cipolla, “Model-based 3D tracking of an articulated hand,” in Proc. Brit. Mach. Vis. Conf. (BMVC), vol. 1, Sep. 2001, pp. 63–72.