IJMTES – VIRTUAL TOUCH SCREEN “VIRTOS” – IMPLEMENTING VIRTUAL TOUCH BUTTONS TO CONTROL INDUSTRIAL MACHINES

Journal Title : International Journal of Modern Trends in Engineering and Science

Paper Title : VIRTUAL TOUCH SCREEN “VIRTOS” – IMPLEMENTING VIRTUAL TOUCH BUTTONS TO CONTROL INDUSTRIAL MACHINES

Authors’ Names : Pavithra R | Pavithra T | Poovitha D | Shridineshraj A R

Volume 04 Issue 03 2017

ISSN No: 2348-3121

Page no: 229-235

Abstract – We propose a large interactive display with virtual touch buttons on a pale-colored flat wall. Our easy-to-install system consists of a front projector and a single commodity camera. A button touch is detected from the area of the shadow cast by the user’s hand: this shadow becomes very small when the button is touched. The shadow area is segmented by briefly changing the button to a different color when the shadow covers the button region. Background subtraction is used to extract the foreground (i.e., the hand and its shadow) region, and the reference image for the background is continuously adjusted to match the ambient light. In our tests, the scheme proved robust to differences in illumination, and the response time for touch detection was about 100 ms. The industrial machines were controlled through a relay in response to each detected touch.

Keywords – Projector-camera Systems, Projector-based Display, Touch Detection, Virtual Touch Screen
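
As a rough illustration of the pipeline described in the abstract, the following Python/OpenCV sketch combines background subtraction against an adaptive reference image with the shrinking-shadow touch test. It is a minimal sketch under stated assumptions: the blending rate, thresholds, button region, and the relay trigger (a plain print placeholder) are all invented for illustration and are not taken from the paper.

```python
# Hedged sketch of shadow-area touch detection; all constants below are
# assumptions for illustration, not values from the paper.
import cv2
import numpy as np

ALPHA = 0.05                     # assumed blending rate for ambient-light adaptation
DIFF_THRESHOLD = 40              # assumed gray-level difference marking foreground
TOUCH_RATIO = 0.1                # assumed: shadow collapses below 10% of its peak
BUTTON_ROI = (100, 100, 80, 80)  # hypothetical button region (x, y, w, h)

def foreground_area(reference, gray, roi):
    """Count foreground (hand + shadow) pixels inside the button region
    by background subtraction against the adaptive reference image."""
    x, y, w, h = roi
    diff = cv2.absdiff(reference[y:y + h, x:x + w], gray[y:y + h, x:x + w])
    return int(np.count_nonzero(diff > DIFF_THRESHOLD))

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if not ok:
    raise SystemExit("no camera frame available")
reference = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
peak_area = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    area = foreground_area(reference, gray, BUTTON_ROI)

    if area > peak_area:
        peak_area = area          # hand and its shadow approaching the button
    elif peak_area > 0 and area < TOUCH_RATIO * peak_area:
        # Shadow has nearly vanished: the hand has closed onto the surface.
        print("button touched -> switch relay")  # placeholder for a real relay driver
        peak_area = 0             # re-arm for the next press

    if area == 0:
        # Adapt the reference only while the button is unoccluded, so the
        # hand and shadow are never absorbed into the background model.
        reference = cv2.addWeighted(reference, 1 - ALPHA, gray, ALPHA, 0)

cap.release()
```

Updating the reference frame only when the button region is empty mirrors the abstract’s note that the background reference continuously tracks the ambient light without letting the user’s hand contaminate the model.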
