      Is Open Access

      Intelligent Finger Movement Controlled Interface for Automotive Environment

      proceedings-article
      Proceedings of the 31st International BCS Human Computer Interaction Conference (HCI 2017)
      Sunderland, UK, 3 - 6 July 2017
      Keywords: Finger Tracking, Eye Gaze Tracking, Multimodal Fusion

            Abstract

            This paper presents an intelligent finger tracking system for operating infotainment systems and instrument panels in an automotive environment. We developed algorithms to control an on-screen pointer using off-the-shelf infrared sensors and report a user study on pointing and selection tasks undertaken with the finger tracking system. We propose polynomial models to predict probable targets and a Bayesian fusion model to integrate users' eye gaze locations with their finger tracks. The predictive fusion model achieved average pointing and selection times of less than 2 seconds inside a car driving on a motorway.
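The abstract's pipeline hinges on Bayesian fusion of two noisy pointer estimates (eye gaze and finger track). The paper's actual model is not reproduced in this record, so the following is only a minimal sketch of one standard approach, precision-weighted Gaussian fusion, in which the function name, screen coordinates, and variances are all assumed for illustration:

```python
import numpy as np

def fuse_gaussian(mu_gaze, var_gaze, mu_finger, var_finger):
    """Precision-weighted fusion of two 2-D pointer estimates.

    Each modality is modelled as an isotropic Gaussian over screen
    coordinates; the fused mean is the precision-weighted average of
    the two means, and the fused variance is the inverse of the
    summed precisions (so fusion is never worse than either input).
    """
    w_gaze = 1.0 / var_gaze          # precision of the gaze estimate
    w_finger = 1.0 / var_finger      # precision of the finger estimate
    var_fused = 1.0 / (w_gaze + w_finger)
    mu_fused = var_fused * (w_gaze * np.asarray(mu_gaze, dtype=float)
                            + w_finger * np.asarray(mu_finger, dtype=float))
    return mu_fused, var_fused

# Example: a noisy gaze fix pulled toward a steadier finger track.
mu, var = fuse_gaussian(mu_gaze=(410.0, 290.0), var_gaze=400.0,
                        mu_finger=(400.0, 300.0), var_finger=100.0)
# The fused point lies closer to the lower-variance finger estimate.
```

Because the finger track here is four times more precise than the gaze fix, the fused mean sits much nearer the finger coordinates, which matches the intuition behind combining a fast-but-coarse modality with a slower-but-steadier one.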

            Content

            Author and article information

            Contributors
            Conference: July 2017
            Pages: 1-5
            Affiliations
            [0001]Centre for Product Design and Manufacturing

            Indian Institute of Science

            Bangalore 560012, India
            [0002]Department of Engineering

            University of Cambridge

            Cambridge, UK
            Article
            DOI: 10.14236/ewic/HCI2017.25
            © Biswas et al. Published by BCS Learning and Development Ltd. Proceedings of British HCI 2017 – Digital Make-Believe. Sunderland, UK.

            This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

            Proceedings of the 31st International BCS Human Computer Interaction Conference (HCI 2017)
            HCI 31
            Sunderland, UK, 3 - 6 July 2017
            Electronic Workshops in Computing (eWiC)
            History
            Product
            ISSN 1477-9358, BCS Learning & Development

            Self URI (article page): https://www.scienceopen.com/hosted-document?doi=10.14236/ewic/HCI2017.25
            Self URI (journal page): https://ewic.bcs.org/
            Categories
            Electronic Workshops in Computing

            Applied computer science, Computer science, Security & Cryptology, Graphics & Multimedia design, General computer science, Human-computer interaction
            Eye Gaze Tracking, Finger Tracking, Multimodal Fusion

