
dc.contributor.author: Türetken, Engin
dc.contributor.author: Saeedi, Sareh
dc.contributor.author: Bigdeli, Siavash
dc.contributor.author: Stadelmann, Patrick
dc.contributor.author: Cantale, Nicolas
dc.contributor.author: Lutnyk, Luis
dc.contributor.author: Raubal, Martin
dc.contributor.author: Dunbar, L. Andrea
dc.date.accessioned: 2022-06-03T10:13:37Z
dc.date.available: 2022-06-03T10:13:37Z
dc.date.issued: 2022-03-02
dc.identifier.citation: Engin Türetken, Sareh Saeedi, Siavash Bigdeli, Patrick Stadelmann, Nicolas Cantale, Luis Lutnyk, Martin Raubal, L. Andrea Dunbar, "Real time eye gaze tracking for human machine interaction in the cockpit," Proc. SPIE 12019, AI and Optical Data Sciences III, 1201904 (2 March 2022) [en_US]
dc.identifier.uri: https://yoda.csem.ch/handle/20.500.12839/1025
dc.description.abstract: The aeronautics industry has pioneered safety features from digital checklists to moving maps that improve pilot situational awareness and support safe ground movements. Today, pilots deal with increasingly complex cockpit environments and air traffic densification. Here we present an intelligent vision system that allows real-time human-machine interaction in the cockpit to reduce the pilots' workload. The challenges for such a vision system include extreme changes in background light intensity, a large field of view, and variable working distances. Adapted hardware and state-of-the-art computer vision and machine learning algorithms for eye gaze detection allow a smooth and accurate real-time feedback system. The current system has been over-specified to explore optimized solutions for different use cases. The algorithmic pipeline for eye gaze tracking was developed and iteratively optimized to obtain the speed and accuracy required for the aviation use cases. The pipeline, a combination of data-driven and analytical approaches, runs in real time at 60 fps with a latency of about 32 ms. The eye gaze estimation error was evaluated as the point-of-regard distance error with respect to the 3D point location. An average error of less than 1.1 cm was achieved over 28 gaze points representing the cockpit instruments, placed at about 80-110 cm from the participants' eyes. The angular gaze deviation goes down to less than 1° for the panels for which accurate eye gaze was required by the use cases. [en_US]
dc.description.sponsorship: The PEGGASUS project has received funding from the Clean Sky 2 Joint Undertaking under the European Union's Horizon 2020 research and innovation program under grant agreement No. 821461. [en_US]
dc.language.iso: en [en_US]
dc.subject: Gaze-based interaction [en_US]
dc.subject: Eye gaze detection [en_US]
dc.subject: Aviation [en_US]
dc.subject: Computer vision [en_US]
dc.subject: Machine learning [en_US]
dc.subject: Human-machine interaction [en_US]
dc.title: Real Time Eye Gaze Tracking for Human Machine Interaction in the Cockpit [en_US]
dc.type: Proceedings [en_US]
dc.type.csemdivisions: Div-M [en_US]
dc.type.csemresearchareas: Data & AI [en_US]
dc.type.csemresearchareas: IoT & Vision [en_US]
dc.identifier.doi: http://dx.doi.org/10.1117/12.2607434
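For small angles, the two accuracy figures in the abstract are linked by simple geometry: a point-of-regard distance error d on a panel at viewing distance D subtends an angle of about arctan(d/D). A minimal Python sketch using only the numbers reported in the abstract (the function name is ours for illustration, and the panel is assumed roughly perpendicular to the line of sight):

```python
import math

def angular_error_deg(por_error_cm: float, viewing_distance_cm: float) -> float:
    """Angular gaze error (degrees) for a point-of-regard (PoR) distance
    error on a panel at the given viewing distance."""
    return math.degrees(math.atan2(por_error_cm, viewing_distance_cm))

# Figures reported in the abstract: <= 1.1 cm average PoR error for
# instruments placed at about 80-110 cm from the participants' eyes.
for distance_cm in (80.0, 110.0):
    print(f"{distance_cm:.0f} cm: {angular_error_deg(1.1, distance_cm):.2f} deg")
# Prints about 0.79 deg at 80 cm and 0.57 deg at 110 cm, consistent with
# the sub-1-degree angular deviation quoted in the abstract.
```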


Files in this item


There are no files associated with this item.

This item appears in the following Collection(s)

  • Research Publications
The “Research Publications” collection provides bibliographic information for scientific papers, including conference proceedings and presentations.
