Putting innovation to work for real training needs: making the invisible visible

Do you remember the first time you were driving on the highway and had to pass a slow-moving truck? You probably hadn’t fully mastered the skill of driving yet, which made you a bit anxious and extra alert to the specific procedures, such as checking your mirrors to see whether the left lane was clear. Meanwhile, your instructor wanted you to move your head exaggeratedly, so that they could tell whether or not you had checked those mirrors.

Now imagine that you are a pilot instructor. Up in the air it is not very common to pass another aircraft, but scan patterns are nevertheless a very important aspect of flying one. Similar mistakes can be made if the pilot does not check certain indicators in the cockpit. However, an exaggerated head movement will not tell the instructor whether you looked at the speed indicator or the altitude indicator, since these instruments sit very close together.

The technical solution
Objective information about students’ scan patterns removes this uncertainty. Eye tracking (a measurement technique that records where you are looking) has become increasingly popular over the past couple of decades, but does it cover all the information an instructor needs? No.

I hope that pilots will agree with me that performing an approach and landing requires different scan behaviour than handling an engine failure. The information presented to the instructor should therefore match the scenario being trained. Unfortunately, too many researchers and eye tracking companies focus only on the technical part of the solution. I think it is important that we start to realize that when training is left out of the equation, instructors end up with a tool that will, in time, feel useless and complicated rather than supportive and intuitive.
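To make this idea concrete, here is a minimal, purely illustrative sketch of what "scenario-aware" gaze information could look like: fixations are mapped onto cockpit areas of interest (AOIs) and compared with the instruments that matter for the exercise being flown. All names, AOI coordinates and scenario definitions below are invented for illustration; they are not the Augmented Eye’s actual data model.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float           # gaze position on the instrument panel (normalised panel coordinates)
    y: float
    duration_s: float  # dwell time in seconds

# Invented AOI layout: each instrument is a rectangle (x_min, x_max, y_min, y_max).
AOIS = {
    "airspeed": (0.10, 0.20, 0.05, 0.15),
    "attitude": (0.22, 0.32, 0.05, 0.15),
    "altitude": (0.35, 0.45, 0.05, 0.15),
    "engine":   (0.60, 0.80, 0.05, 0.15),
}

# Invented scenario definitions: which instruments the instructor expects to be scanned.
SCENARIOS = {
    "approach_and_landing": ("airspeed", "attitude", "altitude"),
    "engine_failure":       ("engine", "airspeed", "attitude"),
}

def aoi_of(fix: Fixation):
    """Return the name of the instrument the fixation falls on, or None."""
    for name, (x0, x1, y0, y1) in AOIS.items():
        if x0 <= fix.x <= x1 and y0 <= fix.y <= y1:
            return name
    return None

def scan_coverage(fixations, scenario):
    """Total dwell time per expected instrument for the chosen scenario."""
    dwell = {name: 0.0 for name in SCENARIOS[scenario]}
    for fix in fixations:
        hit = aoi_of(fix)
        if hit in dwell:
            dwell[hit] += fix.duration_s
    return dwell  # instruments left at 0.0 were never checked

if __name__ == "__main__":
    demo = [Fixation(0.15, 0.10, 1.2), Fixation(0.40, 0.08, 0.8)]
    print(scan_coverage(demo, "approach_and_landing"))
    # {'airspeed': 1.2, 'attitude': 0.0, 'altitude': 0.8} -> attitude never scanned
```

The point of the sketch is simply that the same stream of gaze data yields different feedback depending on which scenario definition it is checked against, which is exactly why the training context cannot be left out.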

The integrated solution
Therefore, we developed a tool that supports instructors with real-time, objective information about their students’ scan patterns, related to the training scenario: the Augmented Eye. This facilitates performance-based training and can eventually lead to more personalised, adaptive training. Knowing this, it might not surprise you that all instructors who experienced the Augmented Eye were as excited about the tool as we are!

A nice usability extra is that the instructor can stay seated behind the pilots and see where the students are looking, without shifting attention to a tablet or instructor screen. This is possible thanks to augmented reality: the eye tracking data is displayed virtually over the real-world environment. Meanwhile, the student can perform the same routines as without the tool: they can wiggle in their chair and do not need to wear special glasses.
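For readers curious about the geometry behind such an overlay, the sketch below shows one simple way it could be done. This is an assumption on my part, not a description of the Augmented Eye’s internals: the student’s gaze ray is intersected with the instrument-panel plane, and the resulting point tells the AR renderer where to draw the gaze marker over the real cockpit.

```python
import numpy as np

def gaze_marker_position(eye_pos, gaze_dir, panel_point, panel_normal):
    """Intersect the gaze ray with the instrument-panel plane.

    eye_pos, gaze_dir         -- student's eye position and gaze direction (world frame)
    panel_point, panel_normal -- any point on the panel and the panel's normal vector
    Returns the 3D point where the AR overlay should draw the gaze marker,
    or None if the student is looking away from the panel.
    """
    eye_pos, gaze_dir = np.asarray(eye_pos, float), np.asarray(gaze_dir, float)
    panel_point, panel_normal = np.asarray(panel_point, float), np.asarray(panel_normal, float)

    denom = gaze_dir @ panel_normal
    if abs(denom) < 1e-6:              # gaze ray is parallel to the panel
        return None
    t = ((panel_point - eye_pos) @ panel_normal) / denom
    if t <= 0:                         # panel lies behind the student
        return None
    return eye_pos + t * gaze_dir      # world-frame point for the AR renderer

if __name__ == "__main__":
    # Invented example: student sits 0.8 m in front of a vertical panel at x = 0.8.
    marker = gaze_marker_position(
        eye_pos=(0.0, 0.0, 1.2),
        gaze_dir=(0.98, 0.10, -0.15),  # roughly forward, slightly right and down
        panel_point=(0.8, 0.0, 1.0),
        panel_normal=(-1.0, 0.0, 0.0),
    )
    print(marker)   # approx. [0.8, 0.082, 1.078]
```

Because the marker is computed in the world frame, the instructor can simply look at the real cockpit through the AR display and see where the student’s gaze lands, without the student wearing anything extra.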

Cool, but what about the future?
Regardless of whether instructors end up using the Augmented Eye or another new application, I am convinced that innovation in training always needs to combine technical opportunities with real training needs, just as the Augmented Eye does. That is exactly what we are working to improve.

Watch the Augmented Eye video

Jeanine Vlasblom

R&D Engineer
Royal NLR

Best paper award ITEC 2019

Jeanine received the ITEC 2019 Best Paper Award for her paper ‘Making the invisible visible – Towards increasing pilot training effectiveness by visualizing scan patterns of trainees through AR’. Her paper was selected as the best out of 67 submissions.

“Your presentation was very professional and your research topic represented the theme of designing interoperability in a purposeful manner to connect people through technology.”

The paper describes the development and evaluation of a scan pattern monitoring system based on augmented reality. The system enables instructors to monitor pilots’ scan patterns by tracking the pilot’s eyes non-intrusively and displaying the patterns to the instructor through augmented reality. Subject matter experts (pilot instructors) evaluated the application as a support tool for debriefing. Further development should focus on creating a tablet version for use during debriefing and on the best ways to implement the system in pilot training.