Eyes are windows to our soul
William Shakespeare

Eye movements in biometrics

Eyes are among the most complicated human organs, and analysis of eye movements may reveal a lot of information about a person. There are many studies that analyze eye movements in order to diagnose specific diseases or to recognize a state of mind (Duchowski 2002). However, surprisingly, there have been only a few attempts to differentiate people based on their eye movement characteristics. Biometric identification based on eye movements combines physiological and behavioral aspects, which makes forging very difficult. One may try to imitate somebody's signature or the way he walks, but so far it seems impossible to imitate the way people move their eyes.

The first publication on eye-movement biometric authentication was a poster presented at the 6th World Conference Biometrics'2003 in London (Kasprowski et al., 2003). The poster received the Best Poster on Technological Advance Prize, which encouraged the authors to continue their research, resulting in several publications including (Kasprowski et al., 2004) and (Kasprowski et al., 2005). Another attempt was (Bednarik et al., 2005). Despite quite complex experiments including text reading, moving-point observation and image viewing, people were finally identified based only on the way they looked at a static point (a red cross) in the middle of the screen. Similarly to (Kasprowski et al., 2004), the data was preprocessed using FFT and then reduced with PCA. The authors considered four different features, of which only "velocity of gaze" was truly dynamic. The next publication was (Nishigaki et al., 2008), in which the authors compared people by measuring the position of their blind spot.

Two new studies were reported at the ETRA 2010 conference. (Kinnunen et al., 2010) proposed a very interesting solution based on so-called 'task-independent' authentication. They did not use any special stimulation and simply recorded eye movements while subjects watched a movie. The results were not very good (error rates around 30%) but encouraging. Another work was (Komogortsev et al., 2010). The authors performed an experiment very similar to (Kasprowski et al., 2004), with moving-point stimulation. The main difference was the application of the so-called Oculomotor Plant Mathematical Model (OPMM) developed by the authors, which improved their results. The work was continued in (Komogortsev et al., 2012), where the authors refined the methods and established a very thorough performance baseline.
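The FFT-plus-PCA preprocessing mentioned above can be sketched roughly as follows. This is a minimal illustration, not the exact pipeline of the cited papers: the function name, array shapes and number of components are assumptions made for the example.

```python
import numpy as np

def fft_pca_features(samples, n_components=10):
    """Reduce raw gaze signals to compact feature vectors.

    samples: (n_recordings, n_timepoints) array of gaze coordinates
    along one axis; shapes and names here are illustrative only.
    """
    # Step 1: move each recording to the frequency domain and keep
    # the magnitude spectrum (phase is discarded).
    spectra = np.abs(np.fft.rfft(samples, axis=1))

    # Step 2: PCA via SVD on the mean-centered spectra; project onto
    # the top principal directions.
    centered = spectra - spectra.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T  # (n_recordings, n_components)

rng = np.random.default_rng(0)
feats = fft_pca_features(rng.standard_normal((20, 128)), n_components=5)
print(feats.shape)  # (20, 5)
```

The resulting low-dimensional vectors can then be fed to any standard classifier.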
Another important work for those active in the field is (Holland et al., 2011), because it establishes performance baselines for fundamental characteristics of the human visual system and the brain's strategy for guiding visual attention via scanpaths.

The main problem for today's state of the art is the number of available samples. Most eye movement experiments collect samples from many different subjects, and to avoid learning effects, repetitions of the same experiment with the same subject are rare. Yet that is exactly what is needed for identification: we need as many measurements of the same subjects as possible to find the eye movement characteristics specific to them. That is why it is rarely possible to reuse results of experiments conducted for purposes other than human identification. We need a lot of data gathered specifically for this kind of recognition.

Moreover, the results of the experiments reported above are difficult to compare: they involve different eye trackers, different sampling frequencies, different subjects and different methodologies. To push the technology forward we need data and comparable results. Our idea was therefore to build a generally available database of eye movement samples. We decided to publish our dataset and to organize a competition to find which transformations, classifiers and, more generally, algorithms give the best results in classic identification and verification tests. Very importantly, such a dataset may be analyzed even by researchers who do not have direct access to eye trackers.
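As a rough illustration of what the two classic test types measure: identification asks "who is this person?" (closed-set classification), while verification asks "is this person who he claims to be?" (accept/reject against a threshold). The sketch below uses hypothetical function names, a 1-nearest-neighbor classifier and a simple distance threshold; the actual competition protocol may differ.

```python
import numpy as np

def identification_accuracy(train_x, train_y, test_x, test_y):
    """Closed-set identification: assign each test sample the label
    of its nearest training sample (1-NN, Euclidean distance)."""
    dists = np.linalg.norm(test_x[:, None, :] - train_x[None, :, :], axis=2)
    predicted = train_y[np.argmin(dists, axis=1)]
    return float(np.mean(predicted == test_y))

def verification_rates(genuine_scores, impostor_scores, threshold):
    """Verification: accept a claim when the distance score is below
    the threshold; report false-acceptance and false-rejection rates."""
    far = float(np.mean(impostor_scores < threshold))  # impostors accepted
    frr = float(np.mean(genuine_scores >= threshold))  # genuine users rejected
    return far, frr

# Toy usage with two well-separated subjects.
train_x = np.array([[0.0, 0.0], [10.0, 10.0]])
train_y = np.array([0, 1])
test_x = np.array([[0.1, 0.0], [9.9, 10.0]])
test_y = np.array([0, 1])
acc = identification_accuracy(train_x, train_y, test_x, test_y)

far, frr = verification_rates(np.array([0.1, 0.2]),
                              np.array([0.9, 0.8]),
                              threshold=0.5)
print(acc, far, frr)  # 1.0 0.0 0.0
```

Sweeping the threshold trades FAR against FRR; the point where they are equal (the equal error rate, EER) is the figure commonly reported in the studies cited above.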







(C)2012 Paweł Kasprowski