Evolution, Dinosaurs, and the World’s Biggest Eye-Tracking Experiment

Dr Tim Holmes (standing, centre) testing a participant at the Science Museum. The eye-tracking hardware is the black box just underneath the monitor.

In this post I’m very pleased to be able to write about the work of a good friend and colleague of mine, Dr Tim Holmes, of Royal Holloway University. Tim’s work is on eye-movement tracking and he’s recently finished collecting the data on what may be the world’s largest eye-tracking experiment, which also involved evolution, and dinosaurs. More on that below, but first – what’s eye-movement tracking?

Eye-tracking is the procedure of measuring the position of the eyes relative to the head, and using that information to infer where someone is looking. The most popular method is to use a video camera (often an infra-red camera) to record the participant’s eyes. The infra-red light provides good contrast between the pupil (which shows up as dark) and a reflection from the cornea (bright), and using some relatively simple trigonometry the position of gaze on a surface in front of the participant can be computed. Eye-trackers used to be pretty clunky devices that required the participant to sit absolutely still, and also needed a lot of calibration; however, some modern systems can track the position of the head as well and subtract it from the eye-movement data, which means the subject can move their head around naturally. There are also lightweight, wireless, head-mounted systems available, which means eye movements can be recorded in natural environments with people moving around.*
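To make the “relatively simple trigonometry” a bit more concrete, here’s a toy sketch of the core idea: a calibration phase maps pupil-minus-corneal-reflection vectors to known screen positions, and that fitted mapping then converts new eye measurements into gaze points. This is a simplified affine-mapping illustration (real systems model camera and eye geometry properly); all names and the synthetic calibration data are my own assumptions, not any particular tracker’s method.

```python
import numpy as np

def fit_gaze_map(pupil_glint, screen_xy):
    """Fit an affine map from pupil-minus-glint vectors to screen
    coordinates, using calibration data and least squares."""
    pg = np.asarray(pupil_glint, dtype=float)            # shape (n, 2)
    X = np.hstack([pg, np.ones((len(pg), 1))])           # add a bias column
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(screen_xy, dtype=float),
                                 rcond=None)
    return coeffs                                        # shape (3, 2)

def estimate_gaze(coeffs, pupil_glint_vector):
    """Map one pupil-glint vector to an (x, y) screen position."""
    x, y = pupil_glint_vector
    return np.array([x, y, 1.0]) @ coeffs

# Nine-point calibration grid (synthetic data, just for the sketch):
screen = [(sx, sy) for sx in (0, 640, 1280) for sy in (0, 400, 800)]
vectors = [(sx / 100.0, sy / 80.0) for sx, sy in screen]  # pretend eye data
coeffs = fit_gaze_map(vectors, screen)
gaze = estimate_gaze(coeffs, (6.4, 5.0))  # ≈ the centre of this screen
```

In a real calibration the participant fixates each of the nine points in turn while the tracker records the eye vectors; here those vectors are faked with a simple linear rule so the fit is exact.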

Two types of modern eye-movement tracking systems - Infra-red cameras built into the frame of a standard monitor (left), and a lightweight wireless system that can be worn in natural environments. Both systems manufactured by Tobii Technology.

Why is this interesting for psychologists though? The eye (or more accurately, the retina) is part of the central nervous system, and in fact is the only part of the brain that you can see from outside the body. Studying the eyes is therefore in a very real sense studying the brain. Eye-tracking has been used for a whole variety of applications. A lot of early work was focused on reading, and showed that the eyes don’t move smoothly over words – there are distinct patterns of fixations and saccades (the ‘moving’ part of eye-movements) associated with reading, and this pattern differs in some interesting ways in people with reading problems like dyslexia. Eye movements have also been shown to be different in a variety of other disorders, like schizophrenia and autism. Like most things involving the brain, if you dig deep enough, eye-movements are incredibly complex and can reveal a whole range of functions and effects. One of the effects that has been demonstrated is that (put very simply) the more we like something, or the more we find it engaging, the more time we tend to spend looking at it. For instance, even newborn babies will spend more time looking at pictures of faces than at pictures of household objects. Faces are important stimuli for babies, and we appear to be hard-wired to pay close attention to them.

This preferential-looking effect can be exploited in various ways in order to gauge participants’ reactions to different stimuli. A market research company might use it to evaluate reactions to different products, advertisements or packaging. The benefit is that you don’t have to ask your participants anything – you just tell them to look at the pictures on the screen. The data you get is (perhaps) more reliable as it doesn’t depend on verbal reports from the participants (who might lie, or express themselves poorly for other reasons).
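The measure behind this kind of study is essentially “dwell time”: how long the gaze spends inside each region of the display. A minimal sketch of that computation, assuming a stream of gaze samples at a fixed rate and simple rectangular areas of interest (the 60 Hz rate, coordinates, and function names are illustrative assumptions, not a real tracker’s API):

```python
SAMPLE_INTERVAL = 1 / 60  # seconds per gaze sample (assuming a 60 Hz tracker)

def dwell_times(samples, aois):
    """Total time the gaze spent inside each rectangular area of interest.

    samples: iterable of (x, y) gaze positions, one per sample tick.
    aois:    dict of name -> (left, top, right, bottom) screen rectangles.
    """
    totals = {name: 0.0 for name in aois}
    for x, y in samples:
        for name, (left, top, right, bottom) in aois.items():
            if left <= x <= right and top <= y <= bottom:
                totals[name] += SAMPLE_INTERVAL
    return totals

# Two pictures side by side; the viewer looks at the face three times longer.
aois = {"face": (0, 0, 400, 400), "object": (600, 0, 1000, 400)}
samples = [(100, 100)] * 90 + [(700, 100)] * 30  # 1.5 s vs 0.5 s of gaze
totals = dwell_times(samples, aois)
```

A real analysis would first segment the raw samples into fixations and saccades and count only fixation time, but the preferential-looking comparison comes down to exactly this kind of per-region total.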

This brings me directly to the work that my colleague Tim’s been doing recently. Tim had the bright idea of extending this technique and making it interactive, so that what’s displayed on the screen changes throughout the experiment, based on what the participants have looked at in earlier trials. The way he managed this was to use an evolutionary algorithm: 16 pictures are presented (four at a time) on each trial (or evolutionary ‘generation’) and the data about which ones the participants look at most is recorded. This data then feeds back into the program, the least looked-at pictures ‘die’, and the other pictures ‘survive’ to pass their characteristics on to the next trial/generation. By also introducing variations (‘mutations’) into aspects of the pictures like shape, colour, or size, an optimised stimulus is eventually reached, which is most engaging for that particular subject. Evolutionary algorithms like this are used a lot in computing for various purposes, and the degree to which they model actual biological evolution varies, but this is the first time that such an approach has been combined with eye-tracking, and it’s a very, very cool technique. The participant doesn’t even need any instructions – all they’re told is to sit still and look at the screen, and purely from the pattern of their eye-movements an optimal stimulus for that person can be ‘evolved’ in real time.
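The select-and-mutate loop described above can be sketched in a few lines. This is a toy version under my own assumptions: individuals are simple shape/colour/size records, the population size, grouping, and mutation rate are made up, and the gaze scores are simulated with a stand-in function, whereas in Tim’s experiment they come from the tracker’s dwell-time data. It is not his implementation, just an illustration of the scheme: show four pictures per trial, let the least-looked-at one ‘die’, and refill the population with mutated survivors.

```python
import random

POP_SIZE, GROUP, MUTATION_RATE = 16, 4, 0.2   # illustrative parameters
SHAPES = ["spiky", "round", "long-necked", "winged"]
COLOURS = ["red", "green", "blue", "purple"]

def random_dino():
    return {"shape": random.choice(SHAPES),
            "colour": random.choice(COLOURS),
            "size": random.uniform(0.5, 2.0)}

def mutate(dino):
    """Copy a survivor with small random changes ('mutations')."""
    child = dict(dino)
    if random.random() < MUTATION_RATE:
        child["shape"] = random.choice(SHAPES)
    if random.random() < MUTATION_RATE:
        child["colour"] = random.choice(COLOURS)
    child["size"] *= random.uniform(0.9, 1.1)
    return child

def generation(population, dwell_time):
    """One set of trials: in each group of 4 pictures, the least-viewed
    one dies; mutated copies of survivors refill the population."""
    survivors = []
    for i in range(0, len(population), GROUP):
        group = sorted(population[i:i + GROUP], key=dwell_time, reverse=True)
        survivors.extend(group[:-1])          # least-viewed picture 'dies'
    while len(survivors) < POP_SIZE:
        survivors.append(mutate(random.choice(survivors)))
    return survivors

# Stand-in for real gaze data: pretend this viewer prefers big purple dinos.
def fake_dwell(dino):
    return dino["size"] + (1.0 if dino["colour"] == "purple" else 0.0)

pop = [random_dino() for _ in range(POP_SIZE)]
for _ in range(30):
    pop = generation(pop, fake_dwell)
best = max(pop, key=fake_dwell)  # should tend towards a large purple dino
```

With a human in the loop, `fake_dwell` is replaced by the measured looking times for each on-screen picture, and each ‘generation’ is one screen of four dinosaurs – which is why the whole thing runs in about five minutes with no instructions beyond “look at the screen”.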

One frame from the evolving dinosaurs experiment which ran at the Science Museum recently.

After getting this technique working properly Tim was asked by the London Science Museum to be part of the “Who Am I?” exhibit that was recently running. Tim made a version of his experiment that involved cartoon dinosaurs, and visitors to the museum were able to take part in the experiment and ‘evolve’ their own optimal dinosaur in about 5 minutes. Over the 11 weeks that the exhibit was in place, Tim and his team were able to test 1,400 people (aged from 3 to 76, and from 45 countries!), which certainly means it’s one of the biggest eye-movement experiments ever conducted, and maybe even the biggest ever. A lot more information about the experiment and some really good explanatory videos can be found on the project’s website. Tim tells me he’s currently analysing the massive amount of data that was acquired in the project, and I’ll definitely post an update here when he’s finished and has some results to share. If you’re desperate to try the task yourself, it will be running again as part of Royal Holloway University’s Science Open Day in February 2012.

Tim’s next projects are working with colleagues at the University of Liverpool on a study of preference for symmetry, and working with a charity called SpecialEffect to adapt his eye-gaze-driven evolutionary algorithm into a computer interface and learning tool for children with disabilities. This is, needless to say, totally awesome, and furthermore shows that even the most technical and data-driven areas of psychology research (like eye-movements) really can have solid, real world applications, given the right context and most importantly, the right technology.

*Also, eye-tracking/head-tracking at the lower-resolution end is becoming much cheaper and more accessible, and can now be done using simple webcams. The precision of such systems is much lower (about 5 degrees of visual angle) than that of specialised hardware, but still good enough for a whole range of applications. You can expect to see this technology coming to a PC (and even an iPad) near you very soon!


About Matt Wall

I do brains. BRAINZZZZ.

Posted on October 26, 2011, in Cool new tech, Experimental techniques, Hardware, Programming, Software.
