
Seriously cool toys – Tobii mobile eye-tracking glasses, Pivothead HD video-recording eye-wear, and the Affectiva Q-sensor

The Tobii mobile eye-tracking system. Awesome.

The other day I was lucky enough to help out with a bit of data-collection being run in a well-known London department store by the magnificent Tim Holmes of Acuity Intelligence. This meant that I got to examine some seriously cool bits of new hardware – and new gadgets (especially scientific ones) are basically my kryptonite, so it was generally pretty exciting.

The first thing we used was a mobile eye-tracking system designed and built by Tobii. The glasses have two cameras built in – one front-facing, to record video of what the participant is looking at, and an infra-red camera to record the participant’s exact eye-position. They also capture sound in real time, and record the eye-tracking data at 30Hz. The system comes with a clip-on box where the data is actually recorded (in the background of the picture on the right), which is also used for the (fairly brief and painless) initial calibration. It seems like a really great system – the glasses are very light, comfortable and unobtrusive – and could have a really broad range of applications for research, both scientific and marketing-related.

The next cool toy I got to play with was a pair of these:

Pivothead ‘Durango’ HD video-recording glasses. Double awesome.

These are glasses with a camera lens in the centre of the frame (between the eye-lenses) which can record full high-definition video – 1080p at 30 fps, using an 8-megapixel sensor. Amazing! They have 8GB of onboard memory, good for about an hour of recording time, and a couple of discreet buttons on the top of the right arm which can be used for taking still pictures in five-picture burst or six-picture time-lapse mode. They’re made by a company called Pivothead, and seem to be intended more for casual/recreational/sports use than as a research technology (hence the ‘cool’ styling). They’re a reasonably bulky pair of specs, but very light and comfortable, and I don’t think you’d attract much attention filming with them. It’s worth checking out the videos page on their website for examples of what they can do. They’re also only $349 – a lot for a pair of sunglasses, but if you can think of a good use for them, that seems like a snip. If you’re in the UK, they’re also available direct from the Acuity Intelligence website for £299, inc. VAT. I wonder how long it’ll be before they start showing up in law-enforcement/military situations?

The third device I got to geek-out over was one of these little beauties:

The Affectiva mobile, wrist-worn, Bluetooth GSR sensor. Triple awesome.

This is a ‘Q-Sensor’, made by a company called Affectiva, and is about the size of an averagely chunky wristwatch. It has two little dry-contact electrodes on the back which make contact with the skin on the underside of the wrist, and it also contains a 3-axis accelerometer and a temperature sensor. This little baby claims to be able to log skin conductance data (plus data from the other sensors) for 24 hours straight on a single charge, and will even stream the data ‘live’ via Bluetooth to a connected system for on-the-fly analysis. It seems like Affectiva are mainly pitching it as a market research tool, but I can think of a few good ‘proper’ research ideas that it would enable as well. This is seriously cool technology.
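To give a flavour of what that on-the-fly analysis might look like, here’s a very rough Python sketch. To be clear, this isn’t Affectiva’s actual API or software – the sampling rate, the threshold and the simulated data feed are all invented for illustration. It just shows the general idea of watching a live skin conductance stream and flagging sudden rises, as a crude stand-in for proper skin conductance response (SCR) detection:

```python
# A minimal sketch (not Affectiva's actual API) of on-the-fly analysis of a
# live skin conductance stream: flag any sudden rise in conductance, a crude
# version of skin conductance response (SCR) detection. The feed is simulated;
# a real system would read samples from the Bluetooth connection instead.

import random

SAMPLE_RATE_HZ = 8          # assumed sampling rate, for illustration only
SCR_THRESHOLD_US = 0.05     # minimum rise (microsiemens) to count as a response

def simulated_samples(n=200):
    """Stand-in for the live feed: slow baseline drift plus two sudden bursts."""
    level = 2.0
    for i in range(n):
        level += random.gauss(0, 0.005)
        if i in (60, 140):              # inject two artificial 'arousal' events
            level += 0.3
        yield level

def detect_scrs(samples, window=SAMPLE_RATE_HZ):
    """Flag any point where conductance rose by more than the threshold
    over the preceding one-second window."""
    buf = []
    for t, value in enumerate(samples):
        buf.append(value)
        if len(buf) > window:
            buf.pop(0)
            if value - buf[0] > SCR_THRESHOLD_US:
                print(f"SCR candidate at t={t / SAMPLE_RATE_HZ:.1f}s "
                      f"(rise of {value - buf[0]:.3f} uS)")

detect_scrs(simulated_samples())
```

A real analysis pipeline would obviously do something smarter than print to the console, but the principle – streamed data, analysed as it arrives – is the same.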

That’s all folks – TTFN.

Evolution, Dinosaurs, and the World’s Biggest Eye-Tracking Experiment

Dr Tim Holmes (standing, centre) testing a participant at the Science Museum. The eye-tracking hardware is the black box just underneath the monitor.

In this post I’m very pleased to be able to write about the work of a good friend and colleague of mine, Dr Tim Holmes, of Royal Holloway University. Tim’s work is on eye-movement tracking, and he’s recently finished collecting the data for what may be the world’s largest eye-tracking experiment – one which also involved evolution, and dinosaurs. More on that below, but first – what’s eye-movement tracking?

Eye-tracking is the procedure of measuring the position of the eyes, relative to the head, and using that information to infer where someone is looking. The most popular method is to use a video camera (often an infra-red camera) to record the participant’s eyes. The infra-red light provides good contrast between the pupil (which shows up as dark) and a reflection from the cornea (bright), and using some relatively simple trigonometry, the position of gaze on a surface in front of the participant can be computed. Eye-trackers used to be pretty clunky devices that required the participant to sit absolutely still, and also needed a lot of calibration; however, some modern systems can track the position of the head as well and subtract it from the eye-movement data, which means the subject can move their head around naturally. There are also lightweight, wireless, head-mounted systems available, which means eye-movements can be recorded in natural environments with people moving around.*
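To make the ‘relatively simple trigonometry’ a bit more concrete, here’s a toy Python sketch of the basic idea: the tracker measures the vector between the pupil centre (dark) and the corneal reflection (bright), and a short calibration – where the participant fixates known points on the screen – gives enough data to fit a mapping from that vector to screen position. All the numbers below are invented, and real trackers use more sophisticated models, but the principle is the same:

```python
# A minimal sketch of gaze estimation from the pupil-to-corneal-reflection
# vector. All coordinates here are invented for illustration.

import numpy as np

# Calibration data: eye vectors (pupil centre minus corneal reflection, in
# camera pixels) recorded while the participant fixated known screen points.
eye_vecs = np.array([[-3.0, -2.1], [0.1, -2.0], [3.2, -1.9],
                     [-2.9,  0.0], [0.0,  0.1], [3.0,  0.2],
                     [-3.1,  2.0], [0.2,  2.1], [3.1,  2.2]])
screen_pts = np.array([[x, y] for y in (100, 400, 700) for x in (200, 800, 1400)])

# Fit an affine map screen = [ex, ey, 1] @ A by least squares; real systems
# typically use a higher-order polynomial, but the idea is the same.
design = np.hstack([eye_vecs, np.ones((len(eye_vecs), 1))])
A, *_ = np.linalg.lstsq(design, screen_pts, rcond=None)

def gaze_on_screen(eye_vec):
    """Map a new pupil-to-reflection vector to an (x, y) screen coordinate."""
    return np.append(eye_vec, 1.0) @ A

print(gaze_on_screen([1.5, 0.05]))   # lands roughly centre-right of the screen
```

Head-tracking systems add another step – subtracting the head’s position and orientation before this mapping – but the core calculation is just this kind of calibrated geometry.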

Two types of modern eye-movement tracking systems – infra-red cameras built into the frame of a standard monitor (left), and a lightweight wireless system that can be worn in natural environments (right). Both systems are manufactured by Tobii Technology.

Why is this interesting for psychologists though? The eye (or more accurately, the retina) is part of the central nervous system – in fact it’s the only part of the brain that you can see from outside the body. Studying the eyes is therefore, in a very real sense, studying the brain. Eye-tracking has been used for a whole variety of applications. A lot of early work focused on reading, and showed that the eyes don’t move smoothly over words – there are distinct patterns of fixations and saccades (the ‘moving’ part of eye-movements) associated with reading, and these patterns differ in some interesting ways in people with reading problems like dyslexia. Eye-movements have also been shown to be different in a variety of other disorders, like schizophrenia and autism. Like most things involving the brain, if you dig deep enough, eye-movements are incredibly complex and can reveal a whole range of functions and effects. One effect that has been demonstrated is that (put very simply) the more we like something, or the more we find it engaging, the more time we tend to spend looking at it. For instance, even newborn babies will spend more time looking at pictures of faces than at pictures of household objects. Faces are important stimuli for babies, and we appear to be hard-wired to pay close attention to them.
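Incidentally, the fixation/saccade distinction is, at bottom, a velocity threshold: during a saccade the eye sweeps across the scene very fast, while during a fixation it barely moves. Here’s a minimal Python sketch of velocity-threshold classification (often called I-VT in the eye-tracking literature), with invented sample data – real implementations also filter noise, merge nearby fixations, and so on:

```python
# Velocity-threshold (I-VT) classification of gaze samples into fixations
# and saccades. Sample data and parameter values are invented for illustration.

SAMPLE_RATE_HZ = 30            # matches the 30Hz Tobii glasses mentioned above
VELOCITY_THRESHOLD = 30.0      # degrees/second; a common rough cut-off

def classify(gaze, rate=SAMPLE_RATE_HZ, threshold=VELOCITY_THRESHOLD):
    """Label each consecutive pair of gaze samples (in degrees of visual
    angle) as part of a fixation or a saccade, by instantaneous velocity."""
    labels = []
    for (x0, y0), (x1, y1) in zip(gaze, gaze[1:]):
        velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 * rate
        labels.append("saccade" if velocity > threshold else "fixation")
    return labels

# A fixation near (0, 0), a fast jump, then a fixation near (10, 5):
trace = [(0.0, 0.1), (0.1, 0.0), (0.0, 0.0), (10.1, 5.0), (10.0, 5.1), (10.1, 5.1)]
print(classify(trace))
# ['fixation', 'fixation', 'saccade', 'fixation', 'fixation']
```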


This preferential-looking effect can be exploited in various ways to gauge participants’ reactions to different stimuli. A market research company might use it to evaluate reactions to different products, advertisements or packaging. The benefit is that you don’t have to ask your participants anything – you just tell them to look at the pictures on the screen. The data you get is (perhaps) more reliable, as it doesn’t depend on verbal reports from the participants (who might lie, or simply express themselves poorly).
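In practice, the way this preference measure is usually computed is refreshingly simple: divide the screen into ‘areas of interest’ (AOIs), one per picture, and total up how long the gaze sat inside each one. Here’s a minimal Python sketch with invented AOIs and gaze samples:

```python
# Dwell time per area of interest (AOI) as a preference measure.
# AOI rectangles and gaze samples are invented for illustration.

SAMPLE_INTERVAL_S = 1 / 30       # one gaze sample every 1/30 s, as above

# AOIs as (left, top, right, bottom) screen rectangles, one per picture.
aois = {
    "picture_A": (0,   0,   500,  400),
    "picture_B": (600, 0,   1100, 400),
    "picture_C": (0,   500, 500,  900),
    "picture_D": (600, 500, 1100, 900),
}

def dwell_times(gaze_samples):
    """Accumulate looking time per AOI from a list of (x, y) gaze points."""
    totals = {name: 0.0 for name in aois}
    for x, y in gaze_samples:
        for name, (l, t, r, b) in aois.items():
            if l <= x <= r and t <= y <= b:
                totals[name] += SAMPLE_INTERVAL_S
                break
    return totals

# 1.5 s on picture A, 3 s on picture B, 0.5 s on picture C:
samples = [(100, 100)] * 45 + [(800, 200)] * 90 + [(300, 700)] * 15
print(dwell_times(samples))
```

Whichever picture accumulates the most dwell time is taken as the most engaging for that participant – no questions asked.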

This brings me directly to the work my colleague Tim has been doing recently. Tim had the bright idea of extending this technique and making it interactive, so that what’s displayed on the screen changes throughout the experiment, and the changes are based on what the participant has been doing in the earlier trials. The way he managed this was to use an evolutionary algorithm: 16 pictures are presented (four at a time) on each trial (or evolutionary ‘generation’), and data about which ones the participant looks at most is recorded. This data feeds back into the program – the least looked-at pictures ‘die’, and the other pictures ‘survive’ to pass their characteristics on to the next trial/generation. By also introducing variations (‘mutations’) into aspects of the pictures like shape, colour, or size, an optimised stimulus is eventually reached which is maximally engaging for that particular subject. Evolutionary algorithms like this are used a lot in computing for various purposes, and the degree to which they model actual biological evolution varies, but this is the first time such an approach has been combined with eye-tracking, and it’s a very, very cool technique. The participant doesn’t even need any instructions – all they’re told is to sit still and look at the screen, and purely from the pattern of their eye-movements an optimal stimulus for that person can be ‘evolved’ in real time.
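For the curious, here’s a stripped-down Python sketch of a gaze-driven evolutionary loop of this general kind. I should stress this is my own illustration, not Tim’s actual implementation: each stimulus is reduced to a shape/colour/size description, and the dwell times that would come from the eye-tracker are simulated with a hidden ‘preference’ that the algorithm itself never sees:

```python
# A toy gaze-driven evolutionary algorithm: selection by (simulated) dwell
# time, plus mutation of shape, colour and size. Illustration only.

import random

SHAPES = ["spiky", "round", "long-necked"]
COLOURS = ["red", "green", "blue", "purple"]

def random_stimulus():
    return {"shape": random.choice(SHAPES),
            "colour": random.choice(COLOURS),
            "size": random.uniform(0.5, 2.0)}

def simulated_dwell(stim):
    """Stand-in for measured looking time: this imaginary participant
    likes big, purple, long-necked stimuli. Gaze data is noisy."""
    score = stim["size"]
    score += 1.0 if stim["colour"] == "purple" else 0.0
    score += 1.0 if stim["shape"] == "long-necked" else 0.0
    return score + random.uniform(0, 0.5)

def mutate(stim):
    """Copy a surviving stimulus with one randomly varied characteristic."""
    child = dict(stim)
    key = random.choice(["shape", "colour", "size"])
    if key == "size":
        child["size"] = min(2.0, max(0.5, child["size"] + random.gauss(0, 0.2)))
    else:
        child[key] = random.choice(SHAPES if key == "shape" else COLOURS)
    return child

population = [random_stimulus() for _ in range(16)]
for generation in range(20):
    # Record dwell time for all 16 pictures (in the real experiment they
    # would be shown four at a time across the generation).
    dwell = {i: simulated_dwell(s) for i, s in enumerate(population)}
    # The least looked-at half 'die'; survivors breed mutated copies.
    ranked = sorted(range(16), key=dwell.get, reverse=True)
    survivors = [population[i] for i in ranked[:8]]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(8)]

print("Evolved stimulus:", max(population, key=simulated_dwell))
```

After twenty generations the population converges on big, purple, long-necked stimuli – that is, on whatever the (simulated) participant most likes to look at, without them ever being asked a single question.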

One frame from the evolving dinosaurs experiment which ran at the Science Museum recently.

After getting this technique working properly, Tim was asked by the London Science Museum to be part of its recent “Who Am I?” exhibit. Tim made a version of his experiment involving cartoon dinosaurs, and visitors to the museum were able to take part and ‘evolve’ their own optimal dinosaur in about five minutes. Over the 11 weeks the exhibit was in place, Tim and his team were able to test 1400 people (aged from 3 to 76, and from 45 countries!), which certainly makes it one of the biggest eye-movement experiments ever conducted, and maybe even the biggest ever. A lot more information about the experiment, and some really good explanatory videos, can be found on the project’s website. Tim tells me he’s currently analysing the massive amount of data acquired in the project, and I’ll definitely post an update here when he’s finished and has some results to share. If you’re desperate to try the task yourself, it will be running again as part of Royal Holloway University’s Science Open Day in February 2012.

Tim’s next projects involve working with colleagues at the University of Liverpool on a study of preference for symmetry, and working with a charity called SpecialEffect to adapt his eye-gaze-driven evolutionary algorithm into a computer interface and learning tool for children with disabilities. This is, needless to say, totally awesome, and furthermore it shows that even the most technical and data-driven areas of psychology research (like eye-movements) really can have solid, real-world applications, given the right context and, most importantly, the right technology.

*Also, eye-tracking/head-tracking at the lower-resolution end is becoming much cheaper and more accessible, and can now be done using simple webcams. The precision of such systems is much lower (about 5 degrees of visual angle) than that of specialised hardware, but still good enough for a whole range of applications. You can expect to see this technology coming to a PC (and even an iPad) near you very soon!