Blog Archives

iPad app in development to help with macular degeneration

I’ve written before about iPad apps useful for vision research, but I’ve just come across a new vision-related app, so new in fact that it’s still in the development/testing phase. It’s been produced by my old colleague Prof. Robin Walker at Royal Holloway, University of London, and is designed as a rehabilitative tool for people with Macular Degeneration (MD).

Age-related MD is by far the most common form of blindness/vision-loss in people over 50, and involves degeneration of the visual sensitivity of the centre portion of the retina – the part of the eye which has the highest density of cone photoreceptors. This makes tasks such as reading and recognising faces more and more difficult as the condition progresses. One way of mitigating the effects is to try and use portions of the retina which are less affected, i.e. the periphery. For reading, the ‘eccentric-vision’ and ‘steady-eye’ techniques involve fixating at a point and then moving the text through areas of the visual field which are less affected. These techniques require some practice to counteract the natural tendency to make eye-movements when reading, and it’s this training process that the app is intended to help with.

Read more about the app here, and there’s also a (pay-walled) article in the British Journal of Ophthalmology here.

Video analysis software – Tracker

I just came across a gosh-darn drop-dead cool (and free!) piece of software that I just had to write a quick post on. It’s called Tracker, it’s cross-platform, open-source and freely available here.

In a nutshell, it’s designed for analysis of videos, and can do various things, like track the motion of an object across frames (yielding position, velocity and acceleration data) and generate dynamic RGB colour profiles. Very cool. As an example of the kinds of things it can do, see this post, where a physicist uses it to analyse the speed of blaster bolts in Star Wars: Episode IV. Super-geeky I know, but I love it.
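As an aside, Tracker can export its frame-by-frame position data as a simple table, and it’s straightforward to derive velocity and acceleration from such data yourself. Here’s a minimal sketch in Python/NumPy – the frame rate and the free-fall trajectory are just made-up toy values, not anything from a real Tracker export:

```python
import numpy as np

# Toy stand-in for a Tracker export: vertical position (metres) of a
# dropped object, sampled at an assumed 30 frames per second.
fps = 30.0
t = np.arange(0, 2, 1 / fps)      # 2 seconds of video
x = 0.5 * 9.81 * t ** 2           # idealised free fall, x = ½gt²

# Numerical differentiation recovers velocity and acceleration
v = np.gradient(x, t)             # dx/dt (m/s)
a = np.gradient(v, t)             # dv/dt (m/s²)

# Away from the edges of the trace, a should recover g ≈ 9.81 m/s²
print(a[2:-2].mean())
```

The same two `np.gradient` calls would work on a real exported position column, though with noisy video data you’d typically want to smooth the trace before differentiating.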

An example of some motion analyses conducted using Tracker

Whenever I see a piece of software like this I immediately think about what I could use it for in psychology/neuroscience. In this case, I immediately thought about using it for kinematic analysis – that is, tracking the position/velocity/acceleration of the hand as it performs movements or manipulates objects. Another great application would be for analysis of movie stimuli for use in fMRI experiments. Complex and dynamic movies could be analysed in terms of the movement (or colour) stimuli they contain and measures produced which represent movement across time. Sub-sampled versions of these measures could then be entered into an fMRI GLM analysis as parametric regressors to examine how the visual cortex responds; with careful selection of stimuli, this could be quite a neat and interesting experiment.

Not sure I’ll ever actually need to use it in my own experiments, but it looks like a really neat piece of software which could be a good solution for somebody with a relevant problem.


iPad app for generating visual psychophysics stimuli

I’ve been meaning to write a new post which would be an update to my previous one on good psychology-related iPhone/iPad apps for a while now, but I just came across one app which is just too good not to share immediately. It’s a free app called RFSpotter, written by Nicolas Cottaris of the IRCS and Dept. of Psychology at The University of Pennsylvania, and it generates simple visual psychophysics stimuli for use in mapping receptive fields and their tuning properties. It has a very slick interface, where stimulus size, position and rotation can all be controlled by the usual iOS finger-gestures (e.g. pinch-to-zoom to change stimulus size, two-finger rotation for orientation), with many other parameters editable through a pop-up menu. It will do gratings, patches, dot-clouds, coloured stimuli – all kinds of things! Very, very neat indeed.

See this page for more details and a video of it in action, and visit the iTunes store here to download it.

Some screenshots:

The iPad really has the potential to be a serious platform for research, and it’s tools like this that will make it possible to do some really interesting work with it – here’s hoping we see many more specialist, research-oriented apps like this in the future!

Image Morphing and Psychology Research – A Case Study

As an example of the ways in which technology and psychology have developed together recently, I thought it would be fun to do a little case-study of a particular area of research which has benefitted from advances in computer software over recent years. Rather than talk about the very technical disciplines like brain imaging (which have of course advanced enormously recently) I thought it would be more fun to concentrate on an area of relatively ‘pure’ psychology, and one of the most important and fundamental cognitive processes, present pretty much from birth: face perception.

In November 1991 Michael Jackson released the single ‘Black or White’, the first to be released from his eighth album ‘Dangerous’. The single is of some significance as it marked the beginning of Jackson’s descent from the firmament of music stardom into the spiral of musical mediocrity and personal weirdness which only ended with his death in 2009, but for the purposes of the present discussion it is interesting because of part of its accompanying video. Towards the end of the video a series of people of both sexes and of various ethnic groups are shown singing along with the song, and the images of their faces morph into each other in series:

Read the rest of this entry