TL;DR: The EyeTribe tracker is really cheap, and you can now use it in Python and in Matlab. It's not as bad as you would expect from its price, and you could probably use it in fixation or pupillometry studies. I highly advise using a chin rest, though. (A TL;DR at the top of a post provides a short summary. Interweb nerds tend to write one if they feel a post is too long to read, mostly due to their low attenshun span.)

The gadget world is full of neat eye-tracking interfaces, from an iPhone version to a fully functioning laptop. But these are all fairly pricey and complex, making them niche devices rather than widely adoptable tools.

Now a Honduran teenager has an eye tracker that addresses the problem: a $300 open-source kit meant for people with disabilities. Recent high school grad Luis Cruz, 18, started out tinkering with video game technology, and says he designed the Eyeboard to help people communicate using eye movements. It is based on electrooculography, which takes advantage of the eyes' electrical potential. The eye acts as a dipole, with the cornea positively charged and the retina negatively charged. As a person's eye moves, the negative pole moves relative to the person's face. Attaching two electrodes to a person's head near his or her eyes can capture this change. Cruz integrated the electrodes into a pair of glasses and connected them to an amplifier. The device is based on an ATmega328P microcontroller and runs on software Cruz wrote himself.

Relatively recently, a new player came to the eye-tracker market. It introduced the EyeTribe tracker: a portable eye tracker with a never-seen-before price tag of $100. This is suspiciously cheap, so I set out to find out if the tracker was any good at all.

The EyeTribe comes with a Software Development Kit (SDK) and tutorials for three programming languages: C#, C++, and Java. Although I have some experience in C++, I'm not fluent in any of these three languages. So I decided to write my own wrapper to be able to interface with the EyeTribe via Python (my favourite language). This was easy enough, due to the incredibly brilliant API. The result was a new Python wrapper, which I quickly integrated into PyGaze. (If you want to use the EyeTribe tracker with PyGaze, simply set the TRACKERTYPE to 'eyetribe'.) In addition, I dusted off my Matlab installation and programmed a simple EyeTribe Toolbox for Matlab. Although I wrote it to benefit my own lab, none of the lab members are currently using it (we prefer the beefier EyeLink 1000). That doesn't mean my efforts went to waste: people from other labs in my department are putting it to good use, and I've received emails with questions and/or thanks from people all over the world. In fact, UCL postdoc Benjamin de Haas helped me debug a few timing issues, and Cambridge PhD Jan Freyberg improved the calibration routine.

Once I could communicate with the EyeTribe via PyGaze, I could start comparing it with other trackers. The aim was simple: to see how the EyeTribe would hold up against the industry standard. My first idea was to create a setup in which participants' eye movements were recorded simultaneously by three trackers: the EyeLink 1000, the SMI RED-m, and the EyeTribe. The EyeLink 1000 is a powerhouse that sports a 1000 Hz sampling rate and high accuracy and precision, arguably the best video-based eye tracker out there. The RED-m is a portable eye tracker manufactured by SensoMotoric Instruments, with a slightly higher sampling rate (120 Hz) than the EyeTribe and comparable accuracy and precision. So that's what I did! Only to realise that it was an absolute disaster.

It's not easy to script a combined calibration and recording for three different eye trackers, and the hardware itself gets in the way. (Eye trackers work by comparing the position of the pupil with the position of one or more glints: reflections of the infrared light sources on the eyeball.) It turns out that the number of light sources is crucial to each tracker's algorithms. The EyeLink 1000 works with a single source of infrared light, whereas the RED-m and the EyeTribe work with two sources; run them side by side and each camera also picks up the glints produced by the other devices' light sources. So simultaneous recordings were out of the question. That is a huge setback: if you want to do a proper, direct comparison of two trackers, you should really do it on the exact same eye movements. The second-best alternative is to test each tracker on the same participants in two consecutive recordings. For various practical reasons, I decided not to use the RED-m, but to compare only the EyeTribe and the EyeLink 1000.
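To give a flavour of why the EyeTribe's API is so pleasant to work with: the tracker's server speaks plain JSON over a local TCP socket, so a client mostly sends small requests and parses the replies. The sketch below fakes one such exchange; the field names and the default port (6555) are quoted from memory of the SDK documentation and may differ in your version, and the reply is a simplified, hypothetical frame. This is the kind of plumbing a wrapper hides from you.

```python
import json

# A client asks the EyeTribe server (TCP, localhost:6555 by default)
# for a gaze frame with a small JSON request like this:
request = json.dumps({
    "category": "tracker",
    "request": "get",
    "values": ["frame"],
})

# A simplified, hypothetical reply; a real frame carries many more
# fields (per-eye gaze estimates, pupil size, state flags, ...):
reply = """{
  "category": "tracker",
  "statuscode": 200,
  "values": {"frame": {"timestamp": "2014-05-02 11:00:00.000",
                       "avg": {"x": 512.3, "y": 384.7}}}
}"""

def parse_gaze(msg):
    """Extract the smoothed binocular gaze estimate from a frame reply."""
    frame = json.loads(msg)["values"]["frame"]
    return frame["avg"]["x"], frame["avg"]["y"]

x, y = parse_gaze(reply)
print(x, y)
```

In a real session you would of course read `reply` from the socket rather than a string; the point is that the whole protocol is human-readable JSON.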
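The pupil-and-glint principle described in this post can be made concrete with a toy calculation. This is not any vendor's actual algorithm, just a minimal sketch: the glint stays roughly fixed on the cornea while the pupil moves with gaze, so the glint-to-pupil vector varies with eye rotation, and a calibrated mapping (here an invented linear gain and offset) converts it into screen coordinates.

```python
# Toy illustration of pupil-minus-glint tracking (not any tracker's
# real algorithm). Coordinates are in camera pixels.

def pupil_glint_vector(pupil, glint):
    """Vector from the (roughly head-fixed) glint to the pupil centre."""
    return (pupil[0] - glint[0], pupil[1] - glint[1])

def to_screen(vec, gain=(50.0, 50.0), offset=(512.0, 384.0)):
    """Hypothetical linear calibration: gain and offset would be
    estimated during the calibration routine."""
    return (offset[0] + gain[0] * vec[0], offset[1] + gain[1] * vec[1])

v = pupil_glint_vector(pupil=(310.0, 220.0), glint=(306.0, 218.0))
print(to_screen(v))  # gaze estimate in screen pixels
```

Real systems fit richer (e.g. polynomial) mappings from many calibration points, which is exactly why the number and placement of the infrared sources matters so much to each tracker's algorithms.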
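To make the electrooculography idea concrete: within roughly thirty degrees of straight ahead, the horizontal EOG voltage grows approximately linearly with gaze angle, so a two-point calibration already gives a usable voltage-to-angle mapping. The numbers below are invented for illustration; this is a sketch of the general EOG approach, not Cruz's actual Eyeboard code.

```python
# Toy EOG calibration: fit angle = slope * voltage + intercept from
# two known fixation targets, then convert new voltage readings
# (in microvolts) into approximate gaze angles (in degrees).

def fit_linear(v1, a1, v2, a2):
    """Two-point linear fit mapping voltage to gaze angle."""
    slope = (a2 - a1) / (v2 - v1)
    return slope, a1 - slope * v1

def angle(voltage_uv, slope, intercept):
    """Estimated horizontal gaze angle for one EOG sample."""
    return slope * voltage_uv + intercept

# Invented calibration: -300 uV when fixating 30 degrees left,
# +300 uV when fixating 30 degrees right.
slope, intercept = fit_linear(-300.0, -30.0, 300.0, 30.0)
print(angle(150.0, slope, intercept))
```

The linearity only holds over a limited range, and the signal drifts over time, which is part of why video-based trackers dominate research use despite EOG's simplicity and low cost.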