Tracking The Dark Side

6 min read · Dec 16, 2022


What is there to find in your eyes?

Imagine a pair of glasses that could detect if you were looking at a far away object and would automatically zoom in for you; a keyboard that types as your eyes glance from word to word; or even a search tool that tells you about that thing you were looking at just a little too long. Futuristic devices like these that can monitor eye movements have a variety of applications ranging from diagnosing eye disorders to providing interactive gaming experiences. The key to designing such systems is gaze-tracking: measuring the eye movement in real-time with precision.

The most popular approach to gaze-tracking today is computer vision, in which cameras capture pictures of the eye and algorithms process the data in real time to determine how the eye is moving. Modern computer vision programs incorporate machine learning algorithms to detect patterns in images and efficiently classify objects. Despite the breakthroughs in this field, these algorithms suffer from substantial computational complexity. Training machine learning models and deciphering images quickly requires high-performance chips, which consume a significant amount of power. While feasible in a self-driving car or a smartphone, embedding a computer vision system in a pair of eyeglasses is very challenging.

In the Devices division of Neurotech@Berkeley, we focus on developing hardware to measure and analyze electrical signals generated by our bodies. Inspired by the potential of gaze-tracking systems, we set out to explore a technique with substantially lower computational cost than computer vision: electrooculography (EOG). Unlike computer vision, which relies on cameras, EOG relies on intrinsic electrical signals generated by the eye itself.

What is EOG?

In 1951, Elwin Marg, an American neuroscientist and optometrist at UC Berkeley, discovered and defined electrooculography. EOG is a physiologic test that uses several carefully placed electrodes to measure the standing electrical potential between Bruch’s membrane (at the back of the eye) and the cornea. Because the eye forms a dipole, with the cornea positively charged relative to the retina, eye movement changes the electrical signal detected by the EOG electrodes. Imagine two electrodes, one placed just above the eyebrow and one placed just underneath the eye. As the subject looks up, the cornea moves closer to the top electrode and Bruch’s membrane moves closer to the bottom electrode, so the potential difference between the two electrodes increases. Likewise, as the subject glances down, the potential decreases. Horizontal eye movement can be detected the same way by placing electrodes to the left and right of the eye. Through this simple mechanism, EOG can be leveraged to detect blinks, winks, and several other forms of eye movement.
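The vertical-channel logic above can be sketched in a few lines of Python. The threshold and voltage values here are illustrative assumptions, not calibrated measurements; real EOG amplitudes and polarities depend on electrode placement and the recording hardware.

```python
# A minimal sketch of vertical gaze classification from two electrodes:
# one above the eyebrow, one below the eye. Values are hypothetical.

def classify_vertical_gaze(v_top_minus_bottom_uv, threshold_uv=50.0):
    """Classify gaze from the potential difference (in microvolts)
    between the top and bottom electrodes."""
    if v_top_minus_bottom_uv > threshold_uv:
        return "up"      # cornea (positive pole) nears the top electrode
    if v_top_minus_bottom_uv < -threshold_uv:
        return "down"    # cornea nears the bottom electrode
    return "center"      # potential difference near baseline

print(classify_vertical_gaze(120.0))   # "up"
print(classify_vertical_gaze(-90.0))   # "down"
```

The same comparison applied to a left/right electrode pair yields horizontal gaze, giving a coarse two-axis tracker from just four electrodes.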

Shortly after its discovery, EOG was applied primarily in medicine: diagnosing mental and sleep disorders, building assistive technology for those who have sustained injuries, and more. More recently, it has been integrated into more accessible technologies, including virtual reality (VR) applications and other consumer products. Notably, EOG data can be used for gaze-tracking: researchers have detected eye movements as small as 1.5 degrees. Suddenly, the gaze-tracking problem we described earlier becomes significantly simpler. No longer do we require sophisticated computer vision algorithms.

Among the countless applications for an EOG-based eye-tracker, we think three stand out.

Looking at the Applications

An EOG headset would make it possible to measure signals in real time with high precision, allowing for a number of different applications in a variety of settings.

1. Tracking Consumer Behavior:

There are a number of possible applications for EOG tracking within the realm of ads and product marketing. Currently, much product and marketing research is done through focus groups or online surveys (sometimes served through ads targeted at consumers), but these methods don’t provide immediate feedback on customers’ reactions to a product. For example, ad designers may want to see which part of an ad copy a potential customer’s eyes are drawn to first, or test whether the design they’ve created comes across to consumers as intended. With an EOG headset, they could track in minute detail where customers’ eyes land at first glance, and measure second by second which parts of an ad a consumer is focusing on. In this way, EOG headsets are potentially a cost-effective and scalable way to track consumers’ visual attention.

2. Typing Assistance:

On a consumer level, EOG signals can support keyboard use on a computer. An EOG aid that pairs with a keyboard over Bluetooth and predicts what users are typing from the textual context and their gaze could increase typing speed and productivity. Proof of concept for devices like this already exists: products like the Tobii Eye Tracker are already used by gamers to increase click rate and perform in-game actions faster than is possible with a joystick. The applications extend beyond gaming and productivity. For those unable to type on a standard keyboard and who cannot utilize speech-to-text, on-screen keyboards paired with an EOG tracker can allow for faster communication.

3. Stepping into Virtual Reality:

EOG could also play a crucial role in increasing the mass appeal of VR and the Metaverse. Today, VR remains a novelty. Clunky headsets, stiff controls, and countless other barriers to immersion have trapped the technology in an uncanny valley where VR environments do not feel convincingly real. By extension, these barriers to immersion also threaten the viability of the Metaverse as a convincing replacement for real 3D spaces. However, with the help of EOG eye-tracking in VR headsets, these virtual environments could begin to seem a whole lot more real. In today’s VR headsets, users have to move their heads to look around their virtual environments. Since eye movements go unregistered, natural saccadic eye scanning is forcibly replaced by often awkward head movements, contributing greatly to the clunkiness of VR. With EOG-integrated headsets, participants could look naturally about their virtual environments by moving their eyes. Thus, EOG could increase the ease with which users navigate VR while also making it more immersive.

Designing Our Next Steps

When acquiring biomedical signals like EOG, there are two main challenges for hardware designers. First, these signals are extremely small in magnitude (on the order of microvolts), which makes them difficult to detect. Second, because they are so small, they are highly susceptible to electrical noise and interference. For instance, the lights and appliances in a room are powered by 120 V, 60 Hz AC power. These strong power-line signals can capacitively couple into our circuit and overwhelm the EOG signal, making it impossible to distinguish eye movements from noise.

The first part of our circuit consists of an instrumentation amplifier, which is responsible for rejecting sources of interference like the 60 Hz power supply. The next step is filtering. EOG signals are known to exist between about 0 and 50 Hz. Thus, we pass our EOG signal through filters that reject frequencies outside of this range. Finally, we pass our signal through an operational amplifier circuit. This circuit is responsible for providing a large gain to the signal, amplifying it to a level where it can be detected by a computer.
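Our headset performs this chain in analog hardware, but the same pipeline can be mimicked digitally. The sketch below, a rough digital analogue rather than our actual design, uses SciPy to notch out 60 Hz interference, band-limit the signal to the roughly 0–50 Hz EOG range, and apply gain; the sampling rate and filter parameters are illustrative assumptions.

```python
import numpy as np
from scipy import signal

fs = 250.0  # assumed sampling rate in Hz

# Synthetic trace: a slow 2 Hz "eye movement" plus 60 Hz mains interference.
t = np.arange(0, 2, 1 / fs)
eog = np.sin(2 * np.pi * 2 * t)
noisy = eog + 0.5 * np.sin(2 * np.pi * 60 * t)

# Reject 60 Hz mains, playing the role of the instrumentation amplifier.
b_notch, a_notch = signal.iirnotch(60.0, Q=30.0, fs=fs)
x = signal.filtfilt(b_notch, a_notch, noisy)

# Band-limit to the ~0-50 Hz EOG range with a low-pass Butterworth filter.
b_lp, a_lp = signal.butter(4, 50.0, btype="low", fs=fs)
x = signal.filtfilt(b_lp, a_lp, x)

# "Gain stage": scale the cleaned signal, as the op-amp does in hardware.
x *= 1000.0
```

After these stages, the 2 Hz component survives nearly unchanged while the 60 Hz interference is strongly attenuated, mirroring what the analog front end does before the signal reaches a computer.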

Additionally, we’ve created a 3D printed headset to mount the circuit and electrodes. We are now in the process of developing efficient classification techniques in Python to distinguish between blinks and other eye movements in our signals. With all of this progress, we are on our way to making our gaze-tracking device a reality.
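One simple baseline for the classification step is threshold-based blink detection: blinks appear as large, brief spikes on the vertical channel. The sketch below is a hypothetical starting point, not our finished classifier; the threshold and refractory gap are illustrative guesses.

```python
import numpy as np

def detect_blinks(vertical_eog, fs, threshold, min_gap_s=0.25):
    """Return sample indices where blink spikes begin.
    A refractory gap keeps one blink from being counted many times."""
    min_gap = int(min_gap_s * fs)
    peaks = []
    last = -min_gap
    for i in np.flatnonzero(vertical_eog > threshold):
        if i - last >= min_gap:
            peaks.append(i)
        last = i
    return peaks

# Synthetic trace: quiet baseline with two blink-like spikes.
fs = 250
x = np.zeros(fs * 2)
x[100:110] = 300.0   # first "blink"
x[400:410] = 320.0   # second "blink"
print(detect_blinks(x, fs, threshold=150.0))  # [100, 400]
```

Distinguishing blinks from deliberate upward glances then becomes a question of spike duration and shape, which is where the more careful classification work comes in.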

This article was written by the Devices division of Neurotech@Berkeley. This article was edited by Publications Lead Jacob Marks and VP of Projects Ashwin Rammohan.




We write on psychology, ethics, neuroscience, and the newest in neural engineering. @UC Berkeley