Bio-Inspired Eye Tracker
Background
Objective measurement of gaze patterns and eye movements during untethered activity has important applications in neuroscience research and neurological disease detection. Current commercial eye-tracking tools rely on desktop devices with infrared emitters and conventional frame-based cameras. Although wearable options do exist, the high power consumption of conventional cameras limits true long-term mobile use. The query-driven Dynamic Vision Sensor (qDVS) is a neuromorphic camera that drastically reduces power consumption by outputting only intensity-change threshold events rather than full frames of light intensity data. Such hardware has not yet been implemented for on-body eye tracking, but its feasibility can be demonstrated with a mathematical simulator that evaluates the eye-tracking capabilities of the qDVS under controlled conditions.
Specifically, a framework built around a realistic human eye model in the 3D graphics engine Unity is presented to enable controlled, direct comparison of image-based gaze-tracking methods. Eye tracking based on qDVS frames was compared against two conventional frame-based methods: a traditional ellipse pupil-fitting algorithm and a deep learning neural network inference model. Gaze accuracy from qDVS frames generated in initial eye model experiments averaged 95.4% for movement along the primary horizontal axis (pitch angle) and 95.9% for movement along the primary vertical axis (yaw angle) under four different illumination conditions, demonstrating the feasibility of using qDVS hardware cameras for such applications. The quantitative framework for direct comparison of eye-tracking algorithms presented here is open-source and can be extended in the future to include other eye parameters, such as pupil dilation, reflection, and motion artifact. Beyond this framework, the project also includes a proof-of-concept schematic for shrinking the total footprint of the qDVS for potential future applications in marmoset research.
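As a rough illustration of the event-generation principle described above, the MATLAB sketch below turns a pair of conventional intensity frames into a signed qDVS-style event frame by thresholding the change in log intensity. The function name, contrast threshold, and log-intensity formulation are illustrative assumptions, not the actual sensor model used in the simulator.

% Illustrative only: signed event frame from two grayscale intensity frames.
% The contrast threshold and log-intensity formulation are assumptions,
% not the parameters of the real qDVS or its simulator.
function eventFrame = qdvsEventFrame(prevFrame, currFrame, threshold)
    % Change-detection sensors typically respond to relative (log) intensity
    logPrev = log(double(prevFrame) + 1);
    logCurr = log(double(currFrame) + 1);
    dL = logCurr - logPrev;

    % +1 = ON event (brightening), -1 = OFF event (darkening),
    %  0 = change below threshold (no event reported for that pixel)
    eventFrame = zeros(size(dL));
    eventFrame(dL >  threshold) =  1;
    eventFrame(dL < -threshold) = -1;
end

Because most pixels of a slowly moving eye stay below the threshold, such event frames are sparse, which is the source of the power savings described above.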
Subprojects
- PCB Redesign: Decreased the size and weight of the camera and imager (qDVS Extension Board) for future applications in marmoset research.
- Unity: Characterized the performance of three different eye-tracking methods using a generated 3D eye model in Unity.
- Image Processing: Generated a MATLAB image-processing script to quantitatively measure eye-gaze tracking of the Unity eye model; a sketch of a representative pupil-fitting step appears after this list.
- Canned Experiments: Generated qualitative data to evaluate the performance of the qDVS as it tracks eye gaze.
- Mechanical Design: Created SolidWorks files of a preliminary marmoset backpack design concept.
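For context on the Image Processing subproject above, the sketch below shows one common way to fit an ellipse to the pupil in a single grayscale eye image using MATLAB's Image Processing Toolbox. The file name, threshold choice, morphological cleanup, and the assumption that the pupil is the largest dark blob are simplifications for illustration and may differ from the actual script.

% Illustrative pupil ellipse fit for one grayscale eye image.
% Assumes the pupil is the largest dark connected region; the threshold
% and cleanup steps are placeholders, not the project's actual script.
img = im2gray(imread('eye_frame.png'));   % hypothetical file name
bw  = ~imbinarize(img);                   % dark pupil becomes foreground
bw  = imfill(bw, 'holes');                % fill specular highlights inside the pupil
bw  = bwareafilt(bw, 1);                  % keep only the largest blob

% Ellipse parameters (centroid, axis lengths, orientation) of the pupil
stats = regionprops(bw, 'Centroid', 'MajorAxisLength', ...
                        'MinorAxisLength', 'Orientation');
pupilCenter = stats.Centroid;             % (x, y) pupil center in pixels

The fitted pupil center can then be mapped to pitch and yaw gaze angles through the geometric eye model and compared against the known ground-truth orientation of the Unity model to compute gaze accuracy.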
Group Members
- Serena Tang: 4th-year Bioengineering major at UCSD, going to UCLA to pursue a Master's in Electrical Engineering.
- Keli Wang: 4th-year Bioengineering major at UCSD, going to UCSD to pursue a Master's in Computer Science.
- Stephanie Ogrey: 4th-year Bioengineering major at UCSD and incoming Development Engineer I at Illumina.
- Sana Khan: 4th-year Bioengineering major at UCSD and current Computer-Aided Design Drafter.
- Jorge Villazon: 4th-year Bioengineering major at UCSD, going to UCSD to pursue a PhD in Bioengineering.