Methodology

The pyETA tool handles eye-tracking data with a focus on real-time processing: it captures raw gaze data, classifies it with a velocity-threshold identification (I-VT) algorithm, and produces fixation start timestamps and durations for the researcher.
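
A minimal sketch of the I-VT idea, assuming gaze samples arrive as (timestamp, x, y) tuples in degrees of visual angle; the function name, data layout, and threshold value are illustrative, not pyETA's actual API:

```python
# Illustrative I-VT sketch: classify consecutive gaze samples as fixation or
# saccade by their angular velocity, then report fixation onsets and durations.
from typing import List, Tuple

def detect_fixations(samples: List[Tuple[float, float, float]],
                     velocity_threshold: float = 30.0):  # deg/s, a common I-VT default
    """samples: (timestamp_s, x_deg, y_deg); returns (start_time, duration_s) pairs."""
    fixations = []
    start = None
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue
        velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        if velocity < velocity_threshold:        # below threshold -> part of a fixation
            if start is None:
                start = t0
        elif start is not None:                  # threshold crossed -> fixation ended
            fixations.append((start, t0 - start))
            start = None
    if start is not None:                        # close a fixation running to the end
        fixations.append((start, samples[-1][0] - start))
    return fixations
```

Real implementations typically also smooth the gaze signal first and discard fixations shorter than a minimum duration.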

The toolbox relies on tobii_research for communicating with the eye-tracker device, PyQt6 for the GUI, click for the CLI, and pylsl / mne_lsl for LSL streaming.

Core Components

  • Data Collection: Connects to a Tobii eye tracker, or falls back to a mock data source for testing.

  • Processing: Classifies fixations with a velocity threshold (I-VT), after smoothing gaze coordinates with a One Euro filter (OneEuroFilter).

  • Validation: Compares gaze against the known target positions of a 3x3 grid to estimate accuracy and precision (see the sketch after this list).

  • Visualization: Real-time gaze plot & fixation tracking plot.
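
As a rough illustration of the validation step, the sketch below compares gaze samples collected while one target was shown against that target's known position, taking accuracy as the mean offset from the target and precision as the RMS spread around the samples' own centroid. The data layout and units are assumptions, not pyETA's internal format:

```python
import statistics
from typing import List, Tuple

def accuracy_precision(gaze: List[Tuple[float, float]],
                       target: Tuple[float, float]):
    """gaze: (x, y) samples recorded while one 3x3-grid target was displayed.
    Returns (accuracy, precision) in the same units as the input coordinates."""
    tx, ty = target
    # Accuracy: mean Euclidean offset of the gaze samples from the target.
    offsets = [((x - tx) ** 2 + (y - ty) ** 2) ** 0.5 for x, y in gaze]
    accuracy = statistics.fmean(offsets)
    # Precision: RMS deviation of the samples around their own centroid.
    cx = statistics.fmean(x for x, _ in gaze)
    cy = statistics.fmean(y for _, y in gaze)
    precision = statistics.fmean((x - cx) ** 2 + (y - cy) ** 2 for x, y in gaze) ** 0.5
    return accuracy, precision

# One such pair is computed per target, giving nine estimates across the 3x3 grid.
```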

Workflow

application.py orchestrates the two main GUI use cases, tracking and validation, using reader.TrackerThread and reader.StreamThread.
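
As a hedged sketch of how such an orchestration might look in PyQt6, the example below assumes the worker is a QThread subclass that emits new samples via a signal; the class, signal, and slot names are illustrative stand-ins, not pyETA's actual interfaces:

```python
# Illustrative only: wiring a worker thread to a window in PyQt6.
import sys, time, random
from PyQt6.QtCore import QThread, pyqtSignal
from PyQt6.QtWidgets import QApplication, QMainWindow

class TrackerThread(QThread):
    gaze = pyqtSignal(float, float)              # x, y of the latest gaze sample
    def run(self):
        while not self.isInterruptionRequested():
            self.gaze.emit(random.random(), random.random())  # mock gaze data
            time.sleep(1 / 60)

class MainWindow(QMainWindow):
    def __init__(self):
        super().__init__()
        self.tracker = TrackerThread()
        self.tracker.gaze.connect(self.on_gaze)  # slot runs on the GUI thread
        self.tracker.start()
    def on_gaze(self, x: float, y: float):
        self.setWindowTitle(f"gaze: {x:.2f}, {y:.2f}")  # stand-in for a real plot
    def closeEvent(self, event):
        self.tracker.requestInterruption()       # stop the worker cleanly on exit
        self.tracker.wait()
        super().closeEvent(event)

if __name__ == "__main__":
    app = QApplication(sys.argv)
    win = MainWindow()
    win.show()
    sys.exit(app.exec())
```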

On the CLI, track.py handles tracking and window.py handles the validation process.
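
For reference, a click-based entry point for the tracking command might look like the following; the option names are assumptions for illustration, not the exact flags exposed by track.py:

```python
import click

@click.command()
@click.option("--use-mock", is_flag=True,
              help="Generate mock gaze data instead of using a Tobii device.")
@click.option("--duration", default=60, show_default=True,
              help="Tracking duration in seconds.")
def track(use_mock: bool, duration: int):
    """Start gaze tracking and push gaze/fixation samples to LSL."""
    source = "mock data" if use_mock else "Tobii eye tracker"
    click.echo(f"Tracking from {source} for {duration} s ...")
    # Real tracking would start the tracker thread and LSL outlets here.

if __name__ == "__main__":
    track()
```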

StreamThread runs in the background and pulls data from an LSL stream on demand; the LSL streams are published continuously by TrackerThread, and incoming samples are stored in buffers with a maximum length of 10,000 for gaze and 10 for fixations.
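
A minimal sketch of this pull-and-buffer pattern with pylsl, assuming the gaze and fixation streams can be resolved by name; the stream names and channel layouts are assumptions:

```python
from collections import deque
from pylsl import StreamInlet, resolve_byprop

# Bounded buffers mirroring the sizes described above.
gaze_buffer = deque(maxlen=10000)
fixation_buffer = deque(maxlen=10)

# Resolve the continuously published streams (names are illustrative; this
# assumes both streams are already online within the 5 s timeout).
gaze_inlet = StreamInlet(resolve_byprop("name", "pyETA_gaze", timeout=5)[0])
fix_inlet = StreamInlet(resolve_byprop("name", "pyETA_fixation", timeout=5)[0])

def poll_once():
    """Pull whatever has accumulated since the last call and append it to the buffers."""
    samples, _ = gaze_inlet.pull_chunk(timeout=0.0)
    gaze_buffer.extend(samples)
    samples, _ = fix_inlet.pull_chunk(timeout=0.0)
    fixation_buffer.extend(samples)
```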

Tracking runs the TrackerThread and StreamThread threads together to update the plots.
