Stream Generation
The pyETA GUI initiates a 22-channel Lab Streaming Layer (LSL) stream named 'tobii_gaze_fixation' for real-time eye-tracking analysis. This stream supports gaze tracking and fixation detection, with data sourced from either a Tobii eye tracker or a mock service.
Script: `track.py`
Class: `Tracker`

In `application.py`, the `start_stream` method creates a `StreamThread` with user-defined parameters and starts it:

```python
self.stream_thread = StreamThread()
self.stream_thread.set_variables(tracker_params=tracker_params)
self.stream_thread.start()
```

`StreamThread` spawns a `TrackerThread`, which instantiates and runs a `Tracker` object from `track.py`.
Stream creation: When `push_stream=True` (set via a GUI checkbox), `Tracker` creates an LSL stream. The same behavior is available from the CLI via the `--push_stream` flag:

```shell
pyeta track --push_stream
```

Channel modification: Combines raw gaze data, filtered gaze data, and metadata into a 22-channel array.
- Applies a `OneEuroFilter` to the raw gaze coordinates.
- Computes velocity and fixation status using `velocity_threshold`.
- Produces `left_filtered_gaze_x`, `left_filtered_gaze_y`, `right_filtered_gaze_x`, and `right_filtered_gaze_y`, which are used for fixation detection.
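The assembly of one 22-channel sample can be sketched as below. The channel order follows the stream's documented layout, but `build_sample` itself is a hypothetical helper for illustration, not the actual `Tracker` code:

```python
# Hypothetical sketch: assemble one 22-channel sample in the documented
# channel order (9 left-eye values, 9 right-eye values, 4 metadata values).
def build_sample(left, right, screen_w, screen_h, timestamp, clock):
    """left/right are dicts of per-eye values; returns a 22-element list."""
    def eye_channels(eye):
        return [
            eye["gaze_x"], eye["gaze_y"], eye["pupil_diameter"],
            float(eye["fixated"]), eye["velocity"],
            eye["fixation_timestamp"], eye["fixation_elapsed"],
            eye["filtered_gaze_x"], eye["filtered_gaze_y"],
        ]
    return eye_channels(left) + eye_channels(right) + [
        float(screen_w), float(screen_h), timestamp, clock,
    ]

eye = {"gaze_x": 0.5, "gaze_y": 0.5, "pupil_diameter": 3.2,
       "fixated": True, "velocity": 12.0,
       "fixation_timestamp": 100.0, "fixation_elapsed": 0.25,
       "filtered_gaze_x": 0.5, "filtered_gaze_y": 0.5}
sample = build_sample(eye, eye, 1920, 1080, 100.25, 100.26)
assert len(sample) == 22
```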
OneEuroFilter Algorithm

The one-euro filter reduces noise and jitter in the raw gaze data while preserving responsiveness; its output is later used for fixation detection.
Steps:

1. Derivative: Calculate the rate of change:

```python
current_derivative = (current_value - self.previous_value) / time_elapsed
```

2. Derivative smoothing: Apply a fixed cutoff (1.0 Hz):

```python
alpha_derivative = self.smoothing_factor(time_elapsed, self.derivative_cutoff)
filtered_derivative = self.exp_smoothing(alpha_derivative, current_derivative, self.previous_derivative)
```

3. Adaptive cutoff: Adjust the cutoff based on velocity:

```python
adaptive_cutoff = self.min_cutoff + self.beta * abs(filtered_derivative)
```

4. Value smoothing: Apply exponential smoothing with the adaptive cutoff:

```python
alpha = self.smoothing_factor(time_elapsed, adaptive_cutoff)
filtered_value = self.exp_smoothing(alpha, current_value, self.previous_value)
```
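The four steps can be combined into a minimal, self-contained filter class. This is a sketch of the standard one-euro formulation: the `smoothing_factor` and `exp_smoothing` helpers and the default parameter values are assumptions for illustration, not the actual pyETA implementation:

```python
import math

class OneEuroFilter:
    """Minimal one-euro filter sketch; defaults are illustrative."""
    def __init__(self, min_cutoff=1.0, beta=0.007, derivative_cutoff=1.0):
        self.min_cutoff = min_cutoff
        self.beta = beta
        self.derivative_cutoff = derivative_cutoff
        self.previous_value = None
        self.previous_derivative = 0.0

    def smoothing_factor(self, time_elapsed, cutoff):
        # Exponential-smoothing alpha for a given cutoff frequency (Hz)
        r = 2.0 * math.pi * cutoff * time_elapsed
        return r / (r + 1.0)

    def exp_smoothing(self, alpha, current, previous):
        return alpha * current + (1.0 - alpha) * previous

    def __call__(self, current_value, time_elapsed):
        if self.previous_value is None:
            # First sample: nothing to smooth against yet
            self.previous_value = current_value
            return current_value
        # 1. Derivative
        current_derivative = (current_value - self.previous_value) / time_elapsed
        # 2. Derivative smoothing at a fixed cutoff
        alpha_d = self.smoothing_factor(time_elapsed, self.derivative_cutoff)
        filtered_derivative = self.exp_smoothing(
            alpha_d, current_derivative, self.previous_derivative)
        # 3. Cutoff adapts to velocity: fast motion -> less smoothing lag
        adaptive_cutoff = self.min_cutoff + self.beta * abs(filtered_derivative)
        # 4. Smooth the value with the adaptive cutoff
        alpha = self.smoothing_factor(time_elapsed, adaptive_cutoff)
        filtered_value = self.exp_smoothing(
            alpha, current_value, self.previous_value)
        self.previous_value = filtered_value
        self.previous_derivative = filtered_derivative
        return filtered_value
```

Because the cutoff rises with velocity, slow drift is smoothed heavily while fast saccades pass through with little lag.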
Stream Properties

- Name: `'tobii_gaze_fixation'`
- Channels: 22

The channel structure of the stream is described below:
Left Eye

| # | Channel | Type | Unit | Description |
|---|---------|------|------|-------------|
| 1 | `left_gaze_x` | gaze | normalized | Raw X gaze position (0-1) |
| 2 | `left_gaze_y` | gaze | normalized | Raw Y gaze position (0-1) |
| 3 | `left_pupil_diameter` | pupil | mm | Pupil diameter |
| 4 | `left_fixated` | fixation | boolean | Fixation status (True/False) |
| 5 | `left_velocity` | velocity | px | Gaze velocity |
| 6 | `left_fixation_timestamp` | timestamp | s | Time of fixation start |
| 7 | `left_fixation_elapsed` | duration | s | Fixation duration |
| 8 | `left_filtered_gaze_x` | filtered_gaze | normalized | Smoothed X gaze position |
| 9 | `left_filtered_gaze_y` | filtered_gaze | normalized | Smoothed Y gaze position |

Right Eye

| # | Channel | Type | Unit | Description |
|---|---------|------|------|-------------|
| 10 | `right_gaze_x` | gaze | normalized | Raw X gaze position (0-1) |
| 11 | `right_gaze_y` | gaze | normalized | Raw Y gaze position (0-1) |
| 12 | `right_pupil_diameter` | pupil | mm | Pupil diameter |
| 13 | `right_fixated` | fixation | boolean | Fixation status (True/False) |
| 14 | `right_velocity` | velocity | px | Gaze velocity |
| 15 | `right_fixation_timestamp` | timestamp | s | Time of fixation start |
| 16 | `right_fixation_elapsed` | duration | s | Fixation duration |
| 17 | `right_filtered_gaze_x` | filtered_gaze | normalized | Smoothed X gaze position |
| 18 | `right_filtered_gaze_y` | filtered_gaze | normalized | Smoothed Y gaze position |

Screen Data

| # | Channel | Type | Unit | Description |
|---|---------|------|------|-------------|
| 19 | `screen_width` | screen | px | Screen width |
| 20 | `screen_height` | screen | px | Screen height |
| 21 | `timestamp` | timestamp | s | Data timestamp |
| 22 | `local_clock` | timestamp | s | Local system clock |
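The channel layout above can be encoded as plain data, which is handy for labeling plots or building per-channel metadata. This structure is an illustrative helper derived from the table, not part of pyETA itself:

```python
# Channel layout from the table above, as (name, type, unit) tuples.
EYES = ("left", "right")
PER_EYE = [
    ("gaze_x", "gaze", "normalized"),
    ("gaze_y", "gaze", "normalized"),
    ("pupil_diameter", "pupil", "mm"),
    ("fixated", "fixation", "boolean"),
    ("velocity", "velocity", "px"),
    ("fixation_timestamp", "timestamp", "s"),
    ("fixation_elapsed", "duration", "s"),
    ("filtered_gaze_x", "filtered_gaze", "normalized"),
    ("filtered_gaze_y", "filtered_gaze", "normalized"),
]
EXTRA = [
    ("screen_width", "screen", "px"),
    ("screen_height", "screen", "px"),
    ("timestamp", "timestamp", "s"),
    ("local_clock", "timestamp", "s"),
]
# 9 channels per eye x 2 eyes + 4 metadata channels = 22
CHANNELS = [(f"{eye}_{name}", ctype, unit)
            for eye in EYES for name, ctype, unit in PER_EYE] + EXTRA
assert len(CHANNELS) == 22
```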
Stream Reading for Plotting

Script: `reader.py`
Class: `StreamThread`

- Resolves and connects to the `'tobii_gaze_fixation'` stream.
- Continuously pulls samples.
- Parses gaze data on request.
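Parsing a pulled sample back into per-eye fields can be sketched as follows. The field order follows the documented channel layout; `parse_sample` is a hypothetical helper, not the actual `reader.py` code:

```python
def parse_sample(sample):
    """Split a 22-element LSL sample into per-eye dicts plus metadata.

    Hypothetical helper; indices follow the documented channel layout:
    channels 1-9 left eye, 10-18 right eye, 19-22 screen/timing metadata.
    """
    fields = ["gaze_x", "gaze_y", "pupil_diameter", "fixated", "velocity",
              "fixation_timestamp", "fixation_elapsed",
              "filtered_gaze_x", "filtered_gaze_y"]
    left = dict(zip(fields, sample[0:9]))
    right = dict(zip(fields, sample[9:18]))
    meta = dict(zip(["screen_width", "screen_height",
                     "timestamp", "local_clock"], sample[18:22]))
    return left, right, meta

left, right, meta = parse_sample(list(range(22)))
assert left["gaze_x"] == 0 and right["gaze_x"] == 9
assert meta["timestamp"] == 20
```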