Saving eye positions

Hi Yoon,

I am presenting videos and wondered whether there is a stereotypical pattern of eye movements that depends on the content of each video. Do you have any suggestions or a code snippet that would help me reconstruct the gaze pattern post hoc?

I’m not sure exactly what you want. Are you hoping to draw an eye trace (similar to what you see in MWClient’s eye window) on top of each video, so that you can see where the gaze is as the video plays? If so, the event file should have all the information you need, but making use of it will take some work.

You can get the details of what part of a video is playing at a given time from #stimDisplayUpdate (SDU) events. The timestamp of an SDU event is the predicted time when the update should start appearing on the display. You can compare this directly to the times of eye_h and eye_v events and thereby plot the eye position (or eye trace) for each frame of the video.
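To make the matching concrete, here’s a rough MATLAB sketch. It assumes you’ve already pulled the #stimDisplayUpdate timestamps and the eye_h/eye_v samples out of the event file into plain vectors (the MWorks MATLAB tools can do that part); the function name and the binning-by-update-interval approach are just my suggestion, not anything built in:

```matlab
function frame_samples = group_eye_by_frame(sdu_time, eye_time, eye_h, eye_v)
% Bin eye samples by which display update was on screen when they occurred.
% All times are MWorks event timestamps in microseconds. Returns a cell
% array: frame_samples{i} holds the [k x 2] (eye_h, eye_v) samples recorded
% between update i and update i+1 (the last bin runs to the end of the file).
    eye_h = eye_h(:);
    eye_v = eye_v(:);
    edges = [double(sdu_time(:)); Inf];
    idx = discretize(double(eye_time(:)), edges);  % NaN for samples before the first update
    frame_samples = cell(numel(sdu_time), 1);
    for i = 1:numel(frame_samples)
        in_frame = (idx == i);
        frame_samples{i} = [eye_h(in_frame), eye_v(in_frame)];
    end
end
```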

The value of an SDU event is a list of dictionaries, one per stimulus displayed during that update, each containing that stimulus’s parameters. For a video file, the filename parameter tells you which video was playing, and the current_video_time_seconds parameter gives you the timestamp in the video of the frame being displayed. You could use this in concert with MATLAB’s VideoReader to extract and display the frame. (There’s probably some equivalent in Python, but I don’t know what it is.) You’d also want to use pos_x, pos_y, and size_x to position and scale the video, so that you can draw the eye trace correctly on top of it.
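Here’s roughly what drawing one frame might look like. I’m assuming you’ve already dug the video stimulus’s dictionary out of each SDU value and turned it into a MATLAB struct with the fields named above; deriving the height from the frame’s aspect ratio is my shortcut (use size_y if it’s in your SDU values), and I’m assuming eye_v is positive upward:

```matlab
function draw_frame_with_gaze(video_path, sdu_value, gaze_hv)
% Draw one display update: the video frame, plus the eye samples recorded
% while it was on screen, both in degrees of visual angle.
% sdu_value: struct with fields current_video_time_seconds, pos_x, pos_y, size_x
% gaze_hv:   [k x 2] matrix of (eye_h, eye_v) samples for this update
    v = VideoReader(video_path);          % for speed, cache this across calls
    v.CurrentTime = sdu_value.current_video_time_seconds;
    frame = readFrame(v);                 % frame that was on screen during this update

    % Place the frame where MWorks drew it: centered at (pos_x, pos_y) with
    % width size_x; height derived from the frame's aspect ratio.
    w = sdu_value.size_x;
    h = w * size(frame, 1) / size(frame, 2);
    cla
    image([sdu_value.pos_x - w/2, sdu_value.pos_x + w/2], ...
          [sdu_value.pos_y - h/2, sdu_value.pos_y + h/2], flipud(frame));
    axis xy       % y increases upward, matching the eye data
    axis equal
    hold on

    % Eye trace for this frame, in the same degree coordinates.
    plot(gaze_hv(:, 1), gaze_hv(:, 2), 'r.-');
    hold off
end
```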

Once you have an image of each video frame with a superimposed eye trace, you should be able to put them together into a movie. Something like MATLAB’s movie function should work.
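Putting the pieces together might look something like this, building on the two sketches above. Here sdu_values is a hypothetical cell array holding the video stimulus’s struct from each SDU event, the frame rate is just a placeholder, and I’m writing the result to a file with VideoWriter; you could equally collect the getframe output into an array and play it back with movie:

```matlab
frames = group_eye_by_frame(sdu_time, eye_time, eye_h, eye_v);

writer = VideoWriter('gaze_overlay.avi');
writer.FrameRate = 30;          % placeholder; pace by the SDU timestamps if you prefer
open(writer);
figure('Color', 'w');
for i = 1:numel(frames)
    % The filename parameter from the SDU says which video file to open;
    % adjust the path if you're analyzing on a different machine.
    draw_frame_with_gaze(sdu_values{i}.filename, sdu_values{i}, frames{i});
    writeVideo(writer, getframe(gca));
end
close(writer);
```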

Does this sound like what you want?

Also, is there anything to be aware of if I adjust the eye tracker’s sampling frequency (e.g., data_interval = 2 ms)? I would like to get a sense of the minimum sampling frequency required for my analyses.

The data_interval parameter just sets how often MWorks polls the EyeLink for new data. If you increase its value, you’ll still get the same number of eye samples, but MWorks will receive them in larger batches. If you really want to change the tracker’s sampling frequency, you’ll need to do that on the EyeLink PC.
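If you want to see what rate you’re actually recording (whatever the tracker is set to), the eye samples’ timestamps in the event file tell you directly. A quick check, assuming eye_time holds the eye_h timestamps in microseconds:

```matlab
% Effective eye sampling interval, straight from the recorded timestamps.
dt = diff(sort(double(eye_time)));
fprintf('median sample interval: %.2f ms (~%.0f Hz)\n', ...
        median(dt) / 1000, 1e6 / median(dt));
```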

As for whether it’s wise or helpful to change the sampling frequency, I really don’t know. Maybe the EyeLink docs could provide some guidance?

Cheers,
Chris