Calibration Algorithm

Hi Barnes,

Thanks for posting the XML.

“My expectation was that I should see a rough grid of fixation positions which loosely match the fixation cue target locations.”

That’s what I would expect, too. I’m not sure why your plots don’t match that expectation, but I have a few thoughts:

  1. I’m not sure “last 100ms” is the right window for looking at the eye positions. What you really want is to see the eye positions between begin_calibration_average (in state “cal fixation”) and end_calibration_average_and_take_sample (in state “cal success”). That’s the time window where you’ve determined that the animal is fixating and are collecting the eye positions that will be used to compute the calibration. (See the sketch below for one way to select just those samples.)
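
    Here’s a minimal sketch (in Python with numpy/matplotlib; the function and variable names are placeholders of my own, not a fixed MWorks API) of selecting just the samples inside each averaging window, assuming you’ve already pulled the relevant event times and eye samples out of your event file:

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    def plot_averaging_windows(begin_times, end_times, eye_t, eye_x, eye_y):
        """Scatter the eye samples that fall inside each averaging window.

        begin_times/end_times: paired timestamps of begin_calibration_average
        and end_calibration_average_and_take_sample announcements.
        eye_t, eye_x, eye_y: timestamps and values of the raw eye samples.
        """
        for t0, t1 in zip(begin_times, end_times):
            # Keep only samples recorded while the calibration was averaging
            mask = (eye_t >= t0) & (eye_t <= t1)
            plt.scatter(eye_x[mask], eye_y[mask], s=4)
        plt.xlabel('raw eye x')
        plt.ylabel('raw eye y')
        plt.title('Eye samples inside calibration averaging windows')
        plt.show()
    ```

    Plotted this way, the samples should form one cluster per fixation target, i.e. the rough grid you were expecting (in raw units).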

  2. Your experiment uses eye_lx and eye_ly as the “raw” eye coordinates, but it really should be using pupil_lx and pupil_ly. See this discussion for more info. Presumably this isn’t having a big effect (if any), since you’re able to calibrate successfully. Still, I don’t know how the EyeLink assigns values to eye_lx/ly in the absence of an EyeLink-side calibration, so it’s probably better to start with the true raw data (pupil_lx/ly) when calibrating in MWorks.

    Also, the bit where it says there may be a non-linear relationship between raw (pupil) data and true gaze position is interesting. It’s hard to imagine the data being sufficiently non-linear to explain your plots, since you can successfully calibrate using MWorks’ linear eye calibrator. Still, it’d be worth checking if/how things change when you switch to pupil_lx/ly. (The sketch below shows one way to quantify how linear the raw-to-gaze relationship actually is.)
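
    Here’s a rough sketch of that linearity check (my own suggestion, not anything built into MWorks): fit an affine map from the per-fixation mean pupil coordinates to the cue locations and look at the residuals. Large or structured residuals would point to a non-linear relationship.

    ```python
    import numpy as np

    def linearity_check(raw_xy, target_xy):
        """Least-squares affine fit from raw pupil coords to target locations.

        raw_xy: (N, 2) mean pupil_lx/pupil_ly for each fixation.
        target_xy: (N, 2) corresponding cue locations in degrees.
        Returns the fitted coefficients and the per-fixation residuals.
        """
        # Augment with a constant column so the fit includes an offset term
        design = np.column_stack([raw_xy, np.ones(len(raw_xy))])
        coeffs, *_ = np.linalg.lstsq(design, target_xy, rcond=None)
        residuals = target_xy - design @ coeffs
        return coeffs, residuals
    ```

    If the residuals are small and unstructured, a linear calibration is capturing essentially everything in the data.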

  3. My recommendation to compare eye_h/v_raw and eye_h/v_calibrated doesn’t make sense during a calibration. The calibration protocol starts by executing Clear Calibration, meaning that the raw and “calibrated” values will be identical until you complete the calibration (except for gains and/or offsets you’ve set manually via MWClient’s eye calibrator window, which are included in the “calibrated” values). I was thinking that you’d compare them after completing a calibration, to get a sense for how the computed calibration transformed the raw eye coordinates.

    If you want to see how the raw eye positions used to compute the calibration are transformed by the calibration, then you’ll need to manually apply the final calibration parameters to the raw data and plot that; a sketch is below. I assume you haven’t done this already, because, apart from a scale change, your raw vs. calibrated plots look identical.
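
    For example (a sketch only; the exact parameter layout your calibrator uses is something you’d want to confirm, e.g. against the values shown in MWClient’s eye calibrator window), assuming each calibrated coordinate is an affine function of both raw coordinates:

    ```python
    import numpy as np

    def apply_linear_calibration(raw_xy, coeffs_h, coeffs_v):
        """Apply an assumed affine calibration to raw eye coordinates.

        Assumes eye_h = h0 + h1*raw_x + h2*raw_y (and likewise for eye_v);
        this parameter layout is an assumption, not a documented MWorks
        format. raw_xy: (N, 2); coeffs_h, coeffs_v: length-3 arrays.
        """
        design = np.column_stack([np.ones(len(raw_xy)), raw_xy])
        return design @ np.asarray(coeffs_h), design @ np.asarray(coeffs_v)
    ```

    Plotting the transformed positions against the fixation target locations would then show exactly how the computed calibration moved the raw coordinates.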

  4. At some point, it might be worth trying a tracker-driven calibration. This is described in the docs as well as this discussion. The lab that requested this feature seemed pretty happy with it, but your results may vary.

Sorry I can’t provide any definitive answers. Hopefully some of the above suggestions will help you move forward.

Cheers,
Chris