MWORKS: Detecting touch with multiple fingers on iPad

Hi Chris,

I’m Huidi from Earl Miller’s lab, and I’m training a monkey to do a task on an iPad. I’ve found that sometimes the monkey uses multiple fingers to touch the screen simultaneously, but MWorks cannot detect this kind of touch (i.e., touch position x and y are not updated). I wonder if there’s any solution to this problem. Thanks!


Hi Huidi,

Apologies for the delayed reply.

At present, if the subject puts multiple fingers on the screen simultaneously, MWorks recognizes and tracks only the first finger to touch the screen and ignores the others. While it would be possible to track all of the fingers, it’s not clear to me how that would work or if it’s really a good idea. Some questions that come to mind:

  • How should MWorks report location(s) for a multi-finger touch? Should it report the average location of all the fingers? Should it report all the locations separately? Should it report only the location of the finger that touched most recently?

    If we report multiple locations, it’s not going to work well with MWorks’ existing fixation point stimuli.

  • How do you prevent a subject from “succeeding” at a task by just splaying their hand across the screen (and thereby touching whatever target is associated with success)?

    I suppose your experiment could just check for “on the correct target and NOT on any incorrect target”, but that might get complicated and/or tedious.
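For what it’s worth, if all active touch locations were available, the “on the correct target and NOT on any incorrect target” rule could be sketched roughly like this. This is just an illustration of the logic in plain Python; none of the function or variable names here correspond to an actual MWorks API:

```python
# Hypothetical sketch: succeed only if some finger is on the correct
# target and no finger is on any incorrect target. Targets are modeled
# as circles (center_x, center_y, radius); touches as (x, y) points.

def touch_hits(touch, region):
    """Return True if a touch point (x, y) falls inside a circular region."""
    x, y = touch
    cx, cy, radius = region
    return (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2

def trial_success(touches, correct_region, incorrect_regions):
    """At least one touch on the correct target, and none on any incorrect one."""
    on_correct = any(touch_hits(t, correct_region) for t in touches)
    on_incorrect = any(touch_hits(t, r)
                       for t in touches
                       for r in incorrect_regions)
    return on_correct and not on_incorrect
```

Under this rule, splaying a hand across the screen would fail as soon as any finger landed on an incorrect target, though encoding it for every target configuration in an experiment could still be tedious.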

If you have suggestions for how this should work, I’d be happy to hear them!