Hi Meg,
I’ve attached a first pass at your Simon experiment.
There are versions for Mac and iPad. The former uses the mouse/trackpad as input, while the latter uses the touchscreen.
I didn’t implement the two-color mode or user feedback. Maybe you can try to add those, and I can help if you get stuck.
Hopefully, this will serve as a solid starting point that you can develop further into a production experiment. If you have any questions, please don’t hesitate to ask.
Cheers,
Chris
simon.zip (321.9 KB)
@mpotta
> After playing with the iPad version, I have a quick question before diving into the code: is it possible to reduce the latency so that touch responses are accepted faster?
Currently, the protocol
- waits until no touches are detected (i.e. no finger on screen) before checking for button presses (states “Wait for no input”, “Wait for button press”), and
- waits until each button sound completes before accepting another button press (state “Button sound ended”).
If either of these is the source of the latency you’re experiencing, then you could speed things up by removing the waits.
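To make the two waits concrete, here's a small sketch of the response loop. This is plain Python, not actual MWorks/MWEL code: `get_touch` and `play_button_sound` are hypothetical stand-ins for the real input check and sound action, and the two flags mark the waits you could remove to speed things up.

```python
import time

def await_response(get_touch, play_button_sound,
                   wait_for_no_input=True, wait_for_sound_end=True):
    """Simplified model of the protocol's response loop (not MWorks code).

    get_touch() returns a button id, or None when no finger is on the screen;
    play_button_sound() starts playback and returns its duration in seconds.
    Setting either flag to False removes the corresponding wait.
    """
    if wait_for_no_input:
        # State "Wait for no input": block until no finger is on the screen.
        while get_touch() is not None:
            time.sleep(0.001)
    # State "Wait for button press": poll until a button is touched.
    while (button := get_touch()) is None:
        time.sleep(0.001)
    duration = play_button_sound()
    if wait_for_sound_end:
        # State "Button sound ended": ignore input until playback finishes.
        time.sleep(duration)
    return button
```

With both flags set to `False`, a touch is accepted as soon as it arrives, even if the previous finger hasn't lifted or the previous sound is still playing.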
Apart from that, no, there's no way to make touches more responsive. MWorks just gets the touch events from the OS, and there's nothing it can do to get them faster. FWIW, I did not notice any apparent latency when testing the experiment on my iPad.
Cheers,
Chris