NIDAQ device?

Hi Chris -

Hope all is relatively well with you amongst all this 2020 drama.

I have a quick (and naive!) question for you please:

I think I need a NIDAQ device that is compatible with MWorks. If so - is there a particular part number you would recommend? I was looking at this one: https://www.ni.com/pdf/manuals/372101a.pdf but I really don’t know much about this!

Alternatively, is there anything you would recommend over a NIDAQ device?

Thank you!
Yasmine

Hi Yasmine,

National Instruments’ support for macOS has always been very limited and is ending entirely in the near future. However, there may be devices from other vendors that will do the same job.

The part you referenced is just a breakout box. It takes the 68-pin output from an NI DAQ and splits it into a bunch of BNC and spring-terminal connections. I assume you’re also interested in a DAQ to go with it? If so, what are the capabilities that you need?

Also, if what you’re looking for is just general-purpose I/O, is there a reason why the I/O capabilities of your VIEWPixx display wouldn’t be sufficient?

Cheers,
Chris

Hi Chris -

Good to know that National Instruments’ support for macOS is ending.

I was under the impression that I need something like a DAQ to converge, digitize and timestamp all the experimental events (eye signal, laser signal and other display events) so that they get saved alongside the neural data - but I don’t know how I’m going to do this exactly, as I’ve never set up an ephys rig from scratch.

I did think about using the ViewPixx I/O briefly - but I think that using it that way will mean running cables into the rig and then back out of the rig again to the computers, which could get ugly.

Maybe I should talk to Najib more about how they do all this. But it sounds like, from your perspective, moving away from National Instruments is the way to go, yes?

Yasmine

Hi Yasmine,

I was under the impression that I need something like a DAQ to converge, digitize and timestamp all the experimental events (eye signal, laser signal and other display events) so that they get saved alongside the neural data

Yes, that’s correct, although the exact details will depend on what specific devices you’re using. Discussing this with Najib would probably be helpful, as I think he’s put together setups like this several times at least.

I did think about using the ViewPixx I/O briefly - but I think that using it that way will mean running cables into the rig and then back out of the rig again to the computers, which could get ugly.

I’d definitely recommend using the VIEWPixx if possible. It’s very capable as a DAQ; something equivalent from NI would cost several thousand dollars at least. Also, unlike most DAQ hardware, its software API (i.e. the code MWorks uses to communicate with and control it) is very good. Plus, you’ve already paid for it, so you may as well make the most of it!

But it sounds like, from your perspective, moving away from National Instruments is the way to go, yes?

Absolutely. Since NI has stopped supporting macOS, using NI hardware with MWorks really isn’t an option anymore.

Cheers,
Chris

Super. Thanks for your feedback on this Chris! I really appreciate it!

Cheers,
Yasmine

Hi Chris -

I’m zooming with Sophie from ViewPixx today about using the monitor’s timestamping capabilities instead of National Instruments.

I also talked to a colleague on the floor who uses ViewPixx/DataPixx for
this purpose and his overall impression is as follows:

(1) As long as MWorks does not need to take in any analog signals, I can use my ViewPixx monitor as is (without the $7K upgrade to unlock Analog I/O). My impression is that MWorks can take EyeLink data (if I want to pass it that) as digital input, and doesn’t need analog input - unless I someday want to run closed-loop type experiments (e.g. evolving stimuli based on firing rates?). Is all that correct?

(2) I could feed all analog signals to Blackrock, and simply use the DB-25 digital out connector from the ViewPixx monitor to provide the timestamping signal - thereby bypassing the need for an NI card.

Does all this sound correct to you? I might have a few more questions after talking to Sophie - but I think that if I can do all of the above without the $7K upgrade I might go that route until I need that feature. I definitely like the idea of trying evolving stimuli - but that could be next year, etc.

Many thanks!
Yasmine

Hi Yasmine,

My impression is that MWorks can take EyeLink data (if I want to pass it that) as digital input, and doesn’t need analog input

Correct. The typical (and best) way to interface with an EyeLink is to connect the EyeLink PC and the Mac running MWorks via an Ethernet cable. This lets you use MWorks’ builtin EyeLink interface, which receives all-digital data from the tracker.
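
If it’s useful as a reference, the device declaration in an MWEL experiment looks roughly like the sketch below. (The parameter names and the tracker’s IP address are from memory, so treat them as placeholders and check the EyeLink device documentation for the exact spelling and the full list of output variables.)

    // Rough sketch of an EyeLink device declaration in MWEL
    var eye_rx = 0
    var eye_ry = 0
    var eye_lx = 0
    var eye_ly = 0

    iodevice/eyelink eye_tracker (
        tracker_ip = '100.1.1.1'   // EyeLink PC on the direct Ethernet link
        data_interval = 1ms        // how often MWorks polls the tracker
        tracking_dist = 1024
        eye_rx = eye_rx            // right eye, raw x
        eye_ry = eye_ry            // right eye, raw y
        eye_lx = eye_lx            // left eye, raw x
        eye_ly = eye_ly            // left eye, raw y
    )

Once the device is started (via start_io_device), the tracker samples should show up in those variables and get recorded in your event file like any other variable.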

When you say “if I want to pass it that”, do you mean that your experiments don’t do any fixation monitoring? If they do, then you definitely need to get the eye-tracking data in to MWorks (preferably via the builtin, digital interface).

unless I someday want to run closed-loop type experiments (e.g. evolving stimuli based on firing rates?)

I’m a little confused here. Wouldn’t firing rates be determined by the Blackrock PC, not the EyeLink?

In any case, if you want to do this type of closed-loop experiment, you will definitely need to send the firing rates or any other required data back to MWorks. How you send that data depends on where it’s coming from. If it’s coming from a Blackrock system, then I don’t know what your options are for sending it back to MWorks. (Perhaps Najib or your other colleagues would have that info.)

I could feed all analog signals to Blackrock, and simply use the DB-25 digital out connector from the ViewPixx monitor to provide the timestamping signal - thereby bypassing the need for an NI card.

Yes, that’s exactly what I was thinking, too.

I think that if I can do all of the above without the $7K upgrade I might go that route until I need that feature

That makes sense. As you noted, unless you need to get analog signals in to MWorks (and it’s not clear to me that you do, even for a closed-loop experiment), then there’s probably no reason to pay for that upgrade.

Cheers,
Chris

Thanks Chris! I really appreciate it.

Quick follow up comments and one question below.

(1) Yes - we will pass EyeLink data to MWorks via Ethernet. The Mac Pro tower has two Ethernet ports, so I’m assuming one will be connected to the EyeLink, and one to the Ethernet jack. Correct.

(2) Yes - forgive the confusion. For a closed-loop experiment, firing rates would have to be sent from Blackrock back to MWorks. Not sure how - but I will ask Najib.

(3) Talked to Sophie from VPixx - and she agreed that the best way forward
is to use the monitor’s digital output to send a timestamping signal to
Blackrock. Seems like the simplest solution.

(4) Sadly, I discovered that my Blackrock system doesn’t have enough analog inputs to support my experiments. It only has 3 analog inputs, and I need 3 for the EyeLink and 4 for the laser (an optical sensor from each of 4 laser diodes). So I’ll work with Blackrock on upgrading the system so I have more analog inputs.

Just one more (naive) question for you please:

Shouldn’t the digital output of the ViewPixx (the DB-25 connector) also carry a copy of all the digital signals sent from MWorks to the ViewPixx? And if so - shouldn’t I be able to send those MWorks signals from the ViewPixx to Blackrock via the same cable that will carry the timestamping signal? That way I can drop signals like stimulus start/stop/etc. into the Blackrock file.

Thanks Chris!
Yasmine

Hi Yasmine,

Shouldn’t the digital output of the ViewPixx (the DB-25 connector) also carry a copy of all the digital signals sent from MWorks to the ViewPixx?

It’s better to say that MWorks sends digital signals via the VIEWPixx (rather than to the VIEWPixx). MWorks tells the VIEWPixx to set particular lines on its digital output port (i.e. pins on the DB-25 connector) to high or low, and those digital signals are sent to other devices (e.g. a pump for dispensing juice or a Blackrock system for recording).
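
Concretely, in MWEL this is just a digital output channel on the DATAPixx/VIEWPixx device, along the lines of the sketch below. (The channel and parameter names are approximate, so double-check them against the documentation; the variable name is a placeholder.)

    var stim_on = false

    // Rough sketch: one line on the VIEWPixx's digital output port
    iodevice/datapixx viewpixx_io (
        update_interval = 3ms          // how often MWorks talks to the device
    ) {
        iochannel/datapixx_bit_output (
            bit_number = 0             // pin on the DB-25 digital output port
            value = stim_on            // pin goes high whenever stim_on is true
        )
    }

Assigning to stim_on in your protocol drives the pin, and whatever is wired to it (pump, Blackrock, etc.) sees the transition.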

And if so - shouldn’t I be able to send those MWorks signals from the ViewPixx to Blackrock via the same cable that will carry the timestamping signal? That way I can drop signals like stimulus start/stop/etc. into the Blackrock file.

You can send any info you want to the Blackrock via the VIEWPixx’s digital outputs, as long as that info can be encoded as some combination of single bits and multi-bit words. You’ll use some subset of the VIEWPixx’s output pins to transmit the timestamp. (The number of pins you need depends on how many bits you want for your timestamp.) Any remaining pins can be used to transmit other info.
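
For the multi-bit case, the sketch would use a word output channel inside the same device block, something like this (again, treat the names as approximate and verify them against the documentation):

    var event_code = 0

    // Rough sketch: an 8-bit event/timestamp word on pins 1-8,
    // leaving pin 0 free for a single sync bit
    iochannel/datapixx_word_output (
        bit_numbers = 1:8        // which output pins make up the word
        value = event_code       // writing 0-255 here sets all eight pins at once
    )

Each time you assign a new value to event_code, the word appears on those pins, and the Blackrock can log it.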

Note that it’s possible to configure pins on the VIEWPixx’s digital input port as outputs. (In MWorks, this is done via the use_input_port parameter.) That gives you up to 24 additional digital output lines to use.

Also note that if you’re going to use the VIEWPixx’s pixel output mode to record the value of the upper left pixel, that consumes all of the 24 available lines on the digital output port. However, you can still configure digital input pins as outputs.
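
Putting those two notes together, a pixel-mode configuration with a repurposed input pin might look something like this sketch (enable_dout_pixel_mode and the other names are from memory, so verify them against the documentation):

    var reward_on = false

    iodevice/datapixx viewpixx_io (
        update_interval = 3ms
        enable_dout_pixel_mode = true    // output port now carries the upper-left pixel value
    ) {
        iochannel/datapixx_bit_output (
            use_input_port = true        // take this line from the digital *input* port instead
            bit_number = 0
            value = reward_on
        )
    }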

Cheers,
Chris

Glad to hear that all of this is possible (thanks to your hard work!).

Sophie from ViewPixx was excited to hear that you’ve implemented the
support for ViewPixx displays and the DataPixx I/O hubs. I know she’s
interested in zooming with you (and perhaps me?) in the coming weeks to
learn more about this implementation.

Would you be up for it sometime?

Many thanks,
Yasmine

Sophie from ViewPixx was excited to hear that you’ve implemented the support for ViewPixx displays and the DataPixx I/O hubs. I know she’s interested in zooming with you (and perhaps me?) in the coming weeks to learn more about this implementation.

Would you be up for it sometime?

Sure, that would be fine.

Chris