How much does IMAQ use the GPU?

I had always assumed the IMAQ library used CPU resources for all of its computations. Recently an NI sales rep told me some VIs do in fact offload work to the GPU. Does anyone know which IMAQ VIs use the GPU instead of the CPU? That seems like useful information to have when running into vision bottlenecks.

I would be surprised if this was used much, as it would make the library's performance far more dependent on system architecture. The library needs to run on embedded targets such as smart cameras that don't have this sort of hardware available, and NI would have to maintain very different (internal) code paths across platforms.

If so, I can only imagine it would have to use OpenCL for hardware independence. I can't get at the documentation right now, but it would be worth checking whether that dependency is listed.

Never heard of that, never read that in any doc...

Is there a way in Windows to monitor the GPU's activity?

I've used tools like GPU-Z to monitor temperature and clock speeds of GPUs, but I've never heard of pulling utilization stats.

I know there was a beta of a CUDA toolkit of sorts (CUDA being nVidia's flavor of GPU interface) for LabVIEW perhaps a year ago. It's no longer in the list (it was released with LV 2012), but that was the first I'd heard of anything from NI using GPU resources.
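On the monitoring question: one option on Windows with an NVIDIA card is to query `nvidia-smi`, which ships with the NVIDIA drivers and can report per-GPU utilization. A minimal Python sketch is below; it assumes `nvidia-smi` is on the PATH, and the parsing helper itself is pure Python so it can be exercised without a GPU.

```python
# Sketch: poll GPU utilization via nvidia-smi (ships with NVIDIA drivers).
# Assumes an NVIDIA GPU and nvidia-smi on the PATH; parse_utilization()
# is a plain string parser and runs anywhere.
import subprocess

def parse_utilization(csv_text: str) -> list[int]:
    """Parse output of
    `nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader`,
    e.g. lines like "42 %", into integer percentages (one per GPU)."""
    return [
        int(line.strip().rstrip(" %"))
        for line in csv_text.splitlines()
        if line.strip()
    ]

def gpu_utilization() -> list[int]:
    """Query current utilization for every GPU nvidia-smi can see."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader"],
        text=True,
    )
    return parse_utilization(out)
```

Calling `gpu_utilization()` in a loop while an IMAQ VI runs would show whether utilization actually rises, which would settle the original question empirically for a given machine.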
