Alexander Kocian

About Alexander Kocian

LabVIEW Information

  • Version
    LabVIEW 2013
  • Since
    2016
  1. Hello. Currently, I stream and process audio (for medical purposes) on my PC (i7-4790T with 8 cores) using LabVIEW 2013. To improve performance, the 8 cores could be shared between MS Windows and the real-time operating system RTX by IntervalZero. How can I tell LabVIEW to use the (deterministic) RTX cores instead of the (stochastic) MS Windows cores to stream audio?
  2. Ned, your method works. Only the declaration `int` had to be changed to `__int64`. Thank you. Now I understand how LabVIEW stores clusters. rolfk, my Visual C++ compiler says that the type MgErr is unknown. Do I need to include a special header or library?
  3. Thank you for the feedback. Creating the C file via the Call Library Function Node gives [attached image not preserved]. Therefore LabVIEW also crashed when I changed the function prototype from [not preserved] to [not preserved]. Once this simple example works, the task will be to bundle a complex combination of data types.
  4. The task is to process a bundle with a DLL embedded in LabVIEW. LabVIEW crashes even if the code is most simple. For example, I created an array of double, bundled it, and plugged it into the DLL. The parameters of the DLL are set to "Adapt to Type". The prototype in LabVIEW is "void pointertest(void *source);". The C program (cut off in the original post):

     // dllmain.cpp : Defines the entry point for the DLL application.
     #include "stdafx.h"
     #include <stdio.h>
     #include <stdlib.h>
     #include "Header.h"
     extern "C" __declspec(dllexport) void pointertest(LVCluster *source);
     BOOL APIENTRY Dll[truncated]