Rolf Kalbermatter

Posts posted by Rolf Kalbermatter

  1. QUOTE (Neville D @ May 23 2008, 11:18 AM)

    I seem to remember something about a DLL called "tulip" that allowed NI VISA to talk to Agilent hardware... this was a long time (10 yrs or so) ago.

    I think it came from Agilent, but check the NI Knowledgebase about it. I stuck with the NI stuff and was able to get Agilent hardware to work with it.

    Neville.

    That is the VISA passport for HPIB boards if I'm not mistaken.

    Rolf Kalbermatter

  2. QUOTE (crelf @ May 22 2008, 07:38 AM)

    So its wings have been clipped to work with only one key to assign licenses? That's a shame.

    Well, there might be all kinds of ramifications. For instance, NI acquired a source code license to FlexLM for certain OSes (most probably Windows only) and certain uses (most probably allowing them to protect their own software with it, but not allowing them to create a licensing toolkit for use by people outside of NI). Macrovision (it seems they are now Acresso) has been a well-known company in the copy protection business and I'm sure they employ quite capable lawyers too, so if I were NI I wouldn't try to go beyond what the acquired license allows, which I believe certainly has its price even as a non-royalty-free license that is not free to use for everything :rolleyes: .

    Rolf Kalbermatter

  3. QUOTE (crelf @ May 20 2008, 02:42 PM)

    :question: I wonder how difficult (possible) it would be to include products in the NI License Manager. I know that some of the products we distribute are LabVIEW addons, so we could assume that the NI License Manager is installed, but I wonder if it even supports products external to NI...

    Let's put it like this: the FlexLM core in LabVIEW and other NI products is an extensible system. However, the way it is built into LabVIEW, it assumes a specific secret key to sign licenses. So in order to generate your own licenses you would need to know that key, and NI certainly will not publish it. And even if you knew it, using it would likely violate one or more copyright protection laws such as the DMCA, and probably some others.

    Rolf Kalbermatter

  4. QUOTE (Yen @ May 20 2008, 01:12 PM)

    Like Rolf, I use a 15.4" Dell Latitude and I find 1440x900 to be a good resolution (although the height would be a problem if I wanted to do UIs which were larger than 1024x768).

    By the way, I'm considering getting EEEs for systems where we need operator stations. With XP and its low price it should beat any other option I can think of (PDAs, PLCs with external touch screens, laptops, etc.).

    Well, mine is 1280x800. Put it down to my age and the fact that my eyes aren't as sharp anymore :rolleyes: .

    The screen resolution problem I usually solve by making my panels scalable anyhow. Not the LabVIEW autoscaling, mind you! Just my own scaling, so that specific parts of the screen scale while others stay put relative to the rest.

    Rolf Kalbermatter

  5. QUOTE (eaolson @ May 19 2008, 02:19 PM)

    As Dawkins put it, "If atheism is a religion then not collecting stamps is a hobby." To be fair, atheism and agnosticism are fairly squishy words that can mean largely what you want them to mean. There is strong atheism, weak atheism, etc. That's not entirely limited to a disbelief in god; you'd get a very different answer if you asked John Hagee whether Catholics and Mormons were really Christians than if you asked a Unitarian.

    I'm not a Christian, but I'm open to reevaluation after the Rapture. :rolleyes:

    I'm not sure where you read about atheism being a religion in the post you replied to. I read about it being a belief, and I have to agree with that. Nobody can prove there is a deity, nor that there isn't, so even atheists believe in something :D

    Rolf Kalbermatter

  6. QUOTE (Yen @ May 19 2008, 01:12 PM)

    Indeed. LabVIEW here only checks the OS error after doing socket calls and translates that error into its own error number. And it was decided that more granular error reporting is a good thing to do, and I agree with that.

    Another angle might be that the OP is not really interested in the actual connection at all, but rather in whether the network node is reachable at all. This is solved easily with a network ping, as described in the following link: http://forums.ni.com/ni/board/message?board.id=170&thread.id=70801&view=by_date_ascending&page=1

    I include a copy of the fixed version of that library. There is, however, one principal problem with LabVIEW's multithreading and the error reporting as done with sockets:

    The WSA error is maintained in a thread-specific global variable, and this causes a small issue for the blocking call when waiting on an answer from the remote node, and in general when running this utility in parallel. For this select call, the reported error in case of a failure will never be the real error, since the select call happens in one of the threads of a multi-threaded execution system while the error retrieval occurs in the UI thread. Also, if you run multiple ping calls in parallel, the error from one call may get overwritten by the execution of another call before the first VI has had a chance to retrieve its associated error.

    The only real fix for this would be to write a wrapper DLL that does the actual socket call and the error retrieval together in one function, one function per different socket call (since the actual call to an external code routine is guaranteed by LabVIEW to execute atomically in the calling thread). A rough sketch of that idea follows below.
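
    Purely as an illustration of such a wrapper, here is a minimal sketch assuming Winsock; the function name and the idea of returning the error through an extra output parameter are my own, not part of the original library:

    /* wsawrap.c - hypothetical wrapper DLL function, called via the Call Library Node.
       Doing the socket call and WSAGetLastError() inside one exported function
       guarantees that both run in the same thread. Link with ws2_32.lib. */
    #include <winsock2.h>

    __declspec(dllexport) int __stdcall ConnectWithError(SOCKET s,
        const struct sockaddr *addr, int addrLen, int *wsaError)
    {
        int ret = connect(s, addr, addrLen);
        /* read the thread-local error code before LabVIEW can switch threads */
        *wsaError = (ret == SOCKET_ERROR) ? WSAGetLastError() : 0;
        return ret;
    }

    The same pattern would be repeated for select and any other socket call whose error code matters.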

    I would like to add that the VIs are LabVIEW 7.0 and the original code is from m3nth on the NI forums.

    Rolf Kalbermatter

  7. I've got a Dell Latitude D830 with a 15.4" display and generally like it. In the office I use a docking station and an external 22" widescreen display too. I don't have to do a lot of work on the road, but I do use laptop-only mode at home and find it quite workable with this 15-inch screen. But I deliberately chose a low 100 dpi instead of the maximum available 148 dpi, because I hate those super tiny fonts and even more the Windows hack of big fonts to work around them, and it also better matches the roughly 90 dpi resolution of normal desktop LCD monitors.

    Rolf Kalbermatter

  8. QUOTE (gmart @ May 16 2008, 04:44 PM)

    I understand SVN's development model is different. In general, have you worked with an IDE (not a client like Tortoise) that doesn't get in your way when using SVN (for example, Visual Studio)? Is the check-in/out model such an impediment that even with the third-party SVN plugin for LabVIEW, you feel your productivity is diminished?

    Well, it's not about Visual Studio or any particular IDE as such, but about whether that IDE uses the SCC API. That API was specifically designed around the strict check-in/check-out philosophy and accordingly uses, enforces and even requires it. And that does not work well with SVN. Visual Studio certainly is not a very good IDE to use with SVN, since it really does rely on the SCC API for its source code control integration.

    As others have said there are other IDEs that are a lot more flexible in how they interface to SCC systems and that work a lot better with SVN than Visual Studio.

    Rolf Kalbermatter

  9. QUOTE (mkaravitis @ May 16 2008, 04:56 PM)

    We are using a Mightex CCD line camera. Their dll and example code in VC++ has the following:

    // Structure
    typedef struct {
        int CameraID;
        int ExposureTime;
        int TimeStamp;
        int TriggerOccurred;
        int TriggerEventCount;
        int OverSaturated;
        int LightShieldAverageValue;
    } TProcessedDataProperty;

    // Callback function
    void _cdecl FrameCallBack( int __, int __, int __, TProcessedDataProperty* Attributes, unsigned char *FrameData )
    {
        // Code here...
    }

    // dll function
    SDK_API CCDUSB_InstallFrameHooker( int __, FrameDataCallBack FrameCallBack );

    The structure (TProcessedDataProperty) is used in the function (FrameCallBack), which in turn is passed as an argument to the DLL function (CCDUSB_InstallFrameHooker). This DLL function comes from a third-party vendor who doesn't have LabVIEW support, but it is necessary to use this DLL in order to access the CCD camera data.

    We have implemented the structure as a cluster in LabVIEW. But how do we implement the following in LabVIEW?

    (1) the FrameCallBack function that takes a pointer to structure argument (TProcessedDataProperty* Attributes), and

    (2) how do we pass the pointer to FrameCallBack function as an argument to the CCDUSB_InstallFrameHooker dll function?

    --

    Thanks,

    Gautam.

    Callback functionality is not supported by the Call Library Node. The simplest solution is to write a wrapper DLL in C that provides that callback function, translating the callback event into a LabVIEW occurrence or LabVIEW user event, plus a LabVIEW-callable function to install that callback (see the sketch below).

    All in all not something you are likely to solve without some good C programming knowledge.
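
    To make the idea a bit more concrete, here is a minimal, hypothetical sketch built around the declarations quoted above. PostLVUserEvent and LVUserEventRef come from LabVIEW's external code interface (extcode.h); the wrapper function name, the vendor header name and the decision to post only the attributes structure are my assumptions, not working vendor code:

    /* framewrap.c - hypothetical wrapper DLL (sketch only) */
    #include "extcode.h"      /* LabVIEW external code interface: PostLVUserEvent() */
    #include "MightexSDK.h"   /* hypothetical vendor header declaring TProcessedDataProperty,
                                 FrameDataCallBack and CCDUSB_InstallFrameHooker */

    static LVUserEventRef gFrameEvent = 0;

    /* callback handed to the vendor DLL; runs in the vendor's thread */
    static void _cdecl FrameCallBack(int a, int b, int c,
        TProcessedDataProperty *attributes, unsigned char *frameData)
    {
        /* post the attributes structure to a LabVIEW user event; the frame data
           itself would have to be copied into a LabVIEW handle as well */
        if (gFrameEvent)
            PostLVUserEvent(gFrameEvent, attributes);
    }

    /* LabVIEW-callable installer: wire a user event refnum of the matching
       cluster type to this function through the Call Library Node */
    __declspec(dllexport) int __stdcall InstallLVFrameHook(LVUserEventRef ev)
    {
        gFrameEvent = ev;
        CCDUSB_InstallFrameHooker(0, FrameCallBack);
        return 0;
    }

    In LabVIEW you would create a user event whose data type matches the attributes cluster, pass its refnum to InstallLVFrameHook, and handle the posted events in an event structure.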

    Rolf Kalbermatter

  10. QUOTE (Michael_Aivaliotis @ May 16 2008, 01:10 PM)

    Thank You! C'mon people. none of you have seen code like this?

    I think the hardware vendors should start outsourcing the development to LAVA members.

    My thinking exactly. However, quite a few of those manufacturers compete with NI mainly on one basis, namely being cheaper than NI, so I guess that leaves little room to spend money on real software development, especially since NI hardware has been getting more competitive in price in recent years too.

    Rolf Kalbermatter

  11. QUOTE (Tomi Maila @ May 16 2008, 12:37 PM)

    I'm still not sure I can see the need for ms and frame accuracy on playback. However, what we did so far was synchronization and combined storage of video and data acquisition (Synchronized Video & Data Acquisition Device, http://www.citengineering.com/pagesEN/products/sdvd.aspx) in order to have time-accurate live measurements such as blood pressure, ECG and similar together with the actual video recording of the operation, so that these things can later be exactly localized in relation to the actual action taken at that moment. However, playback of this video together with the data is usually not really in real time, and definitely not in strict real time, as the researcher normally wants to go through the interesting sections in slow motion.

    Rolf Kalbermatter

  12. QUOTE (Tomi Maila @ May 15 2008, 10:22 AM)

    Hi,

    We have to write a medical application that needs to display video (a long sequence of pictures) on an LCD/CRT screen or a projector in a deterministic manner. No frames may be skipped, and the timing of the frames needs to be exact to within a few ms accuracy. I tried to google the determinism of video signals but couldn't find much of anything. I tried to browse NI products but none of them talk about the determinism of the video output. I tried to look at the specifications of PC graphics cards, but there is also no information about the determinism of the graphics cards. We experimented a little with a Windows PC running LabVIEW and it seems that some frames of the image sequence get skipped for a still unknown reason.

    Does anyone have experience with deterministic video display? Any suggestions in any direction?

    Er! The big question here is: what is this good for? The human eye has a very limited time resolution, so what makes you or your customer believe that displaying every single frame at a very accurate time position is so important, rather than just keeping the overall speed of the movie true to the original timeline?

    Basically, Windows is not real time, and neither is any other desktop OS. So they are more or less inherently unable to guarantee that a whole video frame is transmitted every 40 ms (25 frames per second) to an accuracy of only a few ms. Any normal video playback software therefore simply synchronizes the video stream timeline continuously to the actual time, skipping frames whenever appropriate. On a lower level (for instance when you control the QuickTime API directly, but DirectX/DirectPlay surely has similar capabilities) you can opt for frame-accurate display instead of time-accurate display, but that usually means the playback timeline is no longer synchronized with the original timeline, as it sooner or later starts to lag.

    I do not see any way to guarantee both frame- and time-accurate display of movie material on non-dedicated hardware, other than simply buying the greatest and highest performance hardware components, installing a hardware video decompressor that supports your video compression and preferably has a direct link (crosswire or dedicated PCIe channel), running the meanest and leanest OS you can possibly get your hands on, and keeping your fingers crossed that no system interrupts such as network traffic or other DMA transfers mess up your timing anyhow.

    Now with dedicated hardware such as embedded systems with specially optimized RT OSes for media solutions this might be a different story.

    Rolf Kalbermatter

  13. QUOTE (marp84 @ May 16 2008, 03:41 AM)

    Hi bro, I have finished my simple project and now I want to turn it from a .vi into an .exe or application program. Please help me convert my LabVIEW project into an .exe or application (I use LabVIEW 8.0). After that I want the .exe or application to be accessible from a website; how can I do that? Please tell me or give me some example.

    Thanks in advance

    You need the Professional Version of LabVIEW or the Application Builder add-on in order to do that. Then read the User Manual about how to go about creating an executable.

    Rolf Kalbermatter

  14. QUOTE (Gary Rubin @ May 15 2008, 10:48 AM)

    My experience is that these are usually so poorly written (i.e. overuse of sequence structures) that we end up rewriting them almost from scratch anyway. The LabVIEW SDKs that I've used are often only useful as examples to show the order in which DLL functions should be called and how their inputs need to be formatted.

    Now you are exaggerating a bit :rolleyes: . I mean, I've seen those "drivers" and they usually come from companies that produce some hardware and want to make it available to LabVIEW users, but have no professional LabVIEW programmer and sometimes even just use the evaluation version of LabVIEW to create their drivers. In general that is a very bad idea, since the volume of technical support requests those companies generate that way is huge and they obviously have no resources to support it. Which, depending on the customer, means he either writes his own driver or abandons LabVIEW or the hardware in favor of a different product -> both cases result in a dissatisfied customer.

    Now, I do write VI libraries too and develop "drivers" regularly. Some of them are openly available, some even free, and I would hope that those libraries/drivers would not fall under your category of poorly written "LabVIEW SDKs". They definitely almost never use sequences, and if they do, it is for data dependency only and nothing else. ;)

    That there are people who still want to rewrite them may be true, but I would like to think that has more to do with the "not invented here" syndrome than anything else, and I have to admit that I have gone down that path at times in the past too.

    Rolf Kalbermatter

  15. QUOTE (tengels @ May 14 2008, 07:52 AM)

    When trying to communicate over a serial port I get a "port settings conflict" error in MAX.

    After selecting the correct serial settings (57600 baud, 8 bits, no parity, 1 stop bit) and saving the port settings, the device still doesn't work correctly.

    When I open a VISA session in MAX I see in "View all settable Attributes..." that the baud rate did not change to 57600 baud. Only when I write the correct baud rate in the VISA session with a property node (write) does the communication for that session work.

    However, when refreshing or leaving MAX the settings are changed back to 9600 baud.

    When programming in LabVIEW I can't get a reply back. It looks like the application uses the wrong baud rate.

    (I used several examples from NI where the baud rate is configured.)

    Has anyone encountered these problems before, and is there a workaround?

    I'm using LabVIEW 8.5

    MAX 4.2.1.3001

    NI-VISA 3.1

    Hope someone could help me

    Theo

    What is the serial interface? A USB to serial converter? If so, it may be a problem in its driver that VISA does not know how to deal with properly. I've seen strange behaviour with several USB to serial adapters in the past.

    Rolf Kalbermatter

  16. QUOTE (crelf @ May 14 2008, 01:41 PM)

    No worries :) If I remember correctly (and I usually don't), there are three colors: yellow = thread safe, orange = thread unsafe, white = LabVIEW can't determine, so it runs it thread-unsafe.

    I have never seen the white so far, and for the Call Library Node it wouldn't make sense anyhow. LabVIEW cannot determine whether an external shared library or particular functions in it are reentrant-safe. The programmer defines that in the Call Library configuration dialog (and if he says it is reentrant, the function in question had better be, or you are in for strange to rather nasty effects).

    Rolf Kalbermatter

  17. QUOTE (Michael_Aivaliotis @ May 15 2008, 03:12 AM)

    I think the best LabVIEW driver is not really called a driver at all, but is a specific Instrument Class using LVOOP.

    What is best and what is not is very debatable. LVOOP is most probably not such a bad thing, but such a driver restricts its use to applications that can and will use LVOOP. Also, just as with normal VI libraries, the usefulness and ease of use depend greatly on the person implementing that class. You can make a mess with (LV)OOP just as easily as with normal VI library interfaces, and in fact even more easily, since you need to understand OOP fairly well to really deliver easily reusable class implementations.

    I'm sure this is biased by experiences with some C++ code, which at best can sometimes be called horrible to understand, but it's nevertheless a real experience. It is also a result of my mind, which likes visual representation very much but has much more affinity with a functional interface than with some of the more complex OOP design patterns.

    Rolf Kalbermatter

  18. QUOTE (BrokenArrow @ May 13 2008, 01:41 PM)

    So Dan, you have experience seeing TCP being faster than Shared Variables? ;)

    I have seen the somewhat counter-intuitive behaviour of the TCP routines being a lot faster than Shared Variables in development mode, but once an EXE was made, the TCP approach only yielded a modest speed advantage. There's a lot going on under the hood of Shared Variables (variant VIs and whatnot), but maybe once it is compiled...?

    Not sure about shared variables, but TCP can be made fast in LabVIEW. And you do not even need to go down to the raw socket level. Just get a small VI from the NI site to disable the Nagle algorithm for a TCP network refnum, and you are rid of the delays for small data packets that make command-acknowledge type protocols slow (a sketch of what that essentially does follows below).
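
    For reference, disabling Nagle boils down to a single setsockopt() call on the underlying socket; here is a minimal sketch assuming Winsock, with the socket handle obtained from the TCP refnum by whatever means the mentioned NI VI uses (the function name here is just illustrative):

    #include <winsock2.h>

    /* switch off the Nagle algorithm so small packets are sent immediately
       instead of being coalesced while waiting for an acknowledge */
    int DisableNagle(SOCKET s)
    {
        int on = 1;
        return setsockopt(s, IPPROTO_TCP, TCP_NODELAY, (const char *)&on, sizeof(on));
    }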

    As for being compiled: as far as LabVIEW is concerned, there should be little difference between development system and runtime system performance. If there is a big improvement, the application builder would have to be doing something at the SV engine level, which would be very spooky at best.

    Rolf Kalbermatter

  19. QUOTE (Gary Rubin @ May 13 2008, 01:56 PM)

    I agree with that. If a VI is calling a device-specific DLL, I wouldn't call that VI a driver; the DLL would be the driver. I'd probably call a VI a driver if it was performing the hardware interface using only native LabVIEW and OS functions (i.e. memory peeks/pokes).

    I don't think you can draw the line that strictly. Very strictly speaking, the device driver is nowadays the piece of software that translates user application level requests into hardware-specific commands and address accesses. And that piece has to reside inside the kernel as a kernel-mode device driver, since that is the only way to directly access hardware in today's protected-mode OSes. However, talking to that kernel device driver directly is tedious at best, so such drivers usually come with a DLL that provides an easier to use API and can be considered part of the driver as well. But then I do not see any reason to exclude the LabVIEW VIs that access that API from being part of the driver either. After all, they translate the not-so-easy-to-use DLL calls into something that can be used much more easily in LabVIEW.

    And once you are there, why not qualify any collection of VIs that translates access to some form of hardware into something more LabVIEW-friendly as a driver too?

    I wouldn't go as far as calling VIs to access the normal OS API as drivers though, but that is an entirely arbitrary and subjective classification on my part.

    Rolf Kalbermatter

  20. QUOTE (maak @ May 12 2008, 02:29 AM)

    the brackets are all I need :thumbup:

    thanks

    It's good practice to always use brackets around table names and column identifiers. This will not only catch reserved words but also identifiers with embedded spaces, which SQL would otherwise treat as syntax separators.

    Rolf Kalbermatter

  21. QUOTE (Ami @ May 12 2008, 04:00 AM)

    I'm having a problem running multiple ActiveX instances using LabVIEW (apparently the problem occurs with more than 4 instances). This problem doesn't happen when I do the same thing in C (Visual Studio). I can create as many instances as I wish, but when I run methods that hang or run for a long period of time, only 4 are able to run at any moment. If I stop any one of the methods, the next one starts running. I attached an example (in LabVIEW 8.5.1) using the Excel ActiveX automation, but it happens with all of the ActiveX objects I have tried so far. It even happens when using several different ActiveX objects. Please note that the problem is not with creating the instances, but with running methods of the ActiveX in parallel at the same time (if you run short methods that finish executing fast, you won't notice the problem).

    [cross post: http://forums.ni.com/ni/board/message?board.id=170&thread.id=322765]

    You are likely running into threading limitations. LabVIEW by default allocates 4 threads per execution system (and in version 8.5, per CPU core), and when external code suspends execution, the calling thread is blocked until the external code returns. As long as you stay within LabVIEW code, LabVIEW will attempt to schedule multiple code sequences to run in parallel even if the 4 threads do not directly satisfy the demand, but once inside external code LabVIEW has no way of gaining control back from that thread to keep your program running across multiple external calls.

    The solution would be to avoid blocking calls to external code altogether, or to disperse the different calls into different subVIs and assign them to different execution systems, or to increase the number of threads allocated to your execution system in threadconfig.vi. These options are listed in declining order of recommendation, as they get more complicated to set up and maintain in the long run, and the last one will eventually burden the system with a load that may bring it to a halt.

    Rolf Kalbermatter

  22. QUOTE (BrokenArrow @ May 9 2008, 12:59 PM)

    All of these serial devices have idiosyncrasies that often just have to be shotgunned. Still looks odd to me, the Discard. Maybe Flush Buffer would work? But hey, if it works it works! :thumbup:

    Flush Buffer will delete data already in either the receive or transmit buffer. Discard Events will discard any events that might already have been queued for the current refnum. Those are two very different things.

    Rolf Kalbermatter

  23. QUOTE (kawait @ May 9 2008, 05:04 AM)

    #ifdef A_DLL_EXPORTS
    #define DLL_API __declspec(dllexport)
    #else
    #define DLL_API __declspec(dllimport)
    #endif

    DLL_API int ThisIsDllFunction();

    Unfortunately this does not specify the calling convention! You would need something like __stdcall in those declarations too (see the sketch below).

    If you are using Visual C, note that the default calling convention of the 32-bit Microsoft C compilers is actually cdecl unless the project settings override it, so to be sure check in the compiler configuration for your project or files which calling convention is used, and make it explicit in the declaration.
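
    A minimal sketch of how the quoted header could spell the calling convention out explicitly (assuming stdcall is what the Call Library Node will be configured for):

    #ifdef A_DLL_EXPORTS
    #define DLL_API __declspec(dllexport)
    #else
    #define DLL_API __declspec(dllimport)
    #endif

    /* the explicit __stdcall makes the convention independent of compiler defaults
       and lets the Call Library Node configuration be matched with certainty */
    DLL_API int __stdcall ThisIsDllFunction(void);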

    Rolf Kalbermatter

  24. QUOTE (netta @ May 8 2008, 12:59 PM)

    :thumbup: Thanks James.. Can't believe I missed that... I feel an utter fool now :unsure:

    Now I just need to figure out how to get all my dynamically called VIs into memory at the same time so that I can search them... hmm... Would a mass compile do it, or does it unload the VIs once it's done with them?

    I usually do this by creating a top-level VI and dropping in whatever dynamic VIs I have. Mass compile won't help, since LabVIEW unloads VIs during the mass compile as soon as they are not used anymore.

    The project doesn't help either because the VIs in a project are not yet loaded.

    Rolf Kalbermatter
