Rolf Kalbermatter

Everything posted by Rolf Kalbermatter

  1. So you expect Microsoft to let you download Visual Studio 2005 (or 2008, 2010, 2013, 2015) for free because somewhere you have a Windows XP machine running a very expensive experiment whose hardware drivers don't support anything newer? Bad luck, but that is not an option (unless you want to resort to bogus download sites where you have to hunt for the right download button among a few dozen others trying to get scareware or worse onto your system, and/or the download itself is bogus or even malware infested). You can, however, buy a Microsoft Visual Studio Professional subscription for ~1500 bucks a year, and then you can actually download (most) of them. Or you can of course download these packages while you have the license to access them and safely archive the installers somewhere. If you build such a specialist system and don't plan for having to reinstall the whole thing sometime in the future (on new hardware or after a fatal crash), you only did part of the job for that project! The same goes for LabVIEW (and much other software), if there is even a possibility to download older versions!
  2. That is a tricky recommendation, as you explain too. Unless it is documented that the DLL or a particular function is thread safe, I would normally refrain from calling the function in any thread. Also, your claimed speed improvement is very off, especially in this case. The main time loss is when the LabVIEW program has to wait for the UI thread to be available for calling the CLN. This is at worst in the order of a few ms, typically even significantly below 1 ms. Compared to the runtime of these functions, which for the capture does a GDI bitmap copy and some extra shuffling to get out just the actual pixel data, and for the MoveBlock() does a memory copy of significant size, this delay of arbitrating for the UI thread is pretty small. It's likely measurable with the high-resolution timer, but do you really care if this VI takes 25 ms to execute or 26 ms? Reentrant execution of a CLN gets really interesting performance-wise when you have a short-running function which is called over and over again in a loop. Here the overhead of arbitrating for the UI thread each time will be significant and can even become the single most dominant factor in the execution speed. There is one other aspect to not wanting to run a CLN in the UI thread. While LabVIEW is busy executing the CLN in the UI thread, this thread is not available to do anything else, including drawing anything on the screen and processing Windows events such as mouse and keyboard events. If the CLN takes a long time to execute (many hundreds of ms to seconds), the LabVIEW application will appear to be frozen during the call to this CLN (and in the worst case Windows will notice that the event loop hasn't been polled for several seconds and present the user with a dialog suggesting that the application is unresponsive and should probably be shut down).
  3. What does the documentation say about the pointer returned by the function? Is it allocated by the function? Is it a static buffer that is always the same (very unlikely)? If it is allocated by the function, you will need to deallocate it after use (after the MoveBlock() call), and the documentation should state what memory manager function was used to allocate the buffer and what memory manager function you should use to deallocate it; otherwise you create a memory leak every time you call this function. Ideally the DLL exports a function that will deallocate the buffer. A still usable solution is if it uses the Windows GlobalAlloc() function to create the buffer, in which case you would need to call GlobalFree(). Pretty bad would be if it uses malloc(). This is because the malloc() call that the DLL does might be linked to a different version of the C runtime library than the corresponding free() you will try to call in LabVIEW, and that is a sure way to create trouble.
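To illustrate the ideal case described above, here is a minimal sketch of how a DLL could export a matching deallocation function. The function names and signatures are hypothetical, not from the actual DLL in question:

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical DLL export: allocates a result buffer with the DLL's
   own C runtime (here plain malloc()). */
char *lib_get_buffer(size_t size)
{
    char *buf = malloc(size);
    if (buf != NULL)
        memset(buf, 0, size);   /* stands in for the real captured data */
    return buf;
}

/* Hypothetical matching export: frees with the SAME runtime that
   allocated, so no CRT mismatch can occur. */
void lib_free_buffer(char *buf)
{
    free(buf);
}
```

In LabVIEW you would call the first function through a CLN, copy the data out with MoveBlock(), and then pass the same pointer to the second function before discarding it.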
  4. Right-click on a string control (or constant) and select "Visible Items->Display Style". The control gets an extra glyph, marked with the red rectangle in Mikael's image, which indicates what display mode the control uses. Note that this changes ABSOLUTELY nothing about the actual bytes in the string. It only changes how the control displays them!
  5. Icons are the resource sink of many development teams. 😀 Everybody has their own ideas about what a good icon should be: their preferred style, color, flavor, emotional ring and what else. And LabVIEW actually has had a fairly extensive icon library in the icon editor for many moons. You can't find your preferred icon in there? Of course not; even if it had a few 10,000 you would be bound not to find it, because: 1) Nothing in there suits your current mood 2) You can't find it in that whole forest of trees
  6. Well, I have done a few minor updates to the library to fix a few minor problems with newer Windows systems. It's still active and used by some customers. What do you want to know about it?
  7. Oops, sorry! 😀 But you found it already. 👍
  8. Were you referring to this? https://lavag.org/profile/19157-jordan-kuehn/
  9. I would consider that a bug in VIPM. It should not rename the DLL and the linkage name in the VI in this case, or at least it should offer an option to disable that. But your solution with the Post Install works and is what I always do. VIPM always lacked a few features to properly handle 32-bit and 64-bit binaries. I haven't checked the latest version, but there was no way to mark files to be installed for 32-bit or 64-bit only, and even if the latest version supported that, I'm hesitant to use it as VIPM tends to create packages that won't be installable by older versions of it.
  10. A star before the file ending is translated to 32 or 64 respectively; a star for the file ending itself is translated to the respective platform-specific shared library file ending. Is your shared library name different between bitnesses? Otherwise you do not need a star before the file ending at all.
  11. If you make that libmuparser-lv.* you can leave the default file ending for the shared libraries. .dll for Windows, .so for Linux and .framework or .dylib for Mac OS. And you don’t rename the default name of libraries under Linux but typically create a symlink with whatever non-versioned name you need.
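A sketch of that substitution rule (my own illustration, not LabVIEW's actual implementation): a star before the file ending becomes the bitness, and a star as the file ending becomes the platform's library suffix:

```c
#include <stdio.h>
#include <string.h>

/* Expand a library name pattern: '*' before the extension becomes the
   bitness ("32"/"64"), '*' as the extension becomes the platform's
   shared-library suffix ("dll", "so", "dylib"). */
void resolve_library_name(const char *pattern, const char *bitness,
                          const char *ext, char *out, size_t out_len)
{
    const char *dot = strrchr(pattern, '.');
    size_t o = 0;
    for (const char *p = pattern; *p != '\0'; p++) {
        const char *sub = NULL;
        if (*p == '*')
            sub = (dot != NULL && p > dot) ? ext : bitness;
        if (sub != NULL) {
            size_t n = strlen(sub);
            if (o + n < out_len) {
                memcpy(out + o, sub, n);
                o += n;
            }
        } else if (o + 1 < out_len) {
            out[o++] = *p;
        }
    }
    out[o] = '\0';
}
```

So "myLib*.dll" with 64-bit LabVIEW resolves to "myLib64.dll", while "libmuparser-lv.*" resolves to "libmuparser-lv.so" on Linux.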
  12. It’s actually how Microsoft wants DLLs distributed with an application unless it is installed directly into the system (or GAC for .Net assemblies). For DLLs that get directly referenced by LabVIEW VIs it is not important as the application builder adjusts the path to whatever relative directory you told it to move the DLL but for other DLLs that LabVIEW and/or the application builder doesn’t know about it is important to place the DLL in a standard Windows searched location.
  13. NI SystemLink, NI TestStand, NI Measurement Link, NI MultiSim/Ultiboard (barely), NI LabWindows/CVI (pretty much not anymore), MATRIXx (not sure they still sell that other than to legacy customers), DIAdem (not actively sold anymore), and a few others that they have stopped working on. Other than LabVIEW, TestStand and SystemLink, NI has pretty much stopped all other software development, so part of the downturn is likely that sales of that other software are dwindling as existing customers jump ship and no new customers are coming aboard.
  14. Well, the whole NI=>Emerson transaction seems to go as follows: 1) Shareholders of Emerson have approved the deal. 2) Emerson created a wholly owned subsidiary in Delaware called Emersub CXIV, Inc. for the sole purpose of merging with NI. 3) Shareholders of NI approved the merger on June 29, 2023. 4) After all the legalities have been dealt with, National Instruments and Emersub CXIV, Inc. will merge into a new company under the name of National Instruments, and Emersub CXIV, Inc. will cease to exist. The end result is that National Instruments will most likely for a large part simply operate as is, a fully owned subsidiary of Emerson Electric, and for a lot of things keep operating as it did so far. What technical cross-contamination will eventually happen, if any, remains to be seen. You could probably compare it to how National Instruments dealt with Digilent and MCC when it took them over. They both still operate under their own names and serve their specific target audiences, and for a large part were unaffected by the actual change in ownership. There were of course optimizations, such as most of the MCC boards eventually being manufactured and shipped from the same factory that also produces NI hardware. Digilent has also eventually taken over some of the products from NI that were mainly meant for the educational market, such as myDAQ, but also the VirtualBench device, which they sell under a different name but which is 100% the NI VirtualBench device and also works with the same drivers.
  15. There is no simple fix for that. In order to draw anything on a Windows GUI you do need GDI objects. Every window is one, every subwindow, every icon or bitmap is one, every line or arc could be one, every text can be even two or three. You can create them, draw them to the device context and close them afterwards, but creating them costs time, so if you foresee using them again it's a pretty smart idea to keep them around instead of spending most of your program's execution time continuously creating and destroying them. The only thing LabVIEW programmers can do is to try to combine more operations into a single object and/or find the objects that are rarely needed and pay the runtime performance penalty of recreating them each time instead of keeping them around. Also, I'm pretty sure that 3D controls with alpha shading may look cool to some people but tend to increase the GDI object count substantially. Classic and Dialog controls are a lot easier to draw!
  16. And the Scan From String does a greedy pattern match; this means the first %s will eat the entire string, as any character matches %s, and then there is nothing left for the other 7 %s specifiers to process => error 85, as Scan From String can't satisfy the second %s. If you have a space as separation character, as it seems you have from the format you show, you should rather use the following Scan From String format string: %[^\s]\s%[^\s]\s%[^\s]\s%[^\s]\s%[^\s]\s%[^\s]\s%[^\s]\s%[^\s] For this you should also right-click on the string constant and enable '\' Codes Display. The format string basically says: take every character until you see a space and put this in the first output parameter, then consume the space, and repeat this for the remaining 7 output strings. Possibly 23 characters could also mean that each substring consists of exactly 2 characters plus the 7 separation characters. If so, you could also use: %2s\s%2s\s%2s\s%2s\s%2s\s%2s\s%2s\s%2s
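For readers more at home in C: LabVIEW's %[^\s] scanset has a close cousin in C's sscanf() (C scansets use a literal space instead of \s). A sketch of both parsing variants under that assumption, with hypothetical helper names:

```c
#include <stdio.h>

/* Scanset version: each %7[^ ] reads everything up to the next space,
   mirroring LabVIEW's %[^\s]; returns the number of fields parsed. */
int parse_fields(const char *line, char fields[8][8])
{
    return sscanf(line,
                  "%7[^ ] %7[^ ] %7[^ ] %7[^ ] %7[^ ] %7[^ ] %7[^ ] %7[^ ]",
                  fields[0], fields[1], fields[2], fields[3],
                  fields[4], fields[5], fields[6], fields[7]);
}

/* Fixed-width alternative: exactly two characters per field,
   mirroring LabVIEW's %2s. */
int parse_fixed(const char *line, char fields[8][8])
{
    return sscanf(line, "%2s %2s %2s %2s %2s %2s %2s %2s",
                  fields[0], fields[1], fields[2], fields[3],
                  fields[4], fields[5], fields[6], fields[7]);
}
```

Both variants split a 23-character line like "AB CD EF GH IJ KL MN OP" into 8 two-character fields.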
  17. You’re saying that an event registration terminal should behave like a tunnel? They are not the same thing, as the name already says (and the symbol shows)! It’s also more like a shift register, although the right side of a shift register can’t be left unwired. And the behavior makes sense. For tunnels it can be a pain to have to wire each output, but it’s also desirable that they can be easily “reset”. And that behavior was defined ca. 1986! For event registrations it is almost never needed to connect the right side terminal (I did that maybe once or twice in my entire LabVIEW programming career). The alternative would have been to make only a terminal on the left side, but then you couldn’t dynamically register an event. Inconsistent? Maybe, if you take a purist view; useful? Definitely! And I can assure you the LabVIEW team debated over this probably for several weeks if not months before deciding on the current solution. The only improvement I could think of nowadays is to allow hiding the right side terminals!
  18. That would be a management nightmare. There would be no way to handle more than one event registration subscribing to the same event generator without very involved buffer management. By placing the event queue into the event registration object it is fairly simple: 1) Hey, here is my event queue, post any event you get to it. 2) I only have to check that queue to see if there is any event to handle, not some other queue that might or might not contain events that I already processed but that need to remain in it because not all event registrations have yet gotten around to reading them. There is a relatively cheap check needed in the event generator to see if the event registration queue is still valid before trying to post a new event to it, but that’s much easier than turning the single queue into a pseudo random access buffer for use with multiple registrations. Similarly, the event registration needs to check that the event generator is still valid when trying to deregister itself from it on its own destruction, but that’s again a fairly simple Is Not an Object/Refnum check.
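A toy model of that design in C (my simplification of the scheme described above, not LabVIEW internals): the generator only holds pointers to registrations, each registration owns its own queue, and the generator performs the cheap validity check before posting:

```c
#include <stddef.h>

#define MAX_REG   4
#define QUEUE_LEN 16

/* Each registration owns its private event queue. */
typedef struct {
    int events[QUEUE_LEN];
    int count;
    int valid;               /* registration refnum still open? */
} Registration;

/* The generator only keeps pointers to its current registrations. */
typedef struct {
    Registration *regs[MAX_REG];
} EventGenerator;

/* Post an event into every still-valid registration's queue. */
void post_event(EventGenerator *gen, int event)
{
    for (int i = 0; i < MAX_REG; i++) {
        Registration *r = gen->regs[i];
        if (r != NULL && r->valid && r->count < QUEUE_LEN)
            r->events[r->count++] = event;  /* cheap check, then enqueue */
    }
}
```

Because every registration has its own queue, one consumer reading its events never disturbs another registration subscribed to the same generator.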
  19. That would make little sense. That event queue is owned by the registration refnum, and the registration refnum registers simply a callback in the event object. The event is able to have multiple callbacks registered as it "queues" them up internally. That's what makes it possible to have multiple registration refnums connected to the same event object. That makes resource management fairly easy, avoids polling and objects owning other objects. Once the event goes away it simply will stop posting any events to the registered callback (it forgets about the callback since it is non-existent). The event queue in the event registration object still exists until that event registration object itself is destroyed. And if you close the event registration refnum earlier it simply deregisters itself from the event.
  20. It's not a bug and there is nothing that could be fixed here. The user event is not the same as the event registration! The user event is an event object that manages events in an internal FIFO queue to pass the event over to the event wait node AND/OR to any number of attached event registration handlers. When it is registered with the register node, a new independent object is created that contains its own event queue and that event registration is then connected to the event with an event registration handler that simply passes the event to this queue in addition to the internal event queue. Once the event is gone, all the events that are stored in the internal queue of the event are of course gone, but the events also were passed to the queue of every attached event registration queue. That queue remains valid until the event registration refnum is closed itself. The event registration is special as it can attach to any object that can expose an event interface, such as ActiveX objects, or .Net classes too (if they have defined events). Also LabVIEW front panel objects actually can provide an event interface and their refnum can be connected to a Register Event node.
  21. crossrulz is very correct. VISA only supports TCP/IP, and that in two flavors. The first, with the ::INSTR postfix, follows the VXI-11 standard, which provides a resource discovery service (a la Bonjour) on the device that allows NI MAX to query for connected devices and their specific settings. The second variant uses the ::SOCKET postfix and is raw TCP/IP communication; in that case the resource specifier also needs a port number in addition to the IPv4 device address. UDP communication in LabVIEW is only possible with the native UDP nodes in LabVIEW. No NI MAX will be harmed, nor in any other way involved, in this!
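For reference, the two TCP/IP resource string flavors can be composed like this (the address, port and the TCPIP0 board index are placeholder values; the helper names are mine):

```c
#include <stdio.h>

/* VXI-11 style resource: discoverable instrument, no port needed. */
void make_instr_resource(const char *addr, char *out, size_t len)
{
    snprintf(out, len, "TCPIP0::%s::INSTR", addr);
}

/* Raw socket style resource: plain TCP/IP, port number required. */
void make_socket_resource(const char *addr, int port, char *out, size_t len)
{
    snprintf(out, len, "TCPIP0::%s::%d::SOCKET", addr, port);
}
```

Either string is what you would wire into a VISA Open in LabVIEW, depending on which flavor the instrument speaks.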
  22. Those handle functions only really worked on the Mac, since they were actually creating native MacOS handles, not the LabVIEW-own flavor of them that was a sugar layer around the rather arcane Macintosh memory manager calls. Yes, the Macintosh memory manager was famous for being not very performant and quite crash sensitive. It sort of worked for most Macintosh applications, but LabVIEW really was considered a stress test for it. It also regularly stressed the Macintosh Programmer's Workshop C compiler that the LabVIEW team used to build the product. It was quite a regular occurrence that NI had to ask Apple support to build a new version of it with bigger symbol table space, in order for the compiler not to fatally crash during the build of some of the larger C source files. When the multiplatform version was started, some of the real cracks in the LabVIEW team actually preferred to develop and debug on the Sun Solaris version of LabVIEW, since the compiler and debugger tools were a lot more powerful there, even though everything was in fact command line based. And yes, LabVIEW 2.5.x was Windows 3.1 only. It was never tested nor built to run under Windows 95 or later, which would have been very difficult considering that it was released in fall 1992. The first version supporting Windows 95 natively was AFAIK LabVIEW 4.0. LabVIEW 3.1 or thereabouts did however have a version that could be installed on Windows NT 3.1.
  23. I used to have a whole box of disks and CD-ROMs of old versions, though nothing before 2.2.1, which was Mac only. But after 30 years and 3 major relocations, almost all of this has eventually ended up in the big round archive called the trash bin. It's just not practical to hold onto these forever, and that stuff actually weighs quite a bit altogether. Not fun to carry around, and also not fun to fill up storage space (of which I never seem to have enough).
  24. Why do you think you need to downgrade NI-VISA? Does something not work? Have you tried anything? While you indeed usually need an NI driver version that is not more than 3 years newer than the LabVIEW IDE you are using, that is somewhat less of a problem for NI-VISA. First, most NI-VISA functionality in LabVIEW is built in, so there are not a lot of VIs that an NI-VISA installer could add to your LabVIEW installation. Maybe the example VIs, but I'm not even sure about that. Second, the NI-VISA API itself has been very stable over many years and there is little reason to believe that you would run into compatibility issues with a bigger version difference. Basically, on the development system where you run the LabVIEW IDE you should make sure that the versions do not differ too much, but on remote systems where you only intend to execute built applications almost every version of VISA will work, although it is always a good idea to use at least the same version as you used to develop/build your program, or newer.
  25. Post your LabVIEW project for the DLL, the DLL itself, and its header. And please do a "save for previous" for 2020 or earlier.