Everything posted by Rolf Kalbermatter
-
The internal kernel APIs used by the OpenG PortIO kernel driver were discontinued by Microsoft in 64-bit Windows. By the time I looked into this, Microsoft had also started to require signing for kernel drivers to be installable. I have no interest in buying a public certificate for something like $100 a year just to be able to sign an open source driver. Extracting the kernel driver from the DLL is in theory a neat idea, but in practice it only works if the user executing that function has elevated rights; otherwise kernel driver installation is impossible.
-
Well, the reverse engineering was quite some work, as I had to view many different existing CDF files and deduce their functionality. But creating a new one for the SQLite library itself should be straightforward by taking the one for lvzip and changing the relevant strings in the CDF file. The only other thing is that you need to generate your own GUID for your package, as that is how MAX identifies each package. There are many online GUID generators that can do that for you, and the chance that it will conflict with any GUID already in use is pretty much non-existent. Depending on whether you also want to support SQLite on Pharlap and VxWorks, you would have to either leave those sections in the CDF file or remove them. You can also handle dependencies, and depending on what libraries you need, they may already exist as packages from NI. Adding such a dependency to your CDF file basically means adding the relevant GUID and version number from that other CDF file to your own. That works quite well, but if NI doesn't have the package available already, things get more complicated: you either have to create an additional CDF file for each such package, or you could possibly add the additional precompiled modules to the initial CDF file. A CDF file can reference multiple files per target, so it's possible, but it can get dirty quickly.
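As a rough illustration of the GUID and string replacement part, here is a minimal Python sketch. It assumes the CDF file can be treated as plain text and that the package name and GUID can simply be substituted; the file names and placeholder strings are hypothetical, not the actual contents of the lvzip CDF file.

# Sketch only: copy an existing CDF file and stamp in a new package name and GUID.
# File names and placeholder strings below are hypothetical.
import uuid
from pathlib import Path

template = Path("lvzlib.cdf").read_text(encoding="utf-8")

new_guid = "{" + str(uuid.uuid4()).upper() + "}"   # chance of a collision is negligible

new_cdf = (template
           .replace("{OLD-LVZIP-GUID}", new_guid)  # assumed GUID placeholder
           .replace("lvzlib", "lvsqlite"))         # assumed package name strings

Path("lvsqlite.cdf").write_text(new_cdf, encoding="utf-8")
print("New package GUID:", new_guid)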
-
The LabVIEW deployment of VIs to a target does not copy shared libraries and other binary objects to the target system. (Well, it did for Pharlap targets, but that was not always desirable. Originally Pharlap's Win32 implementation was fairly close to the Windows NT4 and Windows 2000 API, and many DLLs compiled for Windows NT or 2000 would work without change on Pharlap too. But newer Windows versions diverged more and more from the limited Win32 support provided by Pharlap ETS, and unless you went to really great lengths, including NOT using newer Visual Studio versions to compile your DLL, you could forget about a DLL working on Pharlap. In addition, LabVIEW on Windows has no notion of Unix shared libraries, so NI simply removed the automatic deployment of such binary files to a target, even if the file was present in the source tree. The possible implications and special requirements of installing VxWorks OUTs and Unix SOs on a target system were simply too involved and prone to problems.)

So to get such files properly put on the target system you need a separate installer. NI uses the so-called CDF (Component Description Format) for their own software libraries (not sure if they developed that themselves or snatched it from somewhere 🙂). For the LV ZIP Toolkit I "reverse engineered" the NI CDF file structure to create my own CDF file for my lvzlib shared library. When you place that and the corresponding shared library/ies into a folder inside <ProgramFilesx86>\National Instruments\Shared\RT Images, NI MAX will see this CDF file and offer to install the package to the remote target, provided the supported targets in the CDF file are compatible with the target for which you selected the Install menu.

This works, but it is made extra complicated by the fact that a normal user has no write access to the Shared directory tree. So if you add these files to your VI package, you must require VIPM to be started as administrator. A further complication is that VIPM does not offer the "Shared/RT Images" path as a target option, so you would need all kinds of trickery with relative paths to get it installed there. I instead used InnoSetup to create a setup program containing those files, configured to require administrator elevation on startup, and I start this setup program from the PostInstall step in the package. But NI has slated CDF to be discontinued sometime in the near future, requiring NIPM packages to be used instead.
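Just to illustrate what such a PostInstall/InnoSetup step ultimately has to accomplish, here is a hypothetical Python sketch that checks for elevation and copies the files into the RT Images tree. The package folder and file names are placeholders, and this is not what the LV ZIP Toolkit actually ships; it only shows why elevation is required.

# Hypothetical sketch of a post-install step: copy a CDF file and the target
# shared libraries into the protected "Shared\RT Images" tree. Folder and file
# names are placeholders; the copy only succeeds when running elevated.
import ctypes
import os
import shutil
import sys
from pathlib import Path

def is_admin() -> bool:
    try:
        return bool(ctypes.windll.shell32.IsUserAnAdmin())
    except Exception:
        return False

if not is_admin():
    sys.exit("Administrator rights are required to write below Program Files (x86).")

rt_images = (Path(os.environ["ProgramFiles(x86)"]) / "National Instruments"
             / "Shared" / "RT Images" / "lvsqlite")      # hypothetical package folder
rt_images.mkdir(parents=True, exist_ok=True)

for name in ("lvsqlite.cdf", "liblvsqlite.so"):          # hypothetical file names
    shutil.copy2(Path("payload") / name, rt_images / name)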
-
How does VIPM set the VI Server >> TCP/IP checkbox?
Rolf Kalbermatter replied to prettypwnie's topic in LabVIEW General
The fact that VIPM requires a restart of LabVIEW is probably a good indication of what it does:
1) enumerate all installed LabVIEW versions
2) read the LabVIEW.ini in each installation location and extract the VI Server settings
3) if the user changes the settings for a LabVIEW version, update the LabVIEW.ini file and require a restart
On restart, LabVIEW reads the LabVIEW.ini file and everything is exactly as VIPM wanted it. Simple, and no hidden voodoo magic at all.
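A minimal sketch of steps 2 and 3, assuming the typical LabVIEW.ini location and the usual VI Server key names (server.tcp.enabled and server.tcp.port); treat the exact path and key names as assumptions and verify them against your own installation:

# Sketch: flip the VI Server TCP setting in a LabVIEW.ini file.
# The path and the server.tcp.* key names are assumptions based on a typical
# LabVIEW installation; check your own LabVIEW.ini first.
import configparser
from pathlib import Path

ini_path = Path(r"C:\Program Files\National Instruments\LabVIEW 2020\LabVIEW.ini")

cp = configparser.ConfigParser()
cp.optionxform = str            # keep the original key casing
cp.read(ini_path, encoding="utf-8")

lv = cp["LabVIEW"]
print("current:", lv.get("server.tcp.enabled", "<not set>"))

lv["server.tcp.enabled"] = "True"   # step 3: update the setting ...
lv["server.tcp.port"] = "3363"      # ... default VI Server port

with ini_path.open("w", encoding="utf-8") as f:
    cp.write(f, space_around_delimiters=False)
# LabVIEW only picks this up on its next start, hence the required restart.
-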
NI abandons future LabVIEW NXG development
Rolf Kalbermatter replied to Michael Aivaliotis's topic in Announcements
https://www.theregister.com/2021/01/05/qt_lts_goes_commercial_only/ One more issue with using Qt, unless you are fully commercial already anyhow. -
Yes, except that without WinSxS you had one problem; with WinSxS you end up with several more problems! 🙂 While maintaining binary compatibility across library versions requires some real discipline, abandoning that and trying to solve it with something like WinSxS simply gets you further down the rabbit hole.
-
Welcome back. Enjoy it!
-
XWindows is not the main problem, although it is admittedly not the easiest API for UI drawing. I think Qt would be a lot easier, but there you have licensing issues if you are not a truly open source project. XWindows as an API is pretty standardized and has very few of the problems Shaun alluded to. But XWindows is just a basic UI API. Once you want to tackle things that are more user related, you end up with a myriad of desktop management standards that are all fundamentally different, and while there have been some attempts to standardize things, they are usually not much more than recommendations that different projects adhere to in very different ways.

Another problem in the Linux world is the various system interfaces that sometimes change in the blink of an eye and can also vary greatly between Linux distributions. And even where they don't change entire subsystems just because they can, it is quite common to make binary incompatible API changes that often require specific versions of libraries and aren't upwards compatible at all. So if you want to install library XY you end up needing libraries Z, V, A and T, all in specific versions, except that library ZV needs some of these in different versions, and if your project depends on both XY and ZV you are screwed. That each of these Linux distributions uses its own packaging system doesn't help either: it's not just the repositories that vary but also the format, dependency tracking and all that. If you want to develop an application like LabVIEW that should support all major Linux distributions, you end up spending countless hours of debugging and code tweaking to work around these kinds of trouble, only to have to do the whole work again six months from now.
-
The problem with the argument that LabVIEW files would merge better if LabVIEW used a text based file format is that XML, and really any format that can represent more complex data structures, cannot easily be merged with current tools. It doesn't even fully work for normal text based languages, as the merge tool can NOT determine what to do when two code modifications occur at the same or immediately adjacent line locations in a text file. You end up with conflicts that have to be resolved manually. While that is doable, although not entirely painless, for normal text based languages, XML and similar formats pose an additional challenge, since a change in one element can cause changes in multiple line locations, often separated by many text lines. As soon as two different changes are somehow not totally independent in line location, you can't merge based on line information alone. In the worst case a single change can mean a modification right at the beginning of the file and another at the end; with such a change, any other change is potentially very problematic to merge. And just doing the standard conflict resolution for text files as it is done now, including both sections and letting the user manually decide which to use, would basically create a file twice as big, which would be pretty useless to try to merge manually.

So while converting LabVIEW binary files to XML or similar sounds like a good idea, it won't really solve the problem of merging in source code control. In a way the current situation is easier: you know it can't be done automatically, so you are forced to plan accordingly! XML may in that respect seem to promise something that could not be delivered anyway in most merge situations. To do any meaningful merging for anything but the most trivial modifications in XML format, one would have to create a diff/merge tool that fully understands XML and, to make matters worse, it should also be aware of the actual schema used in the XML file. Even then, creating an XML file from that merge result would be only partly helpful; one would really need an interactive merge visualizer that presents the XML data structure to the user directly in a suitable graphical form, such as a tree view, and lets the user select the relevant modification when there are logical conflicts.

As to creating external tools for dealing with LabVIEW files without the need to script things in LabVIEW itself, the idea sounds good, but I doubt it will be used a lot. Maybe a translation tool for different languages, but many such things are runtime features rather than edit time features.

About throwing away LabVIEW for non-Windows platforms: that is hardly a way to remain relevant in comparison to things like Python or, in a far, far future from now, maybe .Net. Python runs on pretty much every hardware you can imagine that has a C compiler available, from low end 8-bit microprocessors to extreme high end systems.

There are also mutually exclusive points in that list. "Stop adding new features to LabVIEW for several years" directly contradicts several other items in that list.
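As a small, self-contained illustration of why line based merging struggles with XML (the XML content below is made up for the example): renaming one element changes its opening and closing tags, which can sit many lines apart, so a diff/merge tool sees two seemingly unrelated edits.

# One logical change (renaming the FrontPanel element) produces edits in two
# widely separated lines, which a line based merge treats as independent hunks.
import difflib

filler = [f'    <Control type="Numeric" label="Ch{i}"/>\n' for i in range(20)]
original = ['<VI name="Main">\n', '  <FrontPanel>\n'] + filler + ['  </FrontPanel>\n', '</VI>\n']
renamed = [line.replace("FrontPanel", "Panel") for line in original]

print("".join(difflib.unified_diff(original, renamed, "a.xml", "b.xml")))
# The output contains two separate hunks for what is conceptually one edit.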
-
G Interfaces for LabVIEW 2020
Rolf Kalbermatter replied to Aristos Queue's topic in Object-Oriented Programming
To be honest, I would probably not put them anywhere in that view. It's called Class View for a reason. 🙂 It hadn't really occurred to me that you would want to have the non-class VIs visible in there. Is that a flaw or just out of the box thinking? -
Poll: Should the CLA Exam require applied knowledge of OOP?
Rolf Kalbermatter replied to Mike Le's topic in LabVIEW General
Case in point: LabVIEW NXG! -
NI abandons future LabVIEW NXG development
Rolf Kalbermatter replied to Michael Aivaliotis's topic in Announcements
Qt from a purely technological view might be a good choice. From a licensing point of view it might be quite a challenge, however: Qt | Pricing, Packaging and Licensing. That said, they do use a Qt library somewhere in the LabVIEW distribution. Not sure for which tool, but definitely not the core LabVIEW runtime. If it is for an isolated support tool, that is quite manageable with a single user license; for the core LabVIEW runtime, many of the LabVIEW developers would need a development license. And some of the Qt paradigms aren't exactly the way a LabVIEW GUI works. Aspect ratio accurate rendering would most likely end up even further from what it is now, with even bigger differences when loading a VI front panel on different platforms. -
NI abandons future LabVIEW NXG development
Rolf Kalbermatter replied to Michael Aivaliotis's topic in Announcements
The cross platform backend they definitely have already. It has existed since the early days of multiplatform LabVIEW, whose first release was version 2.5 on Windows. It was originally fully written in C and at times a bit hackish. The motto back then was: make it work no matter what, even if that goes against the neatness of the code. In its defense it has to be said that what is considered neat code nowadays was totally unknown back then and in many cases even impossible, either because the tools simply did not exist or because the resulting code would have been unable to start up with the constrained resources even high end systems provided back then. 4 MB of RAM in a PC was considered a reasonable amount, and 8 MB was high end and cost a fortune. Customers back then regularly complained about not being able to create applications that would do some "simple" data acquisition, such as continuously streaming multiple channels at 10 kS/s and graphing them on screen, on a PC with a "whopping" 4 MB of memory.

The bindings to multiple GUI technologies also exist. The Windows version uses the Win32 window management APIs and GDI functions to draw the content of any front panel, the Mac version used the classic MacOS window management APIs and QuickDraw, and later the NSWindow APIs and Quartz, while the Linux version uses the XWindows API for everything. These are quite different in many ways, and that is one of the reasons why it is not always possible to support every platform specific feature. The window and drawing managers provide a common API to all the rest of LabVIEW, and higher level components are not supposed to ever access platform APIs directly. Moving to WPF or similar on Windows would not solve anything; it would only make the whole thing even more complicated and unportable.

The LabVIEW front end is built on these manager layers. This means that LabVIEW basically uses the platform windows only as containers, and everything inside them is drawn by LabVIEW itself. That does have some implications, since every control has to be implemented by LabVIEW itself, but on the other hand a hybrid architecture would be a complete nightmare to manage. It would also mean that there is absolutely no way to control the look and feel of such controls across platforms. You have that already to some extent, since LabVIEW uses the text drawing APIs of each individual platform and scales the control size to the font attributes of the text as drawn on that platform, resulting in changed sizes of the control frames if you move a front panel from one computer to another. If LabVIEW also relied on the whole numeric and text controls as implemented by the platform, those front panels would look even more wildly different between computers.

And no, moving the front end into the IDE is not a solution either. Each front panel that you show in a built application is also part of the front end, so it needs to be present as core runtime functionality and not just as a feature bolted onto the IDE.
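To make the layering described above a bit more concrete, here is a tiny, purely hypothetical Python sketch of what a platform neutral drawing manager interface could look like. The class and method names are invented for illustration; this is not NI's actual design, only the general pattern of hiding each platform's API behind one common interface.

# Hypothetical sketch of a window/drawing manager layer: higher level code only
# talks to the DrawManager interface and never touches a platform API directly.
from abc import ABC, abstractmethod

class DrawManager(ABC):
    @abstractmethod
    def draw_rect(self, x: int, y: int, w: int, h: int) -> None: ...
    @abstractmethod
    def draw_text(self, x: int, y: int, text: str) -> None: ...

class GdiDrawManager(DrawManager):          # Windows backend (GDI)
    def draw_rect(self, x, y, w, h): print(f"GDI Rectangle({x},{y},{w},{h})")
    def draw_text(self, x, y, text): print(f"GDI TextOut({x},{y},{text!r})")

class XlibDrawManager(DrawManager):         # Linux backend (XWindows)
    def draw_rect(self, x, y, w, h): print(f"XDrawRectangle({x},{y},{w},{h})")
    def draw_text(self, x, y, text): print(f"XDrawString({x},{y},{text!r})")

def render_panel(dm: DrawManager) -> None:
    # "Front panel" drawing code is identical on every platform.
    dm.draw_rect(10, 10, 120, 40)
    dm.draw_text(20, 30, "Numeric control")

render_panel(GdiDrawManager())   # pick whichever backend the platform provides
-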
HTTP Post does not work
Rolf Kalbermatter replied to Thang Nguyen's topic in Remote Control, Monitoring and the Internet
HTTP Error 401: https://airbrake.io/blog/http-errors/401-unauthorized-error#:~:text=The 401 Unauthorized Error is,client could not be authenticated.&text=Conversely%2C a 401 Unauthorized Error,to provide any such authentication. Somehow your HTTP server seems to require some authentication, and you have not added any header to the request that provides such authentication.
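For illustration only (in Python with the requests library rather than the LabVIEW HTTP Client VIs, and with a placeholder URL and credentials), this is the kind of Authorization header the server is likely expecting:

# Illustration only: a POST that fails with 401 until an Authorization header
# is supplied. URL and credentials are placeholders.
import base64
import requests

url = "http://example.com/api/endpoint"
payload = {"value": 42}

# Without credentials the server answers 401 Unauthorized.
print(requests.post(url, json=payload).status_code)

# Basic authentication: "user:password", base64 encoded in the header.
token = base64.b64encode(b"user:password").decode("ascii")
headers = {"Authorization": "Basic " + token}
print(requests.post(url, json=payload, headers=headers).status_code)
-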
No. Static libraries are a collection of compiled object modules. The format of static libraries is compiler specific, and you need a linker that understands the object file format. That is typically limited to the compiler toolchain that created the static library! Some compilers support additional formats, but it is often hit and miss whether linking foreign libraries into your project will work.
-
G Interfaces for LabVIEW 2020
Rolf Kalbermatter replied to Aristos Queue's topic in Object-Oriented Programming
Well, it would be along the lines of Visual Studio then. There you have the sub-view, typically located at the top left, with the Solution Explorer, Class View and Resource View tabs (all configurable of course, so it may look different in your installation, but this is the default setup). And one could certainly claim that the LabVIEW project window drew quite a bit of inspiration from that Solution Explorer window in Visual Studio. 🙂 -
NI abandons future LabVIEW NXG development
Rolf Kalbermatter replied to Michael Aivaliotis's topic in Announcements
That could be because it's not available (and may never be). VHDL sounds like a good idea until you try to really use it. While it generally works as long as you stay within a single provider, it starts to fall apart as soon as you try to use the generated VHDL from one tool in another tool. Version differences are one reason, predefined library functions another. With the netlist format you know you'll have to run the result through some conversion utility that you often have to script yourself. VHDL gives many users the false impression that it is a fully featured lingua franca for all hardware IP tools. Understandable if you believe the marketing hype, but far from the reality. -
NI abandons future LabVIEW NXG development
Rolf Kalbermatter replied to Michael Aivaliotis's topic in Announcements
LabVIEW Community Edition (or your Professional license with the Linx Toolkit installed) can be deployed to Raspberry Pi and BeagleBone Black boards. It's still a bit of a rough ride sometimes, and the installation seems to fail for certain people for unclear reasons, but it exists and can be used. -
NI abandons future LabVIEW NXG development
Rolf Kalbermatter replied to Michael Aivaliotis's topic in Announcements
It's back online. Probably some maintenance work on the servers. -
I am taking a sabbatical from LabVIEW and NI R&D
Rolf Kalbermatter replied to Aristos Queue's topic in LAVA Lounge
Congratulations on your new endeavour. Working for a space program is one of those dreams every technically inclined boy probably has at some point. It actually even tops the dream of working for Lego. 🙂 -
Industrial EtherNet (EtherNet/IP)
Rolf Kalbermatter replied to siva's topic in Remote Control, Monitoring and the Internet
This thread is about EtherNet/IP, a specific protocol used in industrial automation. While it might be possible to install EtherNet/IP support on your PLC, there is nothing in the product description of Epec's controller hardware that mentions anything about it. Most of their controllers support CAN with CANopen, and only very few controllers even have an Ethernet port. All the literature mentions is that it allows integration into IoT networks, which would probably indicate that it supports some HTTP or WebSocket protocols, but it's hard to say for sure how they support Ethernet connectivity in their controller software. -
Yes, it could, if this driver was developed in LabWindows/CVI. And there are some parts that get installed with LabVIEW that were in fact developed in LabWindows/CVI and therefore will cause that runtime engine to be installed. The LabVIEW runtime itself does not need it, however, so it's logical that it does not install them. Where did you deduce that it needs the 4.0.1 version of the CVI runtime? That is an awfully old version, released about 25 years ago. I would guess that the DLL needs a newer version, unless it was released in the late '90s of the last century. Edit: I just see that the copyright is 1997 to 2001, so your version might in fact not be very far off. A newer LabWindows/CVI runtime should work too, though: LabWindows/CVI did not use version specific runtime libraries, and the functions were usually kept backwards compatible.
-
It's not really weird. The NI Analysis library is only a thin wrapper around the Intel Math Kernel Library (MKL), and that library does all kinds of very low level performance enhancing tricks. Part of that is that it determines at load time which CPU it is running on, how many cores it has and which SIMD extensions the CPU supports, to tune its internal functions to take maximum advantage of the CPU features for performance reasons. It so happens that this detection code trips over its feet when presented with an AMD Ryzen CPU, and rather than falling back to a safe but less performant option, it simply aborts the DLL loading, which in turn aborts loading of the LabVIEW wrapper DLL. LabVIEW doesn't know why the loading of the DLL failed, just that it did fail. Of course, anyone suspecting any malevolence in that failure to detect a competitor's CPU really must be ill minded. 🙂 Realistically, I suspect it wasn't malevolence, but the fact that it wasn't their own CPUs didn't quite make the urgency to fix it very high. And NI doesn't update the AA library every few months to the latest and greatest MKL release, as that has some serious implications in terms of testing effort.
-
If the driver itself consists of only this single DLL, that still does not exclude the possibility of dependencies, just a little different from what I deduced from your first post. Every C(++) compiler will use a C (and in the case of C++ also a C++) runtime library for all the standard functions that a programmer expects to be able to use. This runtime library is similar to the LabVIEW runtime engine. While it is possible to compile the relevant C runtime functions into the DLL itself, this is often not done (and is not the default in Visual C), so that the DLL does not contain redundant code that is available elsewhere on the system.

Now, most C compilers long ago started to use version specific C runtimes. If your DLL was created with Visual Studio 2010, for instance, it is set to depend on the Microsoft C runtime version 10.0. A Visual Studio 2012 executable or DLL depends on the Microsoft C runtime version 11.0. Each C runtime has its own set of DLLs that need to be present on the target system in order to load a DLL or EXE created in the corresponding Visual Studio version. Microsoft Windows comes pre-installed with some C runtime installations, and that can vary over time, since various Windows tools are compiled with various Microsoft C compiler versions. And LabVIEW and many of the NI tools are created in various versions of Visual Studio too.

So what most probably happens is that on a full LabVIEW IDE install, some of the installed libraries or tools use the exact same runtime version as your DLL. On a machine with only the LabVIEW runtime installed, that specific NI tool or library is not necessary and hence not installed, and therefore loading of your DLL must fail. Your DLL manufacturer needs to document which version of Visual Studio the DLL was created with, so that you can download the corresponding Microsoft C runtime installer from the Microsoft site (or they should provide that installer together with their DLL).
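A quick way to see this kind of failure outside of LabVIEW is to try loading the DLL directly; the sketch below uses Python's ctypes on Windows, with a placeholder path for the vendor DLL. If a required C runtime DLL (for example msvcr100.dll for Visual Studio 2010) is missing, the load fails with "The specified module could not be found".

# Sketch: attempt to load the vendor DLL the same way any process would and
# report the Windows error. The DLL path is a placeholder.
import ctypes

try:
    ctypes.WinDLL(r"C:\path\to\vendor_driver.dll")
    print("DLL and all of its dependencies loaded fine.")
except OSError as err:
    print("Load failed:", err)
    # Inspect the DLL's imports, e.g. with "dumpbin /dependents vendor_driver.dll"
    # from a Visual Studio command prompt, or with Dependency Walker.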