Everything posted by Rolf Kalbermatter
-
LabVIEW portable pointer sized element
Rolf Kalbermatter replied to Taylorh140's topic in Calling External Code
No! What LabVIEW passes to the DLL in this case is the lower 32 bits of the 64-bit value. LabVIEW does not know variable-sized numerics, for a number of reasons. So a pointer-sized integer parameter in the Call Library Node is ALWAYS represented as a 64-bit integer in LabVIEW, totally independent of the OS and LabVIEW bitness. The Call Library Node generates the necessary code to pass the right part of the 64-bit integer to the actual parameter, but on the LabVIEW diagram (and in front panel controls) it is ALWAYS a 64-bit integer.
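For illustration, the C side of such a parameter could look like this (a hypothetical Windows DLL export, not from any specific library):

#include <stdint.h>

/* Hypothetical DLL export taking a pointer-sized integer. uintptr_t is 32 bits
   wide in a 32-bit process and 64 bits wide in a 64-bit process. With the Call
   Library Node parameter configured as "pointer-sized integer", LabVIEW passes
   the correct width to this function, while the wire on the diagram and the
   front panel control always remain a 64-bit integer. */
__declspec(dllexport) void SetBufferAddress(uintptr_t address)
{
    (void)address;   /* use the address here ... */
}
-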
LabVIEW portable pointer sized element
Rolf Kalbermatter replied to Taylorh140's topic in Calling External Code
Depends what you mean. The Call Library Node lets you configure parameters as pointer-sized integers to pass data to DLLs that expect pointer-sized values. On the diagram and front panel you always use a 64-bit (un)signed integer control. That should work for almost everything. The only case I can think of where you might need a control that changes size depending on the platform is if you want to pass clusters to a Call Library Node that contain pointers. But LabVIEW is a very strictly typed programming environment and doesn't know such a control. The only solution for that is the Conditional Disable Structure, where you have to program the 32-bit and 64-bit versions in different frames and let LabVIEW choose the right one.
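As a hypothetical illustration (made-up names) of why such a cluster can't have one fixed LabVIEW representation, consider a structure like this on the C side:

#include <stdint.h>

/* The pointer member is 4 bytes in a 32-bit process and 8 bytes in a 64-bit
   process, so both the size of the structure and the member offsets change
   with the bitness. The matching LabVIEW cluster therefore needs a U32
   placeholder in 32-bit LabVIEW and a U64 in 64-bit LabVIEW, hence the two
   frames of the Conditional Disable Structure. */
typedef struct {
    int32_t  channel;
    double  *samples;      /* pointer: platform-dependent size */
    int32_t  numSamples;
} AcquisitionInfo;
-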
That's about right. I mentioned the 10-year catch-up game NI has to play with LabVIEW. The full 64-bit compiler support in LabVIEW 2009 was one of the last architectural projects NI did on LabVIEW Classic. After that they did not really do much more than add some peripheral libraries, fix bugs and make minor improvements to core functionality of the LabVIEW kernel. Anything that structurally touched the LabVIEW core code base was deferred to LabVIEW NXG. Even the Map and Set datatypes in LabVIEW 2019, while a great addition, were for the most part a completely separate code base that required no significant changes to the LabVIEW kernel.

The problem NI had/has with much of the LabVIEW code base is that some of the most fundamental code modules come from a time about 30 years ago, when C++ was simply not an option at all, when NI had to fight with symbol tables exceeding the C compiler's available memory, and when 8 MB of RAM was considered by many users an unacceptably high minimum hardware requirement. This code is inherently C, and littered with global variables that all had to be protected with mutexes when multithreading was introduced in LabVIEW 5. LabVIEW NXG was about getting rid of that entirely by replacing the whole UI system with something more modern. In hindsight I think it was overambitious: NI underestimated the impedance mismatch between the .Net environment and the LabVIEW system, and didn't anticipate the vocal pushback from non-Windows users about the fact that LabVIEW on top of .Net was likely never going to support anything other than Windows, despite repeated claims that it COULD be made to work on other platforms. But COULD in lawyer speak usually means WON'T, and a lot of those non-Windows users are academics who know a thing or two about language semantics too.
-
To be precise: NI uses whatever the locale on the machine is. For Windows this is typically the normal single-byte codepage for most Western installations and a multibyte codepage for some of the Asian languages, so LabVIEW can in principle deal with multibyte encodings. Linux has for many moons defaulted to UTF-8 as the desktop locale in most standard distributions. UTF-8 being in fact another multibyte encoding, this means that LabVIEW on Linux uses UTF-8 by default, by virtue of the pretty much completely transparent text handling in the OS standard C library. But there are areas in the LabVIEW code that have traditionally made certain hardwired assumptions that do not work well with multibyte character sets, so you are bound to see weird effects sometimes.

That said, Windows has had a Beta feature since about Windows 10 that allows setting the locale to UTF-8 instead of one of the standard codepages, but it is not for nothing that it is still Beta. For some reason Microsoft did not feel it warranted the effort to iron out certain problems that still exist, or it may even be pretty much impossible without more or less completely rewriting the Windows text APIs, due to some deeply ingrained code assumptions. LabVIEW certainly is not without blame either. The code dealing with text handling was mainly written in the early '90s, at a time when Windows only had codepages and Unicode UTF-16 support was still an optional feature that you had to install explicitly and that worked through a completely separate API, rather than trying to shove a multibyte encoding based on the Unicode standard under the traditional text management. So while there might be certain code sections under Linux that deal with UTF-8 more appropriately, this need hadn't arisen on Windows before NI decided to embark on the NXG adventure and deferred pretty much all improvements to existing code features in LabVIEW Classic to be solved in NXG. With NXG gone, they basically have a 10-year catch-up game to play.
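A small C illustration of the kind of hardwired assumption that breaks (assuming a UTF-8 locale, as on a typical Linux desktop):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <locale.h>

int main(void)
{
    const char *s = "Gr\xC3\xBC\xC3\x9F" "e";       /* "Grüße" encoded as UTF-8 */

    setlocale(LC_ALL, "");                          /* pick up the system locale, e.g. en_US.UTF-8 */
    printf("bytes: %zu\n", strlen(s));              /* 7: what byte-oriented code sees */
    printf("chars: %zu\n", mbstowcs(NULL, s, 0));   /* 5 under a UTF-8 locale; a single-byte
                                                       codepage would count the same buffer
                                                       as 7 "characters" */
    return 0;
}

Code that equates byte count with character count, or that indexes into the middle of such a buffer, is exactly where those weird effects come from.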
-
While Microsoft has managed C# and .Net better than NI did LabVIEW, I'm still not quite sure if I should sell my soul to them! 🙂
-
Is it that simple? If you leave a gun, which you have a license to own, lying around openly and someone else uses it to kill someone, you are still at fault too and can be prosecuted for that, not usually for the murder itself, but there are exceptions too. It will also probably depend a bit on what sort of lawyer you can afford! Even if a judge eventually decides that your friend is the wrongdoer and therefore liable, you might still have to deal with the hassle as they come after you to collect the facts needed to charge that friend. "I didn't know he was going to use that test program to test this new gadget he wanted to sell!" sounds in any case very weak!
-
If that person owns a commercial LabVIEW version at the time she/he uses your toolkit for their commercial work, all is fine. If they don't, it is they who are in violation. The example I actually had in mind is different: if you use the LabVIEW Community Edition to develop a test program for a friend, who then uses it to test a product he sells in any form, then you (more correctly he, but you could be held liable too) would be in violation, unless that friend has a commercial license to run and/or build the test program.
-
Yes, it is virtually unthinkable that you could create a YouTube channel based on LabVIEW tutorials that brings in millions of dollars. As such it's more of a legal thought exercise than anything else. But the overall idea of the Community Edition license is: if you or someone you know makes money from your use of the Community Edition, you are in violation of the license! If that turns out to be a free meal for you, or some other such peanuts, NI would never even bother to start up their legal engine for it, but if it gets significant you will be in very hot water. A cease-and-desist letter is cheap, but it can be the end of a small business in the blink of an eye.
-
Don't get me started on JavaScript. Maybe if you absolutely want to program a Discord bot for some reason you can't really get around it, and if you want to botch your HTML documents you can use it too. But other than that, JavaScript uses the syntax of Java with the logic you had in Basic. Not a single provision to try to prevent the programmer from writing absolutely bad code, just like what we had with Basic.

The Community Edition is technically already a subscription model, just with a subscription fee of $0 and the limitation that you are not allowed to use it for anything that earns you money. It expires at the end of the year and you have to reactivate it. If you use it to keep your LabVIEW knowledge up to date, that is fine; if you create free open source libraries to distribute, that is fine too. If you automate your doorbell, weather station or model train layout at home, that is also allowed. But creating a toolkit that you sell, or teaching a course for which you get paid, is not a valid use case for the Community Edition. The same goes for writing a test application for your or someone else's company that tests products that are then sold. An interesting corner case would be creating tutorials for your YouTube channel. If that YouTube channel ends up being watched a lot and earning you more than a few bucks per year (as in a two-digit amount; just a number I came up with, but I'm sure it is about what NI would still consider acceptable), you are violating the Community Edition license conditions.
-
I agree with all you said, but I don't see Python as a viable competitor to LabVIEW. It is getting ubiquitous in many ways, being on every IoT device nowadays and having libraries for just about anything you can think of, even more than Java ever had. But it is still interpreted (yes, I know about JIT, but that is beside the point), and I really hate doing UI programming in scripts; otherwise I might have stayed with LabWindows/CVI. And while crunching large data sets in LabVIEW can be challenging due to the data copies forced by dataflow, trying to do the same in Python generally causes me to fall asleep or check whether the computer hasn't frozen up. It can also be a challenge to find the pearls among the many Python libraries. A lot of what is posted out there is mediocre and shows that the person who wrote it didn't really understand what they were doing, but for some miraculous reason managed to get to a point where it did something.

If anything, I see Python mainly as competition to .Net. While not the same in many ways, they are much closer to each other than either of them is to LabVIEW. The advantage (and disadvantage) of .Net is that Microsoft is behind it. They are powerful enough to make or break a platform, but at the same time they do not want to lose control of it, which has caused some strange acrobatic decisions in the past about announcing it would be open sourced but not quite doing it anyway.
-
Yes, built applications will continue to work like they have so far. For the LabVIEW runtime this means you can install it and your executable on any computer you want and run it. Any runtime licenses, such as for IMAQ Vision etc., will still be per-machine licenses and not leases, just as they are now. The Real-Time and FPGA Modules will also be software leases, but you only need them to write and build the compiled code; after that you can run it on any licensed hardware just as you do now. Deployment will of course only work from a licensed LabVIEW development system with an activated Real-Time Module, but once it is deployed, the target can run for as long as you want, without limitations from a software lease. If they ever change that, it will be the day our company definitely stops using LabVIEW. We cannot have a machine stop execution because the NI license server takes a vacation or some software lease has expired.

The Debug Deployment licenses of the various NI software products will also stay perpetual licenses. They are in principle the same as the development system, but you only have the right to use them to debug an application, not to write and build it. They are meant for the computers on the factory floor where you want to run your built application but may also need to be able to execute the VIs in source code, to debug problems on the real hardware instead of on your development machine.
-
You still have the problem of the variant embedded in your datatype. JSON is inherently a string format with some indications to distinguish numbers from strings, but nothing more. A variant is an AnyType. What you are trying to do is: convert this string (which from the JSON syntax can only be inferred to be a numeric, object, list or string) into Anything. Anything what? The JSON library could elect to convert strings into a string, numerics into a double and anything else into nothing, but that will break down as soon as you want to use that variant as the specific type you pushed into the incoming variant in the beginning. The JSON string simply lacks any information about the actually wanted type in the variant, and therefore you cannot convert it into a variant without losing significant compatibility and functionality. This is a fundamental problem that cannot be solved by throwing more code at it, but only by rethinking the way you want to save your configuration data.
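A little C-style sketch of the information loss (purely illustrative, not modeled on any particular JSON library):

/* All a JSON parser can ever know about a value is one of these coarse types. */
typedef enum { JSON_NULL, JSON_BOOL, JSON_NUMBER, JSON_STRING, JSON_ARRAY, JSON_OBJECT } json_type;

typedef struct {
    json_type type;
    double    number;   /* "42" could originally have been a U8, I32, U64, enum, timestamp... */
    char     *string;   /* "1.5" could have been text, or a double formatted as text          */
} json_value;

/* To reconstruct the original variant you would also need the intended LabVIEW
   type descriptor, and that is exactly what the JSON text does not carry. */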
-
Me neither, and I can think of no use case at all for something like that. LabVIEW <-> Java bindings are not straightforward and I cannot imagine anyone wanting to make that effort for something like log4j, or even for a more complex Java module that somehow uses log4j.
-
You can always dream 😀. But what would be the incentive to lower prices once everyone who still wants to use the platform is locked in? A competitor maybe, but who would that be? There is nobody in sight, unless you consider jumping ship completely and going with other solutions, maybe even Python, which is nice but not an option for a lot of the work done with LabVIEW nowadays.
-
Well, it depends a little on the duration of use. With the perpetual model you paid a hefty initial price and then a yearly SSP fee for as long as you wanted to be able to use newer LabVIEW versions and have access to technical support (which for a few years has been next to useless, but it seems to have improved considerably in the last year). With the subscription model you pay a lower initial price but a higher yearly subscription price than what the SSP used to be. If you have a current SSP you can initially transition to a subscription license for the cost of your current SSP (and lock that in for up to three years), but after that is over you pay significantly more per year than you did with the SSP. It seems the break-even point is at about 4 years. If you intend to use LabVIEW for less than that, you pay more with the perpetual model; after that, the total cost of the subscription gets higher than a perpetual license plus yearly SSP payments. In a way this sounds like NI is creating incentives to use LabVIEW on a pure project basis and forget about it once the project is done. The software lease costs on average about double what the SSP cost, and the perpetual license without the included first year of SSP about double what the software lease costs. But you can't buy a perpetual license without at least 1 year of SSP.
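A quick back-of-the-envelope check of that break-even point, using only the rough ratios above (purely illustrative, not actual prices): call the old yearly SSP amount S. The subscription then costs roughly 2S per year, and the perpetual license roughly 4S up front plus S per year of SSP. After n years the perpetual route has cost about 4S + nS and the subscription 2nS; they cross where 4S + nS = 2nS, i.e. at n = 4 years.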
-
Actually it is! If you stop paying your SSP fees you can still continue to use the LabVIEW version that was current when you stopped paying. With the subscription model, if you stop paying, every software version stops working at the end of your subscription term. No loading up an old program to fix that little bug that would anyhow be too much of a hassle to port to the latest and greatest LabVIEW version: any LabVIEW version you have installed simply stops working. You could of course then install LabVIEW under the Community Edition license, but that violates the agreement you entered into when signing up for the Community Edition if your program is used in any professional capacity. And the Community Edition does not include things like RT, FPGA and just about every other component with a paid license, including the Application Builder. If NI decides to shut down their license server altogether, or for certain older versions of LabVIEW, you are equally hampered. It's unlikely that they will let you activate LabVIEW 2009 indefinitely under a subscription model; I'm not even sure you can activate older versions at all if you sign up for a subscription now.

Yes, this is where every software provider is heading nowadays. Greater revenues and user lock-in are tempting. Once they have a user, be it for Office 365, Altium or now LabVIEW, that user only has the choice of continuing to pay or stopping to use the software platform altogether, with all the hassle of trying to port existing solutions to a different platform that simply does the same anyhow. So the challenge is to get people to sign up and start using it; after that it is a safe business that is not easily going away, unless you totally start to f*ck over your users. Raising prices? If you do it in regular small steps, you are not likely to lose many users, except for new ones! Nobody wants to end up with Office documents that they can't open anymore!
-
If you are in the NI Partner program you should have received an email yesterday. If not, you will need to wait a little longer or ask whoever in your company signed up for the NI Partner program about this. The "like" in the message from Gleichmann may have been meant in a sarcastic tone. For most NI partners it should have little direct impact, as they normally have a LabVIEW lease anyhow.
-
Thanks for the feedback. It's very much as I suspected. Network communication in LabVIEW is not really that complicated; in fact it's easier than doing the same in C. The problem is when you have to do higher level protocols such as HTTP or something even more specialized. There you have the choice of implementing the underlying protocol entirely in LabVIEW, which tends to be a major project in itself, or relying on some intermediate library like the LabVIEW HTTP Client Library, which is in essence simply a thin wrapper around libcurl. The first is a lot of effort since HTTP is quite a complex protocol in itself, with many variants depending on server version, authentication level and such. The second hides most of those details entirely, to the point where you can't access them anymore.

As a case in point, I recently had to do some X Windows configuration for an RT target. Possible options:
- call the xrandr command line tool
- call the X11lib shared library to do the relevant things directly
- call the xcb shared library instead to do the relevant things
- implement the X protocol directly in LabVIEW

I ended up using option 1, simply because it was the quickest, but just for lolz I also tried the last option and got some experimental code running. Now, the X Windows protocol is extensive and it would be a really serious effort to make something that is reasonably functional. Another complicating factor is the authentication, because that always involves some more or less obscure cryptographic functions. Even the xcb shared library, while implementing everything else from scratch (and nowadays it is normally used as the backend for X11lib), relies on the original auth code from X11lib for this functionality rather than trying to reimplement it itself.
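For what it's worth, option 1 boils down to something like this minimal C sketch (the output name and mode are hypothetical; in LabVIEW the equivalent is a System Exec call):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* query the current output/mode configuration */
    FILE *p = popen("xrandr --query", "r");
    if (p) {
        char line[256];
        while (fgets(line, sizeof line, p))
            fputs(line, stdout);
        pclose(p);
    }
    /* set a mode on a (hypothetical) output */
    return system("xrandr --output HDMI-1 --mode 1280x1024");
}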
-
I worry about NI hardware controller release
Rolf Kalbermatter replied to Thang Nguyen's topic in Hardware
Microsoft sells three levels of Windows IoT Enterprise: Entry Level, Value Level and High Level. They supposedly relate to CPU performance, so I guess the number of supported cores and maybe other things like hyperthreading, the amount of supported memory, etc. The exact details seem to be available only under NDA. -
I worry about NI hardware controller release
Rolf Kalbermatter replied to Thang Nguyen's topic in Hardware
It depends on which Windows IoT you want to use. There are two different versions, I believe:
- IoT Core: only supports UWP applications, no Win32 applications; also used on ARM platforms. LabVIEW, being an x86/64 Win32 application, can NOT run on this at all, regardless of whether you try the IDE or built EXEs.
- IoT Enterprise: the full Windows platform with streamlined distribution and settings, only available for x86/64 hardware. LabVIEW applications can run on this. -
Pyhton or C to do For Daq and DSP?
Rolf Kalbermatter replied to Mahbod Morshedi's topic in Calling External Code
Despite having created LabPython about two decades ago, I have always preferred to go with C. LabPython (and Python itself too) is also written in pure C. One reason, I think, is that Python is, like LabVIEW, a high level programming language. What I could do in Python I could also always do in LabVIEW, but certain things are simply not really possible (or at least not with reasonable effort) in either of them and require a lower level language. But C(++) is quite a different thing to work in, for sure. It gives great power and control, but that comes with great responsibility too. While you have to really try hard to crash a LabVIEW or Python program, it's a matter of seconds to do that in C(++). This means programming in C is a little different than doing the same in LabVIEW or Python. If something goes wrong in your C program or library, it is often not just an error code that is returned: your test program simply dies without any warning, question or dialog asking if you would maybe like to save your intermediate results. In LabVIEW you typically get an error cluster out, look at it, determine where the problem is, fix it and start your program again, without any need to restart LabVIEW itself or sometimes even the whole computer just to be sure. Once you are used to that, it is not really much different anymore, but it is certainly something to be aware of before making the decision.
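A trivial, deliberately broken example of what I mean:

#include <stdio.h>

int main(void)
{
    int *p = NULL;
    /* In LabVIEW or Python a mistake like this surfaces as an error cluster or
       an exception you can inspect; in C the process simply dies on the spot
       with a segmentation fault, no dialog, no chance to save anything. */
    printf("%d\n", *p);
    return 0;
}
-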
We have noticed in the last few years that the formerly outstanding NI technical support quickly deteriorated to the level of standard untrained technical support that call centers in some low-income countries often provide. However, I have to say that this trend seems to have reversed recently. I had three technical support questions in the course of about one year now; none was standard, and they involved things that were simply not possible because the feature was officially not present. The support people were very helpful in trying to find workarounds, and in two cases they even provided solutions based on information obtained directly from the product owners and developers, accessing the feature through behind-the-scenes configuration files and APIs. In both cases this came with the strong warning that it was not the officially sanctioned way of doing things and that there was a real chance it might break in future versions, but that it was at the moment the best that could be done.
-
NI-DAQmx Error -50103: The Specified Resource is Reserved
Rolf Kalbermatter replied to Jim Kring's topic in Hardware
It's still the same. You cannot have multiple tasks accessing the same DAQ card for analog input. You need to combine the channels into one task with one scan rate and then pick out the data and distribute it to the different subsystems as needed.
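In C against the NI-DAQmx API the idea looks roughly like this (device, channels and rates are made up; error handling omitted for brevity):

#include <stdio.h>
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;
    float64    data[2 * 1000];   /* 2 channels x 1000 samples */
    int32      read = 0;

    DAQmxCreateTask("", &task);
    /* both "subsystems" get their channel added to the one and only AI task */
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0:1", "", DAQmx_Val_Cfg_Default, -10.0, 10.0, DAQmx_Val_Volts, NULL);
    /* one common scan rate for everything */
    DAQmxCfgSampClkTiming(task, NULL, 1000.0, DAQmx_Val_Rising, DAQmx_Val_ContSamps, 1000);
    DAQmxStartTask(task);

    DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel, data, 2 * 1000, &read, NULL);
    /* data[0..999] is ai0, data[1000..1999] is ai1: pick out and distribute as needed */
    printf("read %d samples per channel\n", read);

    DAQmxStopTask(task);
    DAQmxClearTask(task);
    return 0;
}
-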
Most likely that compile worker is a 32-bit application and you only have the 64-bit libgdiplus installed? Another possibility: the /usr/local/natinst/mono/lib64 directory was not properly added to the /etc/ld.so.conf file, and/or ldconfig was not run afterwards.