
Rolf Kalbermatter

Everything posted by Rolf Kalbermatter

  1. What have you tried so far? It should not be a big problem to use this with any of the Modbus libraries out there. I usually use the NI Modbus library, but I think the Plasmionique library should work too. You need to make sure to use the correct Modbus Unit ID, which Omega calls Modbus Address (Function 34) in its menu structure on the display. They need to match if you want to read any value (Read Holding Register). For writes you can also use a Unit ID of 0, which will address any connected Omega unit but will not return any indication of whether the write was successful. The default address of the Omega units should be 1 when delivered. Then you need to figure out which register address to read and/or write. The corresponding addresses are documented in chapter 19 of this manual.
  2. Indeed, and LabVIEW DSC is on the way out. There have been no updates to it in years, and support questions have been blissfully ignored for a long time. HiQ was acquired in the first half of the '90s and had the potential to compete with Mathematica and Matlab back then, but NI mainly used it to cannibalize some of its mathematical routines, adding some of them to LabVIEW, and then lost interest in it. Lookout was acquired around the same time. It was a very unique DSC software package with a rather interesting object-oriented architecture. NI used the low-level components to create its Logos network protocol infrastructure, on which things like DataSocket and later Shared Variables were implemented. They also used various components of Lookout to convert the originally purely LabVIEW-based BridgeVIEW system into LabVIEW DSC. After that, Lookout spent its existence as a rather badly cared-for stepchild in the NI product portfolio, and eventually nobody was left to support it anymore. LabWindows/CVI changed from yearly updates with new features added regularly to two-year updates with not much more than cosmetic bugfixes, and has been that way for several years already. It's worse in terms of new features added than LabVIEW ever was. But with LabVIEW they used the excuse that all effort was directed towards LabVIEW NXG and that LabVIEW was going to be replaced by it one day. NI Multisim used to be two products from the same company (Electronics Workbench), named Electronics Workbench and ULTIboard before they were acquired by NI, and at that time they were one of the leading EDA products in the educational market worldwide. Nowadays they are completely insignificant in the EDA market. 
If you have the money you will subscribe to Altium Designer; if you want something a bit cheaper you may use Autodesk Fusion 360, or if you are an old-time Eagle user then maybe Autodesk Eagle; and if you insist on open source then KiCad will very likely be your choice (it has made large strides since CERN decided to back it). Electronics Workbench (or NI Multisim) is not on that list, for sure. I have used it a few times, since it is part of our Alliance Member software lease, but it is not up to the task of creating modern PCB designs and hence not worth the effort of learning its many specific mechanisms and bugs.
  3. Open sourcing sounds very unlikely. And what I feel here is not that they will abandon it in the next few years just like that. There is also the question of their original assurance of a software escrow for LabVIEW, which NI used when trying to convince companies to use LabVIEW in the '90s. I've never heard about that since, so it may not have been much more than a smoke screen or an empty promise, or it may still be an actual fact. But if you look at what is happening with LabWindows/CVI now, it may be an indication of where they are heading with LabVIEW in the coming 10 years. And NI's track record with acquired software, and how they treated it, is not very encouraging either. Ever heard of HiQ, Lookout, BridgeVIEW, Electronics Workbench and a few others?
  4. No! An rtexe is not a real executable. It is more like a ZIP archive that cannot be started by itself but needs to be started by invoking the runtime engine and passing it the rtexe as a parameter. And the exact mechanism is fairly obscure, not well researched, and totally undocumented. Unless you are a Linux kernel hacker who knows how to investigate the run level initialization and how the LabVIEW rtexe mechanism is hooked in there.
  5. But it has existed since around LabVIEW 6.1. You had to add a funkyErrorWire=True or something similar to LabVIEW.ini.
  6. Actually, the color distinction of a wire is not specifically between a fixed-size cluster and a variable-sized cluster, or whether it can be typecast or not. Aside from adding a fixed-size cluster into another cluster, there is also the case where you add a boolean into a cluster. Both result in a pink wire but are still fixed size and can still be typecast. I think the brown wire only indicates that the cluster contains nothing but numeric scalars and can coerce into a similar cluster with the same number of numeric scalars, even if their numeric types are not exactly the same. Not really a very useful distinction, I believe, but 35 years back LabVIEW (and most other programming languages) wasn't as strictly formalized in many aspects.
  7. Unfortunately I do think there is a strategy behind this. In the past NI was a company centered around hardware sales in the form of computer plugin cards. The fact that they were a lot better at providing good and well-designed supporting software for that hardware was for years a distinguishing factor that made them grow, while the competition had a hard catch-up game to play and eventually all more or less faltered. The software in itself never really was the money maker; much of it was even given away free with the hardware and was considered a supporting expenditure financed with part of the profit from the hardware. When they had pretty much the whole market they possibly could get, they ran into the problem that there was very little growth left in this market for them. So they set out to find new markets and moved towards turnkey semiconductor testers that they could sell a dozen at a time to the big semiconductor manufacturers for big money. Suddenly those pesky DAQ cards and DAQ boxes were just a sideline, at best a supporting technology, and the accompanying software was becoming more and more a costly burden rather than a supporting technology. Nowadays there isn't one NI marketing; each division pretty much has its own marketing department and is also its own independent profit center. And then an independent software/LabVIEW division suddenly shows up mainly as an entry in the cost category that doesn't bring in as much as it costs. So they try to increase the income, but I think they missed that train already 15 years ago when they were refusing to consider other venues for LabVIEW. Nowadays the LabVIEW community is a marginalized small group of enthusiasts who are looked at by the rest of the industry as a bunch of crazies who can't let go of their pet IDE, while the rest of the world has moved on to Python and .Net and will move on to another hype in a few years. And the higher NI management is likely aware of that. 
While I do believe that the LabVIEW developers and the managers who directly work there really would like to make LabVIEW a good and flourishing product, I feel the higher management has already decided that this is a dead end and has very little intention of doing anything other than letting it bleed to death, so they can get rid of that liability.
  8. I would have to see the code in question. But generally I would think that you can call functions like LocalAlloc() with the return value configured as a pointer-sized integer, using a 64-bit integer to transport it on the diagram/front panel. Then, when passing it to an API, pass it again as a pointer-sized variable, and when putting it into a cluster, assign it to either the 32-bit integer or the 64-bit integer respectively. There should be no problem with this, since if you are on a 32-bit platform only the lower 32 bits will have been assigned by the Call Library Node, although there might be a sign extension happening if you configure it as a signed integer; but that should not be a problem either.
  9. I think I misunderstood, or you explained it somewhat poorly. There should be no reason to do any byte (word, int) swapping in this case, unless you use the Typecast function to typecast a properly sized byte array to or from the cluster. And that would actually not be the right thing to do: Typecast always (de)standardizes a bytestream to or from big-endian format. If you really want to pass a byte array to the function, you can instead use Flatten To String or Unflatten From String with the type set to native format; then there won't be any swapping done. Alternatively, I usually create two separate clusters, one with 32-bit ints and one with 64-bit ints for the pointers. Then, using Conditional Compile, I set up two separate calls to 32-bit and 64-bit variants of the Call Library Node and pass the corresponding cluster to it. And yes, it is a pain in the ass to do that for more than one or two call sites, so when there are many of them and/or other nasties that don't fit well with a direct LabVIEW interface, I tend to create an intermediate shared library that properly translates between LabVIEW-friendly clusters, handles and whatnot and the C-specific API datatypes directly in C.
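To illustrate the intermediate-wrapper approach, here is a hedged sketch; dev_info_t, dev_query() and dev_query_flat() are all hypothetical names, standing in for a C API whose pointer-bearing struct would be awkward to describe in a Call Library Node.

```c
#include <string.h>

/* Hypothetical C API: a struct containing pointers, awkward for a direct
   Call Library Node interface. */
typedef struct {
    const char *name;
    double     *values;
    int         count;
} dev_info_t;

static int dev_query(dev_info_t *info)      /* stand-in for the real API */
{
    static double v[2] = { 1.5, 2.5 };
    info->name = "demo";
    info->values = v;
    info->count = 2;
    return 0;
}

/* Wrapper with a flat, LabVIEW-friendly signature: the caller passes
   preallocated buffers, and no pointer ever crosses the boundary inside
   a struct. Each parameter maps directly to a string/array in LabVIEW. */
int dev_query_flat(char *name, int name_len,
                   double *values, int max_vals, int *count)
{
    dev_info_t info;
    int err = dev_query(&info);
    if (err) return err;
    strncpy(name, info.name, name_len - 1);
    name[name_len - 1] = '\0';
    *count = info.count < max_vals ? info.count : max_vals;
    memcpy(values, info.values, *count * sizeof(double));
    return 0;
}
```

Compiled into a small shared library, dev_query_flat() is callable from a single Call Library Node with no Conditional Compile and no pointer juggling on the diagram.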
  10. Actually, shareholders can be stakeholders too. A stakeholder is anyone who has a significant (usually directly or indirectly monetary or otherwise motivated) interest in something. 🙂 "Disruptive" actually sounds rather bad in my ears, but it seems to have acquired a different meaning in marketing nowadays. And "synergy" comes up every time NI acquires another company. It's one of the most used reasons why an acquisition is a good thing. You can't say, of course, that you simply buy up possible competition. 🙂
  11. Use "LabVIEW" (without quotes; case is important) as the Library Name in the Call Library Node. LabVIEW recognizes this as referring to the currently running kernel. Once you build an executable there is no LabVIEW.exe anymore but instead an lvrt.dll that exports these functions, and in fact that has been only a forward to the real mgcore_xx.dll for many versions.
  12. Would you care to elaborate on which pointers you mean? I can't right now think of a case where what you say would apply.
  13. No! What LabVIEW passes to the DLL in this case is the lower 32 bits of the 64-bit integer. LabVIEW does not know variable-sized numerics, for a number of reasons. So a pointer-sized integer parameter in the Call Library Node is ALWAYS transported as a 64-bit integer in LabVIEW. That is totally independent of the OS and LabVIEW bitness. The Call Library Node will generate the necessary code to pass the right part of the 64-bit integer to the actual parameter. But on the LabVIEW diagram (and in front panel controls) it is ALWAYS a 64-bit integer.
  14. It depends what you mean. The Call Library Node lets you configure pointer-sized integers to pass pointer-sized data to DLLs. On the diagram and front panel you always use a 64-bit (un)signed integer control. That should work for almost everything. The only case I can think of where you might need a control that changes size depending on the platform is if you want to pass clusters to a Call Library Node that contain pointers. But LabVIEW is a very strictly typed programming environment and doesn't know such a control. The only solution for that is the Conditional Compile structure, where you have to program the 32-bit and 64-bit versions in different frames and let LabVIEW choose the right one.
  15. That's about right. I mentioned the 10-year catch-up game NI has to do with LabVIEW. The full 64-bit compiler support in LabVIEW 2009 was one of the last architectural projects NI did with LabVIEW Classic. After that they did not really do much more than add some peripheral libraries, fix bugs and make minor improvements to core functionality of the LabVIEW kernel. Anything that structurally touched the LabVIEW core code base was deferred to LabVIEW NXG. Even the Map and Set datatypes in LabVIEW 2019, while a great addition, were for the most part a completely separate code base that had literally no impact on the LabVIEW kernel in terms of needing any significant changes to it. The problem NI had/has with much of the LabVIEW code base is that some of the most fundamental code modules come from a time about 30 years ago, when C++ was simply not an option at all, NI had to fight with symbol tables in the C compiler exceeding the compiler-provided memory space, and 8 MB of RAM in a computer was by many users considered an unacceptable minimum hardware requirement. This code is inherently C, littered with global variables that all had to be protected with mutexes when multithreading was introduced in LabVIEW 5. LabVIEW NXG was about getting rid of that entirely by replacing the entire UI system with something more modern. In hindsight I think it was overambitious, and NI underestimated the impedance mismatch between the .Net environment and the LabVIEW system and didn't anticipate the vocal pushback from non-Windows users about the fact that LabVIEW on top of .Net was likely never going to support anything other than Windows, despite their repeated claims that it COULD be made to work on other platforms. But COULD in lawyer speak usually means WON'T, and a lot of those non-Windows users are academics who know a thing or two about language semantics too.
  16. To be precise: NI uses whatever the locale on the machine is. For Windows this is typically the normal single-byte codepage for most Western installations and a multibyte codepage for some of the Asian languages. So LabVIEW can in principle deal with multibyte encodings. Linux has for many moons defaulted to UTF-8 as the locale for the desktop in most standard distributions. UTF-8 being in fact another multibyte code standard, this means that LabVIEW on Linux by default uses UTF-8, by virtue of the pretty much completely transparent text handling in the OS standard C library. But there are areas in the LabVIEW code that have traditionally made certain hardwired assumptions that do not work well with multibyte character sets, so you are bound to see weird effects sometimes. All that said, Windows has supported a Beta feature since about Windows 10 that allows setting the locale to UTF-8 instead of one of the standard codepages, but it is not for nothing that it is still Beta. For some reason Microsoft did not feel it warranted the necessary effort to iron out certain problems that still exist, or it may even be pretty much impossible without more or less completely rewriting the Windows text APIs, due to some deeply ingrained code assumptions. LabVIEW certainly is not without blame either. The code dealing with text handling was mainly written in the early '90s, at a time when Windows only had codepages and Unicode UTF-16 support was still an optional feature that you had to install explicitly, working through a completely separate API rather than trying to shove a multibyte encoding based on the Unicode standard under the traditional text management. 
As such, while there might be certain code sections under Linux that deal with the UTF-8 encoding more appropriately, this need hadn't arisen on Windows before NI decided to embark on the NXG adventure and pretty much deferred all improvements of existing code features in LabVIEW Classic to be solved in NXG. With NXG gone, they basically have a 10-year catch-up game to play.
  17. While Microsoft managed C# and .Net in a better way than NI did with LabVIEW, I'm still not quite sure if I should sell my soul to them! 🙂
  18. Is it that simple? If you leave a gun you are licensed to own lying around openly and someone else uses it to kill someone, you are still at fault too and can be prosecuted for that; not usually for the murder itself, but there are exceptions too. It will also probably depend a bit on what sort of lawyer you can afford! Even if a judge eventually decides that your friend is the wrongdoer and therefore liable, you might still have to deal with the hassle as they come after you to get the necessary facts to charge that friend. "I didn't know he was going to use that test program to test the new gadget he wanted to sell!" sounds very weak in any case!
  19. If that person owns a commercial LabVIEW version at the time she/he uses your toolkit for their commercial work, all is fine. If they don't, it is they who are in violation. The example I actually had in mind is different: if you use the LabVIEW Community Edition to develop a test program for a friend, who then uses it to test a product he sells in any form, then you (more correctly he, but you could be held liable) would be in violation, unless that friend has a commercial license to run and/or build the test program.
  20. Yes, it is virtually unthinkable that you could create a YouTube channel based on LabVIEW tutorials that would bring you million-dollar earnings. It's as such more of a legal thought exercise than anything else. But the overall idea of the Community Edition license is: if you or someone you know makes money from your use of the Community Edition, you are in violation of the license! If that turns out to be a free meal for you, or some other such peanuts, NI would never even bother to start up their legal engine for it, but if it gets significant you will be in very hot water. A cease and desist letter is cheap, but can be the end of a small business in the blink of an eye.
  21. Don't get me started on JavaScript. Maybe if you absolutely want to program a Discord bot for some reason, you can't really get around it. And if you want to botch your HTML documents you can use it too. But other than that, JavaScript uses the syntax of Java with the logic that you had with Basic. Not even one single provision to try to prevent the programmer from writing absolutely bad code, just like what we had with Basic. The Community Edition is technically already a subscription model, just with a subscription fee of $0 and the limitation that you are not allowed to use it for anything that earns you money. It expires at the end of the year and you have to reactivate it. If you use it to keep your LabVIEW knowledge up to date, that is fine; if you create open source, free libraries to distribute, that is fine too. If you automate your doorbell, weather station or model train layout at home, that is also allowed. But creating a toolkit that you sell, or teaching a course for which you get paid, is not a valid use case for the Community Edition. The same goes for writing a test application for your or someone else's company that tests products that are then sold. An interesting corner case would be creating tutorials for your YouTube channel. If that YouTube channel ends up being watched a lot and earning you more than a few (as in some two-digit number; just a number I came up with, but I'm sure it is about what NI would consider still valid) bucks per year, you are violating the Community Edition license conditions.
  22. I agree with all you said, but I don't see Python as a viable competitor to LabVIEW. It's getting ubiquitous in many ways, being on every IoT device nowadays and having libraries for just about anything you can think of, even more than Java ever had. But it's still interpreted (yes, I know about JIT, but that is beside the point), and I really hate doing UI programming in scripts; otherwise I might have stayed with LabWindows/CVI. And while crunching large data sets in LabVIEW can be challenging due to its dataflow-forced data copies, trying to do the same in Python generally causes me to fall asleep or check whether the computer hasn't frozen up. And it can be a challenge to find the pearls among the many Python libraries. A lot of what is posted out there is mediocre and shows that the person who wrote it didn't even understand what they were doing, but for some miraculous reason managed to get to a point where it did something. If anything, I see Python mainly as competition for .Net. While not the same in many ways, they are much closer to each other than either of them is to LabVIEW. The advantage (and disadvantage) of .Net is that Microsoft is behind it. They are powerful enough to make or break a platform, but at the same time they do not want to lose control of it, which has caused some strange acrobatic decisions in the past, such as announcing they would open source it but not quite doing it anyway.
  23. Yes, built applications will continue to work like they did so far. This means that you can install the LabVIEW runtime and your executable on any computer you want and run it. Any runtime licenses, such as for IMAQ Vision etc., will still be per-machine licenses and not leases, just as they are now. The Realtime and FPGA Modules will also be a software lease, but you only need them to write and build the compiled code; after that you can run it on any licensed hardware just like you do now. Deployment will of course only work from a licensed LabVIEW development system with an activated Realtime Module, but once it is deployed, the target can run for as long as you want without limitations from a software lease. If they ever change that, that will be the day our company definitely stops using LabVIEW. We cannot have a machine stop execution because the NI license server takes a vacation or some software lease has expired. The Debug Deployment licenses of the different NI software products will also stay perpetual licenses. They are in principle the same as the development system, but you only have the right to use them to debug an application, not to write and build one. They are meant for the computers on the factory floor where you want to run your built application but may need to be able to also execute the VIs in source code, to debug problems on the real hardware instead of on your development machine.
  24. You still have the problem of the embedded variant in your datatype. JSON is inherently a string format with some indications to distinguish between numbers and strings, but nothing more. A variant is an AnyType. What you try to do is: convert this string (which from the JSON syntax can only be inferred to be a numeric, object, list or string) into Anything! Anything what? The JSON library could elect to convert strings into a string, numerics into a double and anything else into nothing, but that will break down as soon as you want to use that variant as the specific type you pushed into the incoming variant in the beginning. The JSON string simply lacks all information about the actually wanted type in the variant, and therefore you cannot convert it into a variant without losing significant compatibility and functionality. This is a fundamental problem that cannot be solved by throwing more code at it, but only by rethinking the way you want to save your configuration data.
  25. Me neither, and I can think of no use case at all for something like that. LabVIEW <-> Java bindings are not straightforward, and I cannot imagine anyone wanting to make that effort for something like log4j, or even a more complex Java module somehow using log4j.