Everything posted by Rolf Kalbermatter
-
Now that's not just dreaming but fantasizing. 😀 Developing a new LabVIEW target, even if it could reuse most of the code for Windows (which I'm not sure would be the case here), is a major investment. Why would they want to do that when they already have LabVIEW NI Linux RT? And the main reason there is no LabVIEW NI Linux RT for Desktop solution is that NI hasn't figured out (or possibly bothered to figure out) how to handle proper licensing. They do not want to enable some Chinese copycat shop to develop its own RIO hardware and sell it with the message "Just go and buy this NI software license and you are all legal". That such hardware can be developed has been proven in the past. It basically died (at least around here; maybe they sell it by the tens of thousands per month inside China) because such a licensing option did not exist, and anyone using that hardware was not just operating in a grey area but in fully forbidden territory, with lots of legal mines in the ground.
-
No, but without a legal license for the NI LabVIEW Real-Time runtime and all the various other NI software, it's a complete no-go for anything other than your home automation project.
-
Looks interesting. It reminds me a little of the layout principle that Java GUIs use. It takes a bit of getting used to when you are familiar with systems like the LabVIEW GUI, where you put everything at absolute (or relatively absolute) positions. And as Mikael says, those class constants make this all look unnecessarily bulky. They are in fact unnecessary if you build your class instantiation properly. The class instantiation methods should not need to be dynamic dispatch, and then you can make that input optional or possibly even omit it. The control data type of your Create method already defines the class without any need for a class constant.
-
If performance is of no concern to you AND you don't care about slick GUIs for your apps, Python is a good option nowadays. How long it will be before there is a new kid on the block that everybody caters to while declaring everything else as yesterday's news is of course always the big question. Python, with its dynamic type system, is both easy to program in and easy to botch a program with. And such flexibility has a performance cost too 😀, as every variable access needs to be verified and classified before the interpreter can apply the correct operation to it. And while you can pack large datasets into special containers to stay performant, the default Python array object (the list) is anything but ideal for big data crunching. Other than that, I think I would stick to C/C++ for most things if I were to abandon LabVIEW.
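To put a number on that last point, a quick Python sketch (the dataset size is arbitrary, and the timings will of course vary per machine): summing the same data as a plain list versus as a NumPy array. The list forces the interpreter to type-check every element; the array dispatches one typed C loop.

```python
import time
import numpy as np

data_list = list(range(10_000_000))
data_array = np.arange(10_000_000, dtype=np.float64)

t0 = time.perf_counter()
total = sum(data_list)          # per-element type dispatch in the interpreter
t1 = time.perf_counter()
total_np = data_array.sum()     # a single vectorized C call
t2 = time.perf_counter()

print(f"list sum:  {t1 - t0:.3f} s")
print(f"numpy sum: {t2 - t1:.3f} s")  # typically an order of magnitude faster
```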
-
As far as Phar Lap ETS goes, you are definitely right. Phar Lap is a dead end and has in fact been one for almost 10 years. I'm sure it was one of the driving reasons for NI to push for NI Linux RT. It's no coincidence that it got released about one year after IntervalZero declared Phar Lap ETS EOL.

Whether you want to abandon LabVIEW RT altogether is of course a different question. You can certainly build your own Linux RT kernel and implement your own automation software platform. I would expect such a project to take quite a few man-years, to never be as integrated as LabVIEW, and to be a continuous catch-up race of adapting to new hardware, since what was the hottest craze last year has already been declared obsolete today. It's a business model you could probably earn some money with, IF you intend to concentrate on doing just that. If it is meant to support your own automation projects, it will be a continuous struggle to keep it up and running, and you will likely always be scratching at the corner of barely getting it to run each time, with the plan in the back of your head to finally make it a real, released product you can rely on, which will never quite happen.

IMAQdx for standard Linux is likely never going to happen. NI is not really selling image acquisition hardware anymore. They have IMAQdx and IMAQ Vision for their Linux RT targets, so that's pretty much all you will likely ever get. There seems to have been a plan for standard Linux hardware drivers for both DAQmx and IMAQdx, but with the current situation I'm definitely not holding my breath for it. Hardware drivers for Linux are also a very tricky business: you either open source them completely and get them into the mainline kernel, or you are constantly fighting the Linux gods and the distribution maintainers to keep your drivers running with the latest software, and you will never be able to make them work on even 50% of the installed Linux distributions out there.

Add to that the fact that outside of servers (where automation software with DAQ and IMAQ is extremely seldom of any interest) the Linux market share is pretty small. And the majority of those who do use Linux outside of server environments are more likely to tinker for months with YouTube videos explaining often more than questionable hacks to achieve something than to pay money for hardware and software beyond their PC rig with RGB LEDs and water cooling. For a company like NI that simply makes this market VERY uninteresting, as there is not much money to be earned but the costs are enormous.
-
I believe this setting should have been switched off by default and still should be. Too many problems with it. If someone wants this feature (and is willing to test their application with it thoroughly on several different computers), they can always switch it on. I don't think it is currently worth the hassle.
-
No, there isn't such a driver (that I would know of and that is openly available). There might be one in some closed-source environment for a particular project (not LabVIEW related; Phar Lap ETS was used in various embedded devices such as robot controllers, etc.), but nothing that is official. There are several problems with supporting this:

1) FTDI uses a proprietary protocol, not the official USB-CDC class profile, for their devices, and they have not documented it publicly. You can only get it under NDA.

2) Phar Lap ETS is a special system and requires special drivers written in C, and you need the Phar Lap ETS SDK for this. This WAS a very expensive software development suite. WAS, because IntervalZero discontinued Phar Lap ETS around 2012 and since then only sells licenses to existing customers with an active support contract, but doesn't accept new customers for it.

Now, there is an unofficial (from FTDI's point of view) Linux open source driver for the standard FTDI devices (not every chip provides exactly the same FT232 protocol interface, but almost all of the chips that predominantly support the RS-232, 422, or 485 modes do have the same interface), and I have in the past spent some time researching it and trying to implement it on top of VISA USB RAW. But with the advent of Windows 7 and its requirement to use signed drivers even for a pure INF-style driver like the VISA USB RAW driver, this became pretty much useless. The signing problem doesn't exist on the Phar Lap ETS system, but development and debugging there are very impractical, so when IntervalZero announced the demise of Phar Lap ETS, I considered this project dead too. There was no easy platform to develop the code on, and no useful target where it could still be helpful. All major OSes support both USB-CDC and USB-FTDI devices pretty much out of the box nowadays. This includes the NI cRIOs based on NI Linux RT. The only two beasts that won't are the Phar Lap ETS and VxWorks based NI real-time targets, both of which have been in legacy state for years and are getting sacked for good this year or next. So while it would be theoretically possible to write such a driver on top of NI-VISA, the effort is quite considerable and it's low-level tinkering for sure. The cost would be enormous for just the few last Mohicans who still want to use it on an old and obsolete Phar Lap ETS or VxWorks cRIO controller.

As to whether there is a device that can convert your USB-FTDI device back into a real RS-232 device: devices based on the FTDI VNC1L-1A chip can implement this, here is an example as a daughter board. You would have to create a carrier with an additional TTL-to-RS-232 converter and the according DB-9 connector, or, if you are already busy building a PCB anyhow, just integrate the VNC1L-1A chip directly on it. The most "easy" and cheap solution would be to use something like a Raspberry Pi. That can definitely talk to your FTDI device, with minor tinkering in some Linux boot script in the worst case. Then you need an application on the Raspberry Pi that connects to that virtual COM port and acts as a proxy between it and an RS-232 port (or a TCP/IP socket) on the Pi, which you can then connect to from your LabVIEW Phar Lap ETS program.
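Just to illustrate that Raspberry Pi proxy idea, a minimal Python sketch using pyserial (the device path /dev/ttyUSB0, the baud rate, and TCP port 9100 are assumptions you would adapt): it shovels bytes between the FTDI virtual COM port and a TCP socket that the Phar Lap ETS program can open with the plain LabVIEW TCP VIs.

```python
import socket
import threading
import serial  # pyserial

ser = serial.Serial('/dev/ttyUSB0', 9600, timeout=0.1)  # FTDI virtual COM port
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(('', 9100))
srv.listen(1)
conn, addr = srv.accept()           # wait for the LabVIEW side to connect

def serial_to_tcp():
    while True:
        data = ser.read(256)        # returns b'' on timeout
        if data:
            conn.sendall(data)

threading.Thread(target=serial_to_tcp, daemon=True).start()
while True:                         # TCP -> serial in the main thread
    data = conn.recv(256)
    if not data:
        break                       # client disconnected
    ser.write(data)
```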
-
python and LabVIEW combination work
Rolf Kalbermatter replied to kpaladiya's topic in LabVIEW Community Edition
Correct about the LabVIEW chroot only being an issue on the LINX targets. But several IPC mechanisms will actually work even across chroot jail borders, such as good ol' TCP/IP (usually my IPC of choice, as it works on the same machine, across VMs, on different machines on the same subnet, and with some proper setup even across the world). Even shared memory could be made to work across the chroot border, but it's complicated, sensitive to correct configuration, and shared memory is difficult to interface with in any case.
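As a sketch of how simple the TCP/IP route is on the non-LabVIEW side (the port number is an arbitrary assumption): a Python echo server on localhost that the LabVIEW side, inside or outside the chroot, reaches with the standard TCP Open/Write/Read VIs.

```python
import socket

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(('127.0.0.1', 6340))   # arbitrary port for this example
    srv.listen(1)
    conn, _ = srv.accept()          # LabVIEW's TCP Open Connection lands here
    with conn:
        while True:
            data = conn.recv(4096)
            if not data:
                break
            conn.sendall(data)      # echo back; replace with real processing
```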
-
-
look for MKS pressure transducer readout PDR 2000 driver
Rolf Kalbermatter replied to fanyf06's topic in Hardware
Sounds like a masterpiece of engineering. Use a non-standard connector, or rather, when they use a DB9 on the other device, wire its pins differently from the standard, and then use yet another pinout for every new device. Someone must think selling custom cables is the way to earn lots of money! 😀
-
Would it be a lot to ask for this library to support C++ style comments? Basically, any text after // would simply be ignored until the EOL. I know that JSON strictly speaking doesn't support comments, but JSON5, for instance, does support these C++ style comments.
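Just to be clear about what I mean, a rough Python sketch of the idea (not the library's actual implementation, which would presumably do this during tokenizing): drop everything from // to the end of the line, but only outside string literals, before the strict parser sees the text.

```python
import json

def strip_line_comments(text: str) -> str:
    out, in_string, escaped, i = [], False, False, 0
    while i < len(text):
        c = text[i]
        if in_string:
            out.append(c)
            if escaped:
                escaped = False       # the escaped character itself
            elif c == '\\':
                escaped = True
            elif c == '"':
                in_string = False     # closing quote
        elif c == '"':
            in_string = True
            out.append(c)
        elif c == '/' and text[i:i + 2] == '//':
            while i < len(text) and text[i] != '\n':
                i += 1                # skip comment; keep the newline
            continue
        else:
            out.append(c)
        i += 1
    return ''.join(out)

print(json.loads(strip_line_comments('{"a": 1 // ignored until EOL\n}')))
```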
-
look for MKS pressure transducer readout PDR 2000 driver
Rolf Kalbermatter replied to fanyf06's topic in Hardware
According to the link mentioned in the post before yours, you got that wrong. Pins 7, 8, and 9 are the serial port signals on the DB9 connector. The DB15 connector has them on pins 1, 2, and 4!
-
Note that LabVIEW does not officially support Windows Server OSes. I believe it will generally work, but ActiveX is definitely one of the areas where I have my suspicions. Windows Server does quite a bit with security policies to lock the computer down, and ActiveX may very well be one area that is very much affected by this. Have you checked with MathWorks whether they fully support Windows Server editions? In any case, an ActiveX component needs to be installed, and that also means modifications to the registry; the installer can choose whether to make the component available system-wide or just for the current user. Your component may be such a case, or Windows Server may enforce ActiveX registration on a per-user basis, or maybe even disallow ActiveX installation into the Admin account without some special command line switch. We don't know, and we don't generally use Windows Server. Such systems are usually maintained by IT staff, who tend to be very unhappy about anyone wanting to install anything on "their" systems, so it basically never happens.
-
And how do you suppose your client, who wants to use this library on a cRIO-9064 (just to name one of the currently still possible targets, which include Windows 32-bit, Windows 64-bit, Phar Lap ETS, VxWorks, Linux 64-bit, and macOS 64-bit), should recompile it without having access to the key? Sure, with asymmetric encryption you don't have to publish your private key, but then you are still in the same boat! If you want to give a client the possibility to recompile your encrypted VIs (in order not to have to create a precompiled VI for each platform and version your clients may want to use), LabVIEW somehow has to be able to access the key on their machine. And if LabVIEW can, someone with enough determination can too. Sure enough, nowadays LabVIEW could be turned into LabVIEW 365: you could host your code on your own compile server and only sell a VI reference to your clients. Any time a client wants to recompile your VI, it contacts your compile server, furnishes the license number and the desired compile target, and everything is easy peasy, unless of course your compile server has a blackout, your internet provider has a service interruption, or you go out of business. All very "nice advantages" of software as a service, rather than a real physical copy.
-
This very much depends on the font used. Not all fonts specify glyphs for all these attributes. If a font doesn't, the attribute request is simply ignored and the font is drawn in its basic form. LabVIEW doesn't draw these texts itself; it just passes the request to the Windows GDI (or X Windows on Unix, or Quartz on macOS) and tells it to do its work.
-
Consider the diagram password equivalent to your door lock. Does it prevent a burglar from entering your home if he has really, absolutely set his mind on doing so? Of course not! Is it a clear indication to the normal law-abiding citizen not to enter? You bet!

There is no simple way to protect a diagram that the compiler needs to be able to read (in order to recompile the program for a different version, platform, or whatever) without also leaving a fairly easy way for a person who truly wants to peek into it. In fact, there are many ways to circumvent the protection. You could patch the executable to skip the password check when trying to open a diagram, or you could locate the password hash and reverse its algorithm to get back at the password. The only catch is that this is an MD5 hash, so it is not a simply reversible algorithm. But MD5 is not a secure hash anymore: with enough CPU power you can find a string that results in the specific hash (it does not necessarily have to be the original password, since multiple arbitrary-sized character sequences will eventually map to one single hash code). It may take a few days and will certainly contribute to global warming 😀, but it can absolutely be done. Chances are that that CPU power would be more productive, in terms of money, when directed at mining cryptocurrency, even with the current dive in value. Another approach is to simply replace the password hash in the file with the hash for the empty password (which means unprotected). It's not as simple as replacing 16 bytes of data in the file with a standard byte sequence, since that hash is also computed over some of the binary data of the VI file, but it's also not impossible.

Why didn't they make it more secure? The only way to do that would be to truly encrypt the diagram, but then you also need the password to be able to recompile the code. In that case you can just as well remove the diagram when distributing the VIs, as the encrypted diagram has no real additional value anymore, except that you as the password owner don't have to maintain two versions: one without a diagram to give to your users, and one with it for your maintenance and further development. You would end up with the problem of having to distribute your VIs for every LabVIEW platform you want to support, or handing out the password to your users so they can compile for a different platform or version.

Basically, to come back to our door lock: the security you are expecting would be pretty much like replacing all the windows and doors in your house with a concrete wall and leaving only one single door, heavily steel-reinforced, with quadruple high-security locks and a self-shooting installation in case the wrong code is entered more than 3 times. Very secure, but absolutely not comfortable or practical, and potentially lethal to your family members and friends.
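To make the "replace the hash with the empty-password hash" point a bit more concrete, a toy Python model (the real check also mixes some of the VI's binary data into the hash, as said above, which is omitted here):

```python
import hashlib

stored_hash = hashlib.md5(b'secret').digest()   # the 16 bytes the file would hold

def check_password(candidate: bytes) -> bool:
    # The verifier only compares digests; it never needs the password itself.
    return hashlib.md5(candidate).digest() == stored_hash

print(check_password(b'secret'))   # True
print(check_password(b''))         # False; but an attacker who can edit the
                                   # file just overwrites stored_hash with
                                   # md5(b'') and the check passes for "".
```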
-
As far as whether NI could do that legally: yes, they could. But do they want to? Why should they? Are you releasing all your source code as open source? And this is really meant seriously: propose a meaningful business case for why NI should release that source code, as if you were a manager at NI! And don't forget all the technical support questions from newbies trying to peek into that code, thinking they can fix something, and then boom, everything goes haywire. There is no business case in the current NI software model for this, unless NI decides to go the .Net Core way with their software, which I don't quite see happening yet. Open sourcing components that are not just nice-to-have libraries that you can more or less throw out into the wild according to the motto "Take it as is, improve it on your own, or leave it!" only causes additional work and issues without solving anything.
-
If you really want to spend a lot of time and effort on this, the more promising way would be to simply start developing an Advanced Analysis Library replacement directly around the Intel Math Kernel Library. The DLL interface you are talking about is mostly a historical burden, carried over from when NI was still developing their own Advanced Analysis Library. Around the 7.x days they decided that they could never compete with the girls and guys who probably know best on this planet how to tickle a few percent more performance out of Intel CPUs, because they work at the company that developed those CPUs, and so NI replaced the in-house developed AAL math library with the Intel Math Kernel Library. The DLL interface was left intact so that little had to be changed in the VIs that make up the AAL, but internally most functions in that DLL call more or less directly into the Intel MKL. Spending lots of time documenting that proxy wrapper DLL is not very useful; there are very few secrets in there that you couldn't learn by reading the official documentation for the Intel MKL.
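For illustration, going directly at the MKL is not even hard from any language that can call C. A hedged Python ctypes sketch calling the CBLAS dot product (the single-runtime library name differs per platform and installation, 'mkl_rt.dll' on Windows, 'libmkl_rt.so' on Linux, so treat it as an assumption):

```python
import ctypes

mkl = ctypes.CDLL('mkl_rt.dll')   # adjust name/path to your MKL installation
mkl.cblas_ddot.restype = ctypes.c_double
mkl.cblas_ddot.argtypes = [ctypes.c_int,
                           ctypes.POINTER(ctypes.c_double), ctypes.c_int,
                           ctypes.POINTER(ctypes.c_double), ctypes.c_int]

x = (ctypes.c_double * 3)(1.0, 2.0, 3.0)
y = (ctypes.c_double * 3)(4.0, 5.0, 6.0)
print(mkl.cblas_ddot(3, x, 1, y, 1))  # dot product: 32.0
```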
-
Why?
-
No, and there likely never will be! NI has a very strong tendency to never talk about what might or might not be, unless the actual release of something has pretty much been announced officially. In the old days of regional field sales offices, you could sometimes get some more details during a private business lunch with the field sales office manager (usually with the request to please not share them with anyone else, since they weren't yet official and in fact might never really materialize in that exact way).

The only thing we know is that the LabVIEW source code was at some point in escrow, and supposedly still is, for the case that NI might end up belly up. How much that really means, other than to soothe managers who were worried about choosing a single-vendor software solution, I can't really say. It certainly doesn't mean that the LabVIEW source code would automatically become open source if NI folds or decides to shut down LabVIEW development. Selling the rights to some other party who declares its intention to somehow continue development is almost certainly enough to fulfill all obligations of such an escrow agreement.

As Brian (Hooovahh) already mentioned, open sourcing LabVIEW is not an easy task. Aside from old legacy "crimes" committed in the early days of LabVIEW development (often technically unavoidable, since there simply weren't better technologies available, LabVIEW was pushing the economically available hardware to its limits, and/or LabVIEW's source code requirements were pushing the limits of what C compilers could do back then), there are also things that are likely simply not open-sourceable for several legal reasons. Reworking the source code to be able to publish it as open source would cost significant effort. And who is going to foot that bill after NI is gone or has decided to stop LabVIEW development? For instance, I'm pretty sure that almost nothing of the FPGA system would be legally open-sourceable, since it is probably encumbered with several NDAs between NI and Xilinx. And there are likely several other parts that are limited in similar ways under the hood. Even just the effort to investigate what could and couldn't be open sourced is going to cost some serious legal and engineering manpower, and with that also real money. Then someone has to remove anything identified as impossible or undesirable to open source and clean up the mess after that. This in itself would likely be a serious project. And then what? Who is going to maintain a project like that? .Net Core is only really successful because Microsoft puts its might behind it. Without Microsoft's active supporting role it would still be Mono, hopelessly trying to catch up with whatever new thing Microsoft comes up with.
-
It very much depends on how far back you dare to look in that graph! 😀 Since it is a NASDAQ share, you should rather go to the source, Luke! That steep climb around 2017 actually comes after they started implementing those changes. The decline you see mostly happened during Covid, and as a trend it is not very significant yet. That all said, the current trend to measure everything in share price is a hype that is going to bring us the next big crash before too long. My guess is that once most people have crawled out of their Covid-imposed isolation in their private holes, they will look at the financial markets and wonder where the actual real-world value is that some of the hyped companies would need to have to make their share price expectations even remotely true. And then the big awakening happens, when the first people start to yell "but he isn't wearing any clothes".
-
I don't have those numbers. What I know is that a few years ago, NI noticed that their sales figures were starting to flatten. For a company used to high two-digit growth numbers year after year, this is of course a very alarming signal. 😀 They hired some consultants who came to the conclusion that the traditional T&M market NI was operating in simply didn't have much more air left in it to continue to support the growth NI had gotten used to. And strategic decisions were made behind the scenes. Not too much of that has been openly communicated yet, but the effects have been quite obvious in the past few years. NI has deprioritized the PC-based test and measurement hardware, completely abandoned any motion ambitions, marginalized their vision ambitions, and put much of the traditional DAQ hardware into legacy mode. And their whole sales organization has been completely revamped: no field sales offices anymore, and highly centralized technical support from typical call-center-style, semi-outsourced places. Behind the scenes they do large-scale business and have increased their sales further since that alarming consultancy report, so somehow it seems to work.

One reason they have not been very public about these changes is probably that they changed their old model of relying very heavily on external Alliance Members for the actual application support of all their customers. In a few strategic industries they have now moved in to deliver full turnkey systems themselves, directly to the customer. For the typical Alliance Member that probably doesn't directly mean a loss of business, since the customers and projects NI serves in this way are accounts that only very few Alliance Members would dare to even consider looking at, as the volume of the business transactions is simply huge. However, it certainly has other effects for all Alliance Members. Contact with NI has become very indirect, with all the regional sales offices having vanished, and NI's efforts to compensate for that by other means haven't gotten much further than marketing presentations with lots of nice talk up to this point.

As to LabVIEW: its demise has been prophesied since it was first presented. First because it was clearly just a toy that no engineer could ever take seriously, then because NI didn't push for an international standard to formalize the LabVIEW G language, and later because they didn't want to open source it. I'm not sure any of these things would have made a significant difference in either the positive or negative direction. It's clear that LabVIEW is nowadays a small division inside NI that may or may not find enough funding from the powers that be to maintain steady development. If you look at NI's software track record, it doesn't look too good. They bought many companies and products, such as HiQ, Lookout from Georgetown Systems, DASYLab, and quite a few more, and none of them really exists nowadays. Of course, some of those purchases, such as DASYLab, were in fact simply buying out the competition, and it was clear from the start that the product did not have a very bright future in the NI stable. Lookout was marketed for quite some time, and a lot of its technology was integrated into LabVIEW (LabVIEW DSC was built directly on top of much of Lookout's low-level technology, which was also the basis for the low-level protocols used in Shared Variables and similar things). LabWindows/CVI has been lingering in a semi-stasis existence for several years already. Its development can't quite keep pace with the main contenders in the market, GCC and Visual Studio. In view of this, acquisitions like Digilent and MCC may look a bit surprising. On the other hand, it might be a path to something like an HP/Agilent/Keysight diversification: NI itself moves into the big turnkey semiconductor testing business (and the EV testing market), and one of these other companies takes over the PC-based LabVIEW, DAQ, and instrument control business.
-
That's always debatable. From a technical point of view I fully agree with you. LabVIEW is a very interesting tool that could do many more things if it had been managed differently (and also a few less than it does nowadays). For instance, I very much doubt it would ever have gotten to the technical level it is at nowadays if a non-profit organization had been behind it. The community for LabVIEW is simply too diverse. The few highly skilled people in the LabVIEW world with a very strong software engineering background, who could drive development of such a project in an open source model, do not reach the critical mass to sustain its continued improvement. On the other end of the scale, you have a huge group who want to use LabVIEW because there is "no programming involved", to parody some NI marketing speak a bit.

Maybe, just maybe, an organization like CERN could have stepped in, just as happened with KiCad. KiCad lingered for a long time as a geeky open source project with great people working on it in the typical chaotic open source way. Only when an organization like CERN put its might behind it did the project slowly move in a direction where it could actually start to compete on features and stability with other packages like Eagle PCB. It also brought in some focus. CERN is (or at least has been) quite a big user of LabVIEW, so it could have happened. CAD development moved on in the meantime too, and while KiCad nowadays beats every CAD package that was out there 20 years ago hands down, the current commercial CAD platforms offer a level of integration and highly specialized engineering tools that requires a lot of manual work when attempted in KiCad. Still, you can design very complex PCBs in KiCad nowadays that would have been simply impossible to do in any CAD package 20 years ago, no matter how much money you could have thrown at it back then.

But LabVIEW almost certainly would not cross-compile to FPGA nowadays, and there would be no cRIO hardware and similar things to which it almost seamlessly compiles, if it had not been for NI. On the other hand, LabVIEW might actually be a common teaching course at schools, much like Python is nowadays on the ubiquitous Arduino hardware, if NI had decided to embrace LabVIEW as a truly open platform. The reality is that we do live in a capitalistic system, and yearly earnings are one of the most highly valued indicators of success or failure for every product and company. Could LabVIEW have been, and be, managed differently? Of course! Could it have survived and sustained a steady and successful development that way? Maybe!
-
There is a standard digital signal available in the FPGA environment that allows resetting the device; you can assert this pin from your FPGA program. So one way would be to add a small loop to your FPGA program that polls the external digital input (preferably with some filtering to avoid spurious resets) and then feeds that signal to the FPGA Reset boolean signal.
-
NI didn't say they would be porting NXG features to 2021, but to future versions of LabVIEW. Technically, a promise for 2021 would have been unfulfillable, since at the time the demise of NXG was announced, LabVIEW 2021 was basically in a state where anything that was to be included had to be more or less fully finished and tested. A release of a product like LabVIEW is not like your typical LabVIEW project, where you might make last-minute changes to the program while testing your application at the customer's site. For a software package like LabVIEW, there is a complete code freeze except for breaking bug fixes; then there is a testing, packaging, and testing-again cycle for the beta release, which alone typically takes a month or two; then the beta phase of about 3 to 4 months; and finally the release. So about 6 months before the projected release date, anything that is not considered ready for prime time is simply not included in the product, or is sometimes hidden behind an undocumented INI file setting. Considering that, the expectation to see any significant NXG features in LabVIEW 2021 was simply naive and irrational.

I agree with you that LabVIEW is a unique programming environment with some features that are simply unmatched by anything else. And there are areas where its age is clearly showing, such as the lack of proper Unicode support and, related to that, the lack of support for long path names. Personally, I feel I could tackle the lower-level part of full Unicode support in LabVIEW, including full Unicode path support, quite easily if I were part of the development team, but I have to admit that the higher-level integration into front panels and various interfaces is a very daunting task that I have no idea how I would solve. Still, reworking the lower-level string and path management in LabVIEW to fully support Unicode would be a first and fundamental step toward the later task of making this available in the UI. This low-level manager can exist in LabVIEW even if the UI and higher-level parts don't yet make use of it; the opposite is not possible. That is just one of many things that need serious investment to make the whole LabVIEW platform viable again for further development into the future.

This example also shows that some of the work needed to port NXG features back to LabVIEW first requires significant effort that will not be immediately visible in a new LabVIEW version. While a change as described above is definitely possible within a few months, the whole task of making all of LabVIEW fully Unicode capable without breaking fundamental backwards compatibility is definitely something that will take more than one LabVIEW version to fully materialize. There are a few lower-hanging fruits that could help prepare for it and should have been done years ago already, but they were discarded as "being already fixed in NXG". The full functionality just for complete Unicode support in LabVIEW is going to be a herculean task to pull off without going down the NXG path of reinventing LabVIEW from scratch (which eventually proved to be an unreachable feat).

My personal feelings about the future of LabVIEW are mixed. Not so much because LabVIEW couldn't have a future, but because of the path NI as a company is heading down. They have changed considerably over the last few years, from an engineering-driven to a management-driven company. While in the past engineers had some real say in what NI was going to do, nowadays it's mostly managers who see Excel charts, sales numbers, and the stock market as the main decision-making inputs for NI. Everything else is subordinated to the bigger picture of a guaranteed minimum yearly growth percentage and stock price. The traditional test & measurement market NI has served for much of its existence can't support those growth numbers anymore, so they are making heavy inroads into different markets and seem to consider the traditional T&M market by now a legacy rather than a significant contributor to their bottom line.
-
Can Queues be accessed through CIN?
Rolf Kalbermatter replied to Taylorh140's topic in Calling External Code
Well, ultimately everything LabVIEW does is written in C(++). Some of it (a very small part) is exported to be accessible from external code. Most of it goes a very different and more direct way to the actual C functions. Functions don't need to be exported from LabVIEW in order to be callable by built-in nodes; that all happens much more directly than through a (platform-dependent) export table.
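A hedged sketch of that "very small exported part" (the runtime library name/path and the exact integer width of the size parameter are assumptions about your installation): the memory manager functions documented in extcode.h really are in the export table and resolvable like any DLL export, while the queue primitives are not exported at all, which is the point made above.

```python
import ctypes

lvrt = ctypes.CDLL('lvrt.dll')               # LabVIEW run-time engine (assumed name)
lvrt.DSNewPtr.restype = ctypes.c_void_p
lvrt.DSNewPtr.argtypes = [ctypes.c_size_t]   # size parameter; width is an assumption
lvrt.DSDisposePtr.restype = ctypes.c_int32   # MgErr
lvrt.DSDisposePtr.argtypes = [ctypes.c_void_p]

ptr = lvrt.DSNewPtr(1024)                    # allocate from LabVIEW's heap
print(hex(ptr), lvrt.DSDisposePtr(ptr))      # second value 0 == mgNoErr
# There is no comparable export for queue operations; those primitives are
# reached internally by built-in nodes, not through the export table.
```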
