Everything posted by ShaunR

  1. I have had situations like this and the only option was to look at the XML. In there you may find absolute paths (c:\myapp\myprogram) mixed with relative paths (..\myprogram). Change all the absolute paths to relative ones and it may resolve this.
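     As an illustration of that approach, here is a minimal Python sketch (the project file name and the drive-letter pattern are assumptions, not taken from the original post) that flags absolute Windows paths in a project XML so they can be reviewed and changed to relative ones:

     import os
     import re
     import sys

     # Hypothetical example: scan a LabVIEW project/library XML for absolute
     # Windows paths (e.g. c:\myapp\myprogram) and show a relative equivalent.
     project_file = sys.argv[1] if len(sys.argv) > 1 else "MyProject.lvproj"  # assumed name
     project_dir = os.path.dirname(os.path.abspath(project_file))

     text = open(project_file, encoding="utf-8").read()
     for match in re.finditer(r'[A-Za-z]:\\[^"<>\r\n]+', text):
         abs_path = match.group(0)
         print(abs_path, "->", os.path.relpath(abs_path, project_dir))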
  2. I've no idea what this means. Well, it's a bit off topic but........ I used them for visual inspection. The RPi3 has a CSI-2 interface (many of these boards do). I think it was 4 lane (4 Gbps) but I only needed two. You get wireless and HDMI for free. If you need a bit more processing power then you can add something like a Saturn FPGA board for image processing (an LX45, the same as in some of the cRIOs). The customer has tasked one of their engineers with investigating a GigE to CSI-2 converter because they eventually want to use and reuse Basler cameras on other projects, but we were unable to find such a device (links please if anyone has them). They already had a LabVIEW program that they used a bit like NXG, and I connected that to the subsystem with WebSockets over wss (see the sketch below). For the ultra low-end devices, I use them to add data logging, HMI and watchdog capabilities to existing hardware (like the Wago that was linked earlier and cRIO). You only need an Ethernet port and HDMI for that, and as a bonus you get more USB ports. There is a whole multitude of choices here - with or without WiFi, LoRa, more or fewer USB ports, GPIO etc. The only thing still lacking is the availability of Gigabit Ethernet, but devices with it are starting to come through now. These types of devices I no longer think of as "hobby" devices.
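     As a rough illustration of that WebSocket link (the endpoint URL and messages are placeholders, and this uses the third-party Python "websockets" package rather than anything from the original setup):

     import asyncio
     import websockets  # third-party: pip install websockets

     async def main():
         # Placeholder endpoint; a real system would point at the subsystem's wss server.
         async with websockets.connect("wss://example.local:8443/data") as ws:
             await ws.send("status?")   # send a request to the subsystem
             reply = await ws.recv()    # wait for its response
             print("subsystem replied:", reply)

     asyncio.run(main())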
  3. Yeah, no. I've never seen a patronising waveform before. That's not happening this time and probably hasn't happened for many years. We've seen LabVIEW stagnate with placebo releases, and there are so many maker boards for less than $100 - hell, less than $10 - that it's no longer funny (and you don't need $4000 worth of software to program them). A while ago I put a couple of real-time Raspberry Pis into a system instead of NI products. You love Arduino, right? There is a step change in the availability of devices and the tools to program them, and it has been hard to justify LabVIEW for a while now except where a company is already heavily locked in. If it's not working on Linux then that's a serious hobbling of the software, since I, like many others, am now looking to expunge Windows due to privacy concerns for myself and my clients.
  4. The biggest problem for me was that they have changed all the wires. I spent about 10 minutes trying to figure out why a for loop wasn't self-indexing, because a normal scalar value looks like a 1D array wire. At one point I was sure that it should have been and was cursing that they had removed auto-indexing. Then I made the scalar into an array (which was a lot harder than it should have been) and it looked like a 2D array wire. I was probing around with the wiring tool trying to figure out what the wires actually were, because my mind's auto-parser just said everything wouldn't work. Add to that that all the primitive icons now look the same to me, only distinguished by some sort of braille dot encoding, and I thought "I'm glad I won't ever have to use this". I think NI have recreated HP VEE to replace LabVIEW.
  5. If you think of it more as a replacement for MAX with a VI editor built in, then it may make a lot more sense than as a new LabVIEW. There is a very obvious emphasis on distribution, system configuration and deployment - many of the things that MQTT is doing for maker boards and IoT devices. I don't think it is a coincidence that wireless devices are among the first products to be supported rather than cRIO-type devices. Interestingly, I also think it is a case of be careful what you wish for. I have asked for many years for source code control applicable to LabVIEW, because other SCC tools treat VIs as binary blobs (good luck with that merge), so really we could only use them for backup and restore. Well. Now LabVIEW VIs are text (XML, it seems) so it is, in theory, absolutely possible to do diffs and merges (see the sketch below). I have also complained about creating custom controls. Well. We will have that now too, via C#. So I can write IDE controls and project plugins in C#. I can bind to the NI assemblies (in C#) for DAQ and instrument control. I can use SVN with C#, so what do I need LabVIEW for? If you *want* to use LabVIEW then there is an IDE, although rehashing all the icons will only alienate existing LabVIEW programmers. We LabVIEW programmers will be inferior to the C# ones because we can only do a subset of their capabilities on our own platform. Over time, NI customers will not renew their LabVIEW SSP packages, which gives NI the excuse not to support it. All the experienced LabVIEW programmers will either move to something else because no jobs are available (see my previous comment about Python in T&M) or retire. Large corporations will be hiring C# programmers to rewrite the LabVIEW code that failed conversion. Then..... it gets worse. The best we can ask for at this point is that NI open source LabVIEW 2017 and give it to the community as a going-away present.
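     A toy illustration of that point about text-based VIs (the file names are invented, and this is a plain line diff rather than anything LabVIEW-aware):

     import difflib

     # Hypothetical file names; any two revisions of an XML-based VI would do.
     with open("Main_v1.vi.xml") as f1, open("Main_v2.vi.xml") as f2:
         old, new = f1.readlines(), f2.readlines()

     # Once the source is text, standard diff tooling becomes usable.
     diff = difflib.unified_diff(old, new, fromfile="Main_v1.vi.xml", tofile="Main_v2.vi.xml")
     print("".join(diff))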
  6. I don't blame the language. C# is just a front end for .NET (like VB.NET et al.). It's .NET I blame and have always blamed, and I don't care what language you put in front of it, including LabVIEW - it will be worse. But there is more to this than a language choice. It is quite clear that NI have jumped in with both feet onto the Windows 10 platform which, don't forget, is attempting to corner all hardware platforms with one OS. There is some sense in that, since Win 10 has JIT compilation of .NET (perfect for piecemeal VI compilation), a background torrent system (push deployment and updates whether you like it or not) and plenty of ways to get data for mining and DRM enforcement (Customer Improvement Programs). This is the start of Software as a Service for NI, so you won't need LabVIEW in its old form.
  7. Well. LabVIEW was the best cross-development platform. That is why I used it. I could develop on Windows and, if I managed to get it installed, it would just work on Linux (those flavours supported). Over time I had to write other cross-platform code that LabVIEW couldn't do or, rather, wrote cross-platform code so that LabVIEW could use it on other platforms. I used the Code::Blocks IDE (GCC and MinGW compilers) for C and C++ DLLs, or the Lazarus IDE for Free Pascal (in the form of CodeTyphon). The latter can use several GUI libraries, from Qt and GTK through to Windows or its own fpGUI. However, those things aren't the problem with Linux. It is the distribution, and the Linux community is in denial (still) that there is a problem. So recently I have had a pet project which was a "LabVIEW Linux" distribution. I created my own distro where I could ensure that no other idiot could break the OS just because they decided to be "cool" or wanted to use "this favourite tool". I basically froze a long-term-support, tested distro, added all the tools I needed, removed all the "choice" of what you could install, then made it into a custom distro for which I could micro-manage updates (like OpenSSL). It had LabVIEW, VISA and DAQ installed, along with a LAMP stack, and all my favourite LabVIEW toolkits were pre-installed. It could be run as a live CD and could be deployed to bare-bones PCs, certain embedded platforms and even a VPS. I solved (or added to) the distribution problem by making my own. Here's a funny anecdote. I once knew of a very, very large defence company that decided to rewrite all their code in C#. They tried to transition from C++ on multiple platforms to C#, but after 6 years of re-architecting and rewriting they canned the project because "it wasn't performant" and went back to their old codebase.
  8. Silverlight was definitely the stake through the heart of Web UI Builder, just as (I think) C# will be for LabVIEW. I've been actively moving over to Linux for a while now, and .NET has been banned from my projects for donkeys' years. So doubling down with a Windows-only .NET IDE is a bit perplexing, especially since at this point I consider Windows pretty much a legacy platform for T&M, IoT, DAQ and pretty much everything else LabVIEW is great for. When Windows 7 is finally grandfathered, Windows will no longer be on my radar at all.
  9. What are these long-term plans? I did manage to get it working on another PC, and whilst there were some nice IDE features (tabbed panes with split), I couldn't use it for proper work. It doesn't address any of the long-standing features we have asked for, and it is quite slow to respond. Graphs are very basic (no antialiasing? No cursors?) and, I believe, it is Windows only (was that C# I saw in its guts?). If they had added the MDI style of UI to LabVIEW 2017 I would have been over the moon! As regards your comment about relevancy: is it similar to Test and Measurement where, in the last year, Python has overtaken LabVIEW as the language of choice (at least in the UK, that is)?
  10. Maybe. But is there now a proper Source Code Control? Can we make our own native controls? Or are we looking at just another BridgeVIEW-like environment?
  11. Well. I was pleased to see that they finally created an NI package manager - I wasn't impressed when they opted for a 3rd-party company to provide one (and, of course, said so). That's about as far as I got, though. LabVIEW 2017 installed fine!
  12. This is why I prefer the "TARGET_TYPE" symbol, which doesn't have this limitation.
  13. I highly recommend you use the native property nodes which are guaranteed to work on all supported platforms. The "Execute" is a last resort and I don't see anything in your code which cannot be obtained with property nodes.
  14. AQ gets quite irate when people talk about LabVIEW's "garbage collector". I will defer to his expertise and definition. Just to get in before someone pipes up about THAT function......."Request Deallocation" is not a garbage collector in any sense.
  15. Viewing is easy. Just save an FP image to a file and reference it in a webserver's HTML page (NI Webserver, Apache, NGINX, whatever). Control is much harder, which is why many of us use WebSockets. The NI webserver is based around calling individual functions (VIs) in a command/response manner from the browser, which is why they have Remote Panels, as Smithd has mentioned. There are indirect methods like VNC or surrogate "SendKeys" software, but direct control of a LabVIEW application requires you to embed a server in your application which can respond to browser clicks.
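     A bare-bones sketch of the viewing side (directory, file names and port are placeholders; it uses Python's standard library rather than any of the web servers named above):

     # Assumed layout: the LabVIEW application periodically saves its front panel
     # image to www/panel.png, and www/index.html contains something like
     #   <meta http-equiv="refresh" content="5"><img src="panel.png">
     # so the browser re-fetches the image every few seconds.
     import functools
     import http.server
     import socketserver

     PORT = 8080  # arbitrary choice
     handler = functools.partial(http.server.SimpleHTTPRequestHandler, directory="www")

     with socketserver.TCPServer(("", PORT), handler) as httpd:
         print(f"Serving the saved panel image on http://localhost:{PORT}/")
         httpd.serve_forever()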
  16. LabVIEW doesn't have a garbage collector. I've seen these sorts of behaviours with race conditions when creating and freeing resources.
  17. I posted an example a little while back of using a SQLite DB for this sort of thing which was based around testing rather than pure DAQ, so it is a superset of what you require (at the moment). Code is here.
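     For flavour, a minimal Python/sqlite3 sketch of that style of logging (the table and column names are invented for this example and are not from the linked code):

     import random  # stands in for a real acquisition source
     import sqlite3
     import time

     conn = sqlite3.connect("daq_log.db")
     conn.execute(
         "CREATE TABLE IF NOT EXISTS samples (timestamp REAL, channel TEXT, value REAL)"
     )

     # Log a handful of fake readings; a real logger would pull from hardware.
     for _ in range(10):
         conn.execute(
             "INSERT INTO samples VALUES (?, ?, ?)",
             (time.time(), "ai0", random.random()),
         )
     conn.commit()

     # Later, pull back a time-ordered slice for display or analysis.
     for row in conn.execute("SELECT * FROM samples ORDER BY timestamp"):
         print(row)
     conn.close()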
  18. This can be done in a VIM because the comparison primitives adapt to type, so all that is needed is for the controls and indicators on the VI to do the same. Do you have another example? That's because variants are run-time evaluated, even though we have to stipulate how to dereference them with the Variant To Data primitive. Wouldn't a "To Variant" that could be configured much the same way control refs have the option to "Include Data" to make them "Strict" be more intuitive and easier? After all, variants do contain the type already; it just doesn't get propagated at design time. The bonus would be that we wouldn't need the "Variant To Data" 99% of the time at all.
  19. Not sure what you are trying to achieve here. The "List" is an example of a class that can be used with data of different types (that's what variants are for), and VIMs give you a simple method to "adapt to type".
  20. Indeed. I was replying to Shoneill. The OP doesn't have a problem with the producer being faster than the consumer.
  21. The use of TCP/IP is because it is acknowledged and ordered. Don't forget this is for when the producer is faster than the consumer - an unfortunate edge case. No. The client is effectively DoS-ing the server (causing the disconnects). TCP/IP already has a mechanism to acknowledge data and even retries if packets are lost. This is just using the designed features to rate-limit. The receiver side can have as much buffer as it likes. There is no need to "match" each endpoint. We just want to rate-limit the send/write so as not to overwhelm the receiver (I'm not going to use client/server terminology here because that is just confusing). As I said earlier, they don't have to be matched. If you are really worried about it you can modify the buffer size on the fly. You are getting bogged down in being able to set a buffer to exactly the message size. It doesn't have to be that exact, only enough that the receiver doesn't get overwhelmed with backlog and has occasional room to breathe. It's simple, fast, reliable and far more bandwidth-efficient than handling it at Layer 7.
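     A rough Python sketch of the idea (host, port and buffer size are arbitrary; the point is that a small receive buffer plus blocking sends lets TCP's own flow control throttle the producer):

     import socket
     import threading
     import time

     HOST, PORT = "127.0.0.1", 6000   # placeholder endpoint
     RCVBUF = 8 * 1024                # deliberately small receive buffer

     def slow_consumer(listener):
         # A small SO_RCVBUF means the TCP window fills quickly when the consumer
         # falls behind, which stalls the sender instead of growing a backlog.
         conn, _ = listener.accept()
         while True:
             chunk = conn.recv(4096)
             if not chunk:
                 break
             time.sleep(0.1)          # pretend processing is slow
         conn.close()

     listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
     listener.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, RCVBUF)
     listener.bind((HOST, PORT))
     listener.listen(1)
     threading.Thread(target=slow_consumer, args=(listener,), daemon=True).start()

     # The producer just uses blocking sends; whenever the receiver's window is
     # full, sendall() waits, so the fast producer is rate-limited for free.
     producer = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
     producer.connect((HOST, PORT))
     producer.sendall(b"x" * 1_000_000)
     producer.close()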