Everything posted by Mads
-
Nikita, have you signed up for the 2.0 beta? Having cooled off a bit about the decisions NI has made with NXG, I'm spending more time testing it (even though it is not supposed to cover many of our needs for years to come, I do not want to have to start from scratch then), and providing feedback to NI (I'll probably fill up their in-box) in the forums and through the built-in feedback function (I did not find it at first - it's the talking bubble at the top right).
-
Apparently the CG term is not supposed to be used. I would personally prefer to have an explicit way of referring to the...non-NXG versions, but the official names are LabVIEW and LabVIEW NXG.
-
I mentioned workarounds in the post, and this is one of them. A bad one - you end up wasting way too much real estate on it. Here's one for the NXG idea exchange: make a compact version of the IDE surround the panel/diagram (even make that part optional), and let larger items like the palettes magically appear at the cursor with a mouse-click (...oh, wait...). Not often. And NXG is like shooting yourself in the foot then, to get rid of a mosquito. None of these require NXG to be introduced. NXG would be a great new NI MAX, and some of its functionality would be great to have integrated into LabVIEW 2017/18 too. It's the whole other list of unnecessary changes, and OK changes released in an infantile NXG, that is the problem. I'm sure Shaun is spinning in his chair...but let me chime in here as well: allow me to put the LabVIEW RT environment on a low-power, low-cost Linux SBC with plenty of serial IO and dual Ethernet (no such thing from NI other than the SOM, and then only if you design your own carrier board(!)) and run the same code as I use on Windows desktops (which is what we do today, thanks LabVIEW CG!), and I'll grab that opportunity over cRIOs faster than you can say NXG. Today we actually rip out the insides of cFP-2220s and put them in subsea instruments, just because that's the best option available unless we move away from LabVIEW.
-
I saw NXG through the tech preview, and there were a few of us that protested in the forums there. My hope was that it was mostly just experimentation, not something they would release as it was. You have an optimistic view of things in the illustration. This time around I do not think it is just a question of users not wanting to change though. Every company making large changes can tell themselves that, and choose to insist on their planned direction, but quite often it's just a bad product. There are positives with NXG (if it were the next generation of NI MAX, for example), but they do not justify and cannot outweigh the negatives (when it is supposed to be the next LabVIEW). This is the first time a release from NI has gotten me more interested in the competition than in the new NI products.
-
The MDI/tabbed interface solution for VIs seems to be one of the most fundamental flaws of the NXG GUI. One of the biggest strengths of LabVIEW CG is that it enables and encourages continuous testing. You can have the user interface of your VIs (the whole application you are building, for that matter) *on screen* (shown as it will be when built, not as a page within an MDI interface) and provide input to it and view the output - while you at the same time view and change the code of various VIs...tweak, run (as soon as the code is not broken), input, view output, tweak, run...etc. Just the idea that it is OK to require the developer to manually tab between the front panel and the code is ridiculous. It is scary how they can think that's a good change in the workflow, but then again - the text programmers are not used to much of what is (was) great in LabVIEW, so that might explain why such things seem expendable. Now you can say that there are ways around this, or that it can be *fixed* in future versions of NXG - but the problem with that is that it would also require a complete redesign of the rest of NXG. There is too much stuff in NXG relying on it. Do not get me wrong here - I would love it if someone told me I was wrong and showed me the NXG light. Where are all the NXG evangelists? What do the Knights of NI, for example, really think about NXG and the road ahead? Are they worried or even angry too, or do they think this is the best thing since sliced bread?
-
That's a pretty good description of how it feels to me too. A fancy interactive configuration tool, first and foremost. We're back to the "no programming", NI-hardware-centric marketing dream, and not the fun graphical cross-platform programming environment that can be used for general application development. Windows Metro for the desktop comes to mind too... One of the first things that bugs me when working in NXG 1.0, and unfortunately 2.0 as well, is the whole concept of having everything in one window. It's claustrophobic, and feels like I'm stuck in - you said it, NI MAX... I want to break out those front panels, view them as I would want them in my built application, throw some of the block diagrams to another screen to do some debugging while interacting with the front panel at the same time, and get rid of all the overhead of the surrounding interface etc. I can see how you can make multiple complete IDE tabbed windows, but where is the WYSIWYG for built applications in that?

Previously I considered what NI did with G/LabVIEW to be similar to what Apple did with MacOS; they created a revolutionary environment that made it much more intuitive and fun to use computers/develop applications (and just like with the Mac, only the smartest people understood that this was the future ). It did mean that there were limits to what you could achieve...and we've all been spending lots of time trying to work around those for years (hoping to get better native graph controls and a few thousand other things), but there was enough functionality and flexibility there to make the joys of graphical programming worth it. The last couple of years I've found myself frustrated by the fact that things seemed to move backwards/away from that philosophy with things like Linux RT; gone were the days when everything was available wrapped in a nice graphical interface; I suddenly found myself writing bash(!) scripts to get even quite basic stuff done. When I saw the demo of the web functionality in NXG 2.0 I got some of the same feeling; in the good old days (yeah yeah) I would not have expected it to be seen as a positive that the HTML code was just around the corner... To me the whole concept of LabVIEW is to provide a 100% graphical editor; it should allow you to do 99% of what you want to do *graphically*. The HTML should be accessible too, sure, but not "in your face".

Have the text programmers behind LabVIEW gotten too much influence? Thinking that LabVIEW is really just for non-programmers (so you really need to make it a configuration tool instead), and that if you want to do some real programming you should work like they do - with text, and an IDE as close to the ones they are used to as you can get? Oh well, perhaps I should cool off...and force myself to test it more, or maybe not (so far that has made me angry pretty quickly).
-
Sure, trying to get feedback from people on something as dramatically changed as NXG when it is already in beta is rather unfruitful...it's too much, too late. Especially if most of the users that might have feedback for it stay away because of previously signed NDAs. We could always dream of access and influence at every stage, and hope that our personal views won the battles, but I do not think that would be productive. In general, revealing too much about future products is bad for business (in this case for both customers and NI). If the news is about future *updates*, you have a positive effect though. Then people know that they will gradually get more and more out of their investment; they will not need to throw the old cell phone in the bin and buy a new one to get access to a new feature. You could say that because NXG is given to existing owners of LabVIEW CG (not sold as a separate product, that would have been terrible) it could be considered an update, but it's more like halting the development of the software on your existing cell phone and giving you a new one with some new features...but unfortunately you cannot use it to phone anyone for the next couple of years (use it on your cRIO projects, for example). So now you have access to some new features, but you have to carry around two phones...
-
Did you just read my mind? Perhaps I was reading yours <Old man (well, middle-aged) yelling at the sky> We have been waiting for major upgrades of LabVIEW for years, and after so many years without much progress it turns out NI has really just abandoned ship to build a different one, not seaworthy until 2020. Where does that leave LabVIEW TG (This Generation) but dead in the water? The road map does not exactly encourage us to base our business on it. We either spend time and money on staying updated on the next-generation stuff for many years until it actually can replace this generation, or move away from NI. Frankly I would have preferred it if they had kept us in the dark, working on NextGen for another 4 years until it had reached "parity", and only *then* told the world about it. </Old man yelling at the sky>
-
I'm mostly worried about what this means for the regular LabVIEW and its users. Does it leave us with a half-dead LabVIEW alongside an incomplete (and unfortunately in many respects less user-friendly) NXG for many years to come? Do we have to choose between an old-fashioned/outdated parent and an infantile NXG, eventually getting forced onto NXG due to the age of the regular version? I was really hoping that they could transform the underlying technology in a few major jumps, but avoid alienating the current users by a) keeping the functionality (hardware support etc.) and b) changing the GUI more gradually. Or just make a clean cut. As it is now I'm afraid we might get a division between the large user base which really needs the functionality only supported by LabVIEW 2017 (or earlier) for many years to come, and a smaller next generation of users which will adopt the new user interface more easily because they do not have experience and investments in the regular LabVIEW already, but which will also be limited by the lack of hardware support etc. in NXG. Perhaps someone attending NI Week can ease my worries (or reaffirm them)?
-
NI Week is just around the corner, so I was wondering when the LabVIEW 2017 installers would be published...and found out they are already out. Here are two of the downloaders: ftp://anonymous@ftp.ni.com/evaluation/labview/ekit/other/downloader/2017LV-WinEng_downloader.exe ftp://anonymous@ftp.ni.com/evaluation/labview/ekit/other/downloader/2017RealTime-Eng_downloader.exe
-
We use Profibus-to-Ethernet gateways for this... It makes it possible to use the bus across the network from a variety of hosts. Here is the one we have used, but there are others: https://www.kunbus.com/fnl-gateway-profibus.html
-
We use NI's OPC UA server functionality (currently part of NI RT and DSC) and it works fine with other OPC clients. I would have been extremely surprised if it did not; after all, much of the point of using industry-standard protocols is to be able to exchange data between parties from different vendors... Prior to the OPC UA API in LabVIEW we used Kepware when we needed to be an OPC DA server, and DataSocket when we could operate as a client. As people have mentioned here before, using DataSocket has its weaknesses, but we avoid some of them by implementing it as a service (no GUI to interrupt the UI thread...). So the DataSocket OPC bit is in a service, which then has a proprietary TCP/IP-based server protocol to be used by a client application (running in the system tray) that acts as its GUI.
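For anyone who wants to check the vendor-independence from the other end: here is a minimal sketch of reading one value from such a server with a non-NI client, using the third-party python-opcua library (the endpoint URL and node ID are made-up placeholders, and the library is just one example of many):

    from opcua import Client  # third-party "python-opcua" package

    # Placeholder endpoint and node ID - substitute whatever your
    # LabVIEW-hosted OPC UA server actually exposes.
    client = Client("opc.tcp://my-rt-target:4840")
    client.connect()
    try:
        node = client.get_node("ns=2;s=Pressure")  # browse the server to find the real IDs
        print("Pressure =", node.get_value())
    finally:
        client.disconnect()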
-
Debugging with Desktop Execution Trace Toolkit
Mads replied to Omar Mussa's topic in LabVIEW General
You *can* use DETT on built applications, just enable VI server access and debugging in the build. Here is a description of it: http://digital.ni.com/public.nsf/allkb/A50A8BBFD737679186257D7700688757
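The same settings end up as tokens in the executable's .ini file, so they can also be added there by hand. From memory the relevant tokens look roughly like this - treat the exact names as something to verify against your LabVIEW version, and the section name, port and access lists as placeholders:

    [MyApplication]
    server.tcp.enabled=True
    server.tcp.port=3363
    server.tcp.access="+*"
    server.vi.access="+*"
    DebugServerEnabled=True
    DebugServerWaitOnLaunch=False
-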
Interviewing for a LabVIEW developer position, how should I prepare?
Mads replied to rakuz's topic in LabVIEW General
Absolutely. The devil is in the details when answering such a question. An answer like the one you describe could sound sincere and OK if expressed in the right way, but most often it would come out as fake - which in turn tells the interviewers plenty. That is why this is such a good question. It is a multi-layered test. Even if I believed that answer, there are in fact consequences to it that are not just positive. Superficially people might think it is the "perfect" answer, but I might for example think that you will not be able to compromise when needed - which in real life is quite often. So never answer questions the way you think is expected of you; you have to be honest. You should be a bit tactical of course, but I would say a good interviewer (one from a company you would really want to work for - an interview goes both ways too!) will prefer that you are more honest than tactical... -
Interviewing for a LabVIEW developer position, how should I prepare?
Mads replied to rakuz's topic in LabVIEW General
An interview is done to discriminate... Kidding aside: If the person comes from a different culture where limp handshakes are the norm, then two things will come into play: First off, I will probably have read up on his cultural background (if I am not familiar with it already) up front and/or take my lack of familiarity into account. Secondly, I will still expect the person to have done his research too, and know to offer a firm handshake. The handshake is never a deal breaker, do not get me wrong here; it is just one of the many social details that I think you should expect interviewers to be noticing. -
Interviewing for a LabVIEW developer position, how should I prepare?
Mads replied to rakuz's topic in LabVIEW General
I've hired all the engineers in my department personally, and for me, once you get to the interview, it is 80% about your personality. Sure, on the surface I'll ask quite a few questions about your programming experience and philosophy, and if it is obvious that you do not know what you are talking about (especially if you do not seem aware of that fact either), you will not be hired. The real clincher is the person's attitude and general behaviour though. If you seem bored(!) or out of focus and energy during the interview, have a limp handshake (the fact that you do not know that it is a negative signal is the most serious offence), or are unable to present previous projects with enthusiasm and clarity, it is difficult to trust you, even if you can demonstrate brilliant coding. Preparing for standard interview questions like "what is your greatest weakness" is a must (do not say that you have none, for example; that just demonstrates lack of self-awareness(!)). I'm sure there are lots of cultural differences though. In Norway, where I am located, we probably pay less attention to formal training than many other places, and have a tendency to expect people to be humble, for example. -
That's true - if you have drivers that do not give you access to work with the generated messages, but require you to allow them direct access to the shared resource, your solution makes sure you can still easily control the medium access. In our channel handlers we also implement a device and/or channel reservation mode, which could solve the same challenge. The device wrapper would then request a channel reservation prior to allowing the third-party driver to access the channel in its more direct manner... (We use the device reservation function when we need to ensure that no other entity is interfering with a complex operation towards one device on the channel.)
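Not our LabVIEW implementation, just a minimal sketch of the reservation idea in Python (all names are made up): the device wrapper reserves the channel, lets the third-party driver talk to the port directly, and then releases it.

    import threading

    class ChannelReservation:
        """Exclusive reservation of a shared channel (serial port, TCP link, ...)."""

        def __init__(self, port):
            self._port = port              # the raw resource, e.g. an open serial port
            self._lock = threading.Lock()

        def reserve(self, timeout=5.0):
            """Block until the channel is ours, or give up after the timeout."""
            if not self._lock.acquire(timeout=timeout):
                raise TimeoutError("channel is busy")
            return self._port              # caller may now use the port directly

        def release(self):
            self._lock.release()

    # Usage in a device wrapper around a third-party driver (hypothetical names):
    # port = channel.reserve()
    # third_party_driver.run_complex_operation(port)
    # channel.release()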
-
The Plasmionique library is very nice, I am advocating it internally as an example of good design :-) When it comes to sharing a communication channel (serial port, TCP link etc.) I've always attacked it by having channel handlers. For each channel I instantiate a channel handler of the correct type, and everything that needs to access the channel (different device instances for example, or communication routed into the system from an external source) does so by communicating with the appropriate channel handler (normally using a SEQ). One upside to this centralized approach is that it is easy to see the whole picture of what is going on on a channel. The handlers normally run in the background, but they have their own user interface with communication stats etc. available for debugging purposes, and/or they are DQMH modules and all their actions are subscribable user events... :-)
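In text form the pattern looks roughly like this - a minimal Python sketch for illustration only, not our implementation (in LabVIEW the requests would go through a single-element queue or DQMH requests instead, and the port object here is assumed to be an already opened resource with write/readline):

    import queue
    import threading

    class ChannelHandler(threading.Thread):
        """Owns one communication channel; all access goes through request messages."""

        def __init__(self, port):
            super().__init__(daemon=True)
            self.port = port               # e.g. an open serial port or TCP socket
            self.requests = queue.Queue()  # (command_bytes, reply_queue) tuples

        def run(self):
            while True:
                command, reply_to = self.requests.get()  # one transaction at a time
                self.port.write(command)
                reply_to.put(self.port.readline())       # no other client can interleave

        def transaction(self, command):
            """Called by device instances; serialises access to the shared channel."""
            reply_to = queue.Queue(maxsize=1)
            self.requests.put((command, reply_to))
            return reply_to.get(timeout=5.0)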
-
Thanks so much for the videos, you are doing everyone a huge favor.
-
I assume you are thinking about static access here (where you always will be requesting the same value, so that you can skip the lookup after the first call)? With no lookup you get into a different league of course. For random access there is no breakdown; it will perform faster than or comparably to the alternatives I guess you are thinking of. Personally I see more value in dictionaries in the random-access scenario, so that's probably why I do not have the same focus on that bit as you.
-
I've run some preliminary testing of 2016 and the variant attribute IPE for key-value pair lookups now, and have compared it with 2016 and 2015 without the IPE: Without the IPE, 2016 is equal in speed or negligibly slower than 2015 (so there is no instantly free lunch in just upgrading to 2016). With the IPE, the speed of dictionary read and write operations is considerably faster; on my machine they were 1.7x faster than without/in LV2015. This was with only an array index in the attribute (the value of the key-value pair stored in a separate array). I also did a quick test where I stored a DBL directly in the attribute, which turned out to be even faster (1.4x) for writes, and equal for reads. That's probably not the case for more complex data types, but the gap will definitely be smaller than before. The CVT for example will then in most cases be better off using attributes to store the actual values instead of keeping separate arrays for them. That would also allow it to be made more flexible when it comes to adding or removing tags.
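To make the two layouts being compared explicit, here is the idea in plain Python (dictionaries standing in for variant attributes, so the timing numbers above obviously do not apply to this sketch):

    # Layout A: the attribute/key only holds an index into a separate typed array
    # (what the CVT does today - one flat array per data type).
    indexes = {}   # tag name -> position in the values array
    values = []    # e.g. the DBL values

    def write_indexed(tag, value):
        if tag in indexes:
            values[indexes[tag]] = value
        else:
            indexes[tag] = len(values)
            values.append(value)

    def read_indexed(tag):
        return values[indexes[tag]]

    # Layout B: the attribute/key holds the value itself - the layout that the
    # LabVIEW 2016 IPE border nodes make cheap to read/update in place.
    table = {}

    def write_direct(tag, value):
        table[tag] = value

    def read_direct(tag):
        return table[tag]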
-
The Current Value Table is one tool that used to have the actual values of its key-value pairs stored as attributes, but ended up only storing type-specific array indexes in the attribute to (vastly) improve its performance. I wonder how much of that difference (if any) could be removed with the new IPE feature... Has anyone checked that already? I've been evaluating different dictionary solutions lately and this might change the picture slightly. I'll download 2016 and do some testing later this week.
-
Can VIPB files still be hand-modified?
Mads replied to JKSH's topic in Application Builder, Installers and code distribution
Editing the file like that causes the file to be invalid, as the cryptographic hash in the file is no longer correct (introduced in 3.0). So unless you know how to update the hash (stored in the file ID, probably) correctly, you will not be able to produce a valid update. The Pro version might allow you to open the file anyhow, just to have a look at what has been going on, but I would think that it should still at least give the users a warning that the file has been tampered with(?). -
messenger library Instructional videos on YouTube
Mads replied to drjdpowell's topic in Application Design & Architecture
What version of OpenG Zip library did you try to install (it has to be done using the Custom Software Install feature in NI MAX by the way...)? I'm using it on NI Linux RT targets (cRIO-9030 and sbRIO-9651) without any problems (thanks a million Rolf Kalbermatter). Here is a thread where my initial request for Linux RT support was discussed. -
When you say that you want to add the first three rows and add the elements together, I assume that the sum you are after is not the sum of all the elements in those rows (which would be just one number), but the sum of each sub-column, or? If so, in your example above the first row in the output array will be: 10, 13, 13, 57, 21, 11. There are a lot of ways to do that, and the different ways have different speed and memory efficiency depending on how many columns there are and how many rows you want to sum. Attached is one alternative (made in LV2015, back-converted to 2012) that seems to be an OK compromise between code simplicity and speed...I'm sure someone will take the challenge and improve it though. ArrayNthRowSum.vi
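For reference, here is the same column-wise interpretation as a short Python/NumPy sketch (not the attached VI; the input matrix is just made up so that its first three rows reproduce the output row quoted above, and output row i is the element-wise sum of input rows i..i+n-1):

    import numpy as np

    def nth_row_sum(data, n):
        """Column-wise sum of every n consecutive rows (r rows in -> r-n+1 rows out)."""
        data = np.asarray(data)
        zero_row = np.zeros((1, data.shape[1]), dtype=data.dtype)
        csum = np.vstack([zero_row, np.cumsum(data, axis=0)])  # running column sums
        return csum[n:] - csum[:-n]

    example = [[1, 2, 3, 50, 10, 1],
               [4, 5, 6,  3,  6, 4],
               [5, 6, 4,  4,  5, 6]]
    print(nth_row_sum(example, 3))   # -> [[10 13 13 57 21 11]]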