Posts posted by hooovahh

  1. XLR8 doesn't depend on having Excel installed at all, but it has some major limitations compared to Excel.  Last I knew, graphs and charts weren't supported at all.  It really is just a way to read and write cell data, without much on the fancy formatting side.

  2. So I didn't know where to post this, but I thought some might find this interesting.  Someone had an HP 34401 benchtop multimeter that was working fine, but had a display that was fading and was hard to read.  So they probed the display communication lines, saw that it was SPI, and then built a device that received the SPI data and put it on a new OLED display.  They then fitted all of this inside the old unit so it looks as good as new.

    https://hackaday.com/2018/08/29/faded-beauty-dmm-gets-an-oled-makeover/

    It makes me want to find some scrap hardware on eBay and see if I can repair it.  Of course, I don't really need another random piece of half-broken equipment in my office.

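    The sniff-and-redisplay idea is simple enough to sketch.  As a toy illustration only (the real 34401A display protocol is more involved than this, and all the names here are mine): assume each byte sniffed off the display bus is a plain seven-segment pattern, so decoding it back to a character for the new display is just a table lookup.

```python
# Toy sketch of the decode step in a sniff-and-redisplay hack: bytes
# captured off a display's SPI bus are mapped back to characters that a
# replacement OLED can render.  Assumes a plain seven-segment encoding
# (bit 0 = segment a ... bit 6 = segment g); the real HP 34401A display
# protocol is more complex than this.
SEVEN_SEG = {
    0x3F: "0", 0x06: "1", 0x5B: "2", 0x4F: "3", 0x66: "4",
    0x6D: "5", 0x7D: "6", 0x07: "7", 0x7F: "8", 0x6F: "9",
}

def decode_frame(raw: bytes) -> str:
    """Map each sniffed segment byte to the digit it would light up."""
    return "".join(SEVEN_SEG.get(b, "?") for b in raw)

print(decode_frame(bytes([0x06, 0x5B, 0x4F])))  # a frame showing "123"
```

    The hack in the article does the equivalent in a microcontroller interrupt handler, then redraws the decoded text on the OLED.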
  3. Quote

    I've heard the NI argument against older issues -- there may be internal or customer-specific information, but I don't see that as an excuse moving forward.

    I haven't heard that argument, but I think it is a good one.  Oftentimes I will see some odd behavior in a larger application, and then start pruning it until it is small enough to send to NI.  It often has weird, unpredictable behavior, and contains some amount of the original application.  I will send the minimized program to NI with instructions on how to reproduce it and what the observed behavior is.  From my point of view I can't say "Memory allocation when using DVRs in a reentrant inlined VI set to subroutine causes a race condition with an IPE structure if the moon is out."  I just have weird behavior and send it to NI.

    They have a bug tracking system, but I assume my whole project goes along with the issue and my instructions.  I'm not sure how easily NI could expose the issue to the world to vote on, or view, without including my application.  And even if they did, it might not always be clear that an issue I'm having is the same as someone else's.  I think a publicly tracked issue tracker is a good idea, and I'd like having it; I'm just saying I can see why NI doesn't do it.  An issue in JIRA viewed in a web browser won't contain the 5,000 individual files of source code needed to reproduce the problem.

  4. 2 hours ago, ShaunR said:

    There are two error practices that may be being conflated between what you describe and what smithd is concerned with (and Darren describes in his talk): having a pass-through error (error in/out with no affecting code), and having a case structure around code with a switch on error, one case of which passes through.

    Yes, I was referring to smithd stating:

    Quote

    There's also the argument that you should just get rid of any error wire that has no immediate purpose.

    I was saying one reason (again, not a great one, but one nonetheless) is that if you do wire up errors with no immediate purpose, it does make checking the timing between nodes easier, using the error timing probe.

  5. So I think part of the reason I keep hearing people ask "Does anyone at NI even use LabVIEW for real projects?" is that every decently sized project I've worked on has these same types of issues.  And it sounds like other users are in the same boat in some sense.

    If every real project you used LabVIEW on had huge compile time issues, builds that fail 3/4 of the time, constantly needing to delete the compile cache to build, long load times when switching contexts, slow drag and drop, long save operations, and other IDE usability issues, then asking "Does anyone at NI even use LabVIEW for real projects?" starts to be a valid question.  There certainly have been times when using NXG that I've thrown my hands up and wondered why someone at NI didn't notice a glaringly missing feature, or a usability issue.  Things that I'd hope any real developer would notice.  With NXG there is at least the valid excuse that NI is prioritizing features.  Sometimes I wish I could just have someone from NI follow me around all day and see the issues I have which aren't show-stopping, but are annoying enough to hinder development.

  6. 1 hour ago, Francois Normandin said:

    This is a good idea, but I think it would require a more thorough editorial review of submissions to put them on such a repository.
    Other than the obvious code reviews, "LAVAG" would need to check for possible license issues, verify consistency across versions, develop more precise guidelines for submissions, etc.

    Isn't this already handled by VIPM anyway?  You include the license and version restrictions in the package.  When you upload code to LAVAG.org you are already authorizing LAVA to store and distribute your code.

  7. For the sake of discussion, I envisioned it would be done like Porter said.  We already have a process for uncertified and certified code.  I would just take all the packages in the certified section and put them into a VIPM repo.  We could think about making the uncertified code into another repo, but I'm not sure how often someone making uncertified code would go through the trouble of making a package.  I mean, part of the point of uncertified code is that it is semi-unfinished, or in an alpha state, so spending the time to set up a palette and build a package might be something not done until the code becomes certified.

  8. That is a good idea, and I suppose we could.  All we'd really need is a Windows-based server running somewhere with a professional version of VIPM, serving up to a publicly accessible URL.  That being said, I'm fairly certain this site isn't run on anything that would allow us to do that here.  And then there is the main restriction I can think of, which is that only those with VIPM Pro would be able to add this LAVA repository.  The free version of VIPM doesn't allow adding repositories to monitor.  I still don't have a good picture of VIPM Pro penetration in the community, and wouldn't want to go through the effort of doing this if only a few people would find it useful.

    I've decided to open a poll to gauge interest, but having never done that before I'm not certain I did it right...

  9. There is also a way to have LabVIEW include the front panel of a VI when building an EXE that doesn't require editing the build specification, and is instead set at the VI level.  I'm not fully sure what these VI settings are, but what is happening is the compiler tries to determine if the front panel of a VI is used, and if it isn't, it removes it.  If you can adjust the VI to convince the compiler to keep it, then it doesn't need to be explicitly kept in the project settings.

    One setting that I believe will convince the compiler to keep the front panel is the Window Appearance setting of the VI.  For instance, one thing I do is select the Dialog option, then under Customize change Window Behavior to Default.  These VIs will have their front panels kept, and I believe it is because things like Show Front Panel When Called make the compiler believe this is a VI that will be shown to the user.  I think there are other things you can do too, like adding a property node on the block diagram that performs an operation on the front panel.  This again makes the compiler believe it should keep the front panel.

  10. Thanks for that.  Looking more at how VIPM handles this, I see a minor bug, or lapse of a feature.  If you don't have LINX installed, that License Agreement button isn't there.  Only after you choose to install it do you get prompted to agree to the more detailed license; that's why I assumed it was just BSD 2-Clause.  Then after it is installed, that button (for me at least) can be clicked.  Definitely not clear, but at least there is proof on the internet of someone involved in creating the license stating that the non-commercial clause is intended for deployment to the Pi and BeagleBone.  Now let's just hope no one has to cite that in a court appearance.

  11. 57 minutes ago, PiDi said:

    LINX is not for commercial uses, at least that's what they say in the license agreement:

    Not sure where that is quoted from, but the LINX package on the VIPM Tools Network states the license is BSD 2-Clause, with the added exception that:

    Quote

    deployment support for BeagleBone Black and Raspberry Pi 2 (LabVIEW 2014 only, non-commercial use).

    The package doesn't include their whole license file, but BSD 2-Clause is quite open, usually just stating that the license notice must be retained in some form, like an about window.  I'm not a lawyer, and in my case my end customer is my boss, or my boss's boss.  It is a business setting, but I'd consider it non-commercial.  Again, not a lawyer, and I'm open to being wrong.

  12. 3 hours ago, Zyga said:

    It sounds like an idea, but it might be very hard to make the customer believe this is a robust solution. I have never programmed an Arduino myself either, so I wouldn't feel very confident with this solution.

    Keep in mind there is no programming needed (on the Arduino side) if you are controlling it via USB.  NI's LIFA, or better yet LINX, has firmware that gets downloaded to the Arduino, and then you use the API they developed from the LabVIEW palette.  It also supports several useful sensors, I2C, SPI, and a bunch of other great things.

  13. Is there a reason you didn't upload your images to LAVA instead of hosting them on an external site?  Also, in the future post the snippet or VI, not an image.  I wouldn't expect you to take a screenshot of Notepad on a text-based forum, since it would make debugging more difficult.

    As for the code, I don't see a while loop; how does this run continually?  Adding one, along with some wait functions (or a for loop for testing) and shift registers, should make things easier to understand.
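    For anyone more used to text languages, a shift register is roughly a loop-carried variable.  A rough Python analogue (my own sketch, not the poster's code):

```python
# Rough text-language analogue of a LabVIEW loop with a shift register:
# `total` is carried from one iteration to the next, the way a shift
# register carries a value between loop iterations.
def accumulate(samples):
    total = 0          # initial value wired into the left shift register terminal
    for x in samples:  # for loop with auto-indexing input
        total += x     # value wired to the right terminal, fed back next iteration
    return total       # value read from the shift register after the loop ends

print(accumulate([1, 2, 3]))  # → 6
```

    The same pattern with a while loop and a wait function is how most continuously running LabVIEW code keeps state between iterations.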

  14. I'm going to keep going on the extreme cheap side of things and suggest an Arduino.  NI already has an API for doing buffered analog inputs, and although I don't know the rates, I expect on the order of 100 Hz.  Chinese knock-offs exist for about $3 or $4 with free shipping.  Add a buck or two for a USB cable and case, and you have a pretty cheap USB DAQ device that uses NI-VISA as its low-level driver, making it work on all NI targets.  Making it Ethernet-controlled via TCP is possible too; it would just need an Ethernet shield and some code, depending on what LINX supports these days.  What don't you get?  Fast shipping: expect it to take a month or two.  Also don't expect support; if you fry the thing there isn't a warranty.  There is no voltage protection either: if you put 60 VDC into a pin, that pin is toast, and possibly the whole microcontroller.  Need a yearly calibration for equipment?  Then you're going to have to come up with your own calibration routine that gets approved.

    All that being said, I have used an Arduino in a real production project... but it was for DIO only, with an unmodified LIFA toolkit that controlled relays in the system through a transistor.  It was pretty simple, and just a cost saving on a low-risk setup.  We considered using a parallel port at one point.  As far as I know this test is still in place and being used.
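    As a hedged host-side sketch of what that kind of Arduino DAQ amounts to (the function name and the 5 V / 10-bit assumptions are mine, typical for an Uno-class board, not anything from LINX): the board streams raw ADC counts and the host scales them to volts.

```python
# Host-side scaling for a cheap Arduino-style DAQ: the board reports raw
# 10-bit ADC counts (0..1023) over serial, and the host converts them to
# volts.  Assumes a 5 V analog reference, as on a typical Uno-class board.
ADC_MAX = 1023
V_REF = 5.0

def counts_to_volts(count: int) -> float:
    """Convert one raw ADC reading to volts."""
    if not 0 <= count <= ADC_MAX:
        raise ValueError("raw ADC count out of range")
    return count * V_REF / ADC_MAX

print(counts_to_volts(512))  # roughly mid-scale, about 2.5 V
```

    This is also where a home-grown calibration routine would hook in: replace the nominal V_REF with a measured value per device.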

  15. So my experience has been a bit better, but I totally agree with the comments made here.  Our projects were in 2015 and things seemed good enough.  2016 came around and I tried upgrading, only to find IDE performance was terrible, with extremely long load times, seemingly unnecessary hourglass cursors, and a Save All that would take about 20 seconds per VI.  I reverted to 2015, told NI about my experience, and shared my project.  2017 came out, and it was a bit better, so we started migrating projects to it, and upgraded to SP1 as soon as we could.  It still wasn't at the 2015 level, but I just needed me some VIMs.  2018 came out and I've been thrilled compared to 2016 and 2017.  It is hard to say if it is at that 2015 level, but in some respects I think it is better, especially when switching between contexts of different target types.  This used to take upwards of 30 minutes to go from a project with one target type to a project of a different type.  Now it is maybe 5 minutes for this somewhat large project with lots of shared components.

    I think the main improvement NI mentioned was the check on whether a thing needs to be recompiled.  In many cases this check can be recursive and waste lots of time when it should be clear to a human that a component isn't affected.  They tweaked things to add shortcuts in this check, which greatly reduced the time in cases like mine.

    I'm always surprised to think back on how, for me personally, this issue has always been there.  Trying to use the latest version of LabVIEW has always felt slow compared to the versions from the previous couple of years.  I very vividly remember working in 2011 thinking how terribly slow it was compared to 2009 or 8.6, and how frustrating editing code was.  Then a few years later I remember using 2013, and when I would go back to 2011 things felt so much snappier and easier to use.  And even further back, the same with 8.0 and 8.20 versus 7.1.  It is possible that this is just a sign of computers catching up in performance, and these couple-year-old versions were made for slower machines.

  16. I don't fully know what is going on in your situation, but this is one feature NI hasn't talked about that is demonstrated in their own XNodes.  When you look at the Error Ring XNode on the palette, you see a green/brown rectangle with the "?!" characters in it.  But if you open this XNode in the XNode Editor from the full path <LabVIEW>\vi.lib\ErrorRing\Error Ring.xnode, the icon shown for the library isn't the same as the one on the palette.  In NI's case the Help ability, for some reason, overrides the icon in the palette.  This is also seen with Match Regular Expression.xnode.  I think this is an expected feature, but one I'd argue isn't intuitive and doesn't match other things like the Class Icon constant on the block diagram.  We wouldn't want some VI in a class to replace that icon; if we wanted to update the icon of the constant, we'd update the icon in the class.

    I don't have a clue why, in your case, adding Help causes the XNode to not be droppable.  I'd try setting some kind of constant in the Detailed Help for the Image, Bounds, and Terms (which you can get by calling your other ability VIs), and then set Use Detailed Help to True.  I think this matters because the Help ability is called when the icon needs to be drawn in places like the help window and palette.  I haven't used the Help ability in any of my XNodes; it's just what I've seen done in others.  Maybe if these aren't set, your XNode has no icon, and is being dropped but is invisible?

  17. Yeah, sorry about that.  I wanted to back-save to something older, but then I wouldn't be able to open it and even test that it still works and relinks properly, since 2017 is the oldest version I have on my current development machine.  I have since found a few bugs with panes and panels, and I've reworked the cluster object a bit to work better if and when arrays happen, but I don't need to post that until I get something working well.  That being said, I don't know what my bandwidth for this is going to be lately.

    16 hours ago, ensegre said:

    I found out a posteriori that you have been there too. As you said there, the code appears to be conceived to work on the three platforms.

    Awesome, and good work.  Yeah, I do hope that function improves in later versions, but for now you can just force it to run, and your platform may support it.  If it doesn't, then again something like the Web Publishing Tool approach could be used, where the content is generated on a platform like Windows and then copied to a platform that doesn't support that function; it will then likely work for Value Only, and possibly Value and Visibility modes.
