Posts posted by hooovahh

  1. So yeah, I've struggled with this same issue with TRIPP-Lite UPSs.  Mine luckily had a DB-9, so after investing a bunch of time into getting a proof of concept going on the USB side, I decided to just drop it and go with the simple COM port.

    NI's official method of talking to HID devices involves creating an INF driver for the device and then installing it in Windows.  The problem is that Windows 10 is quite restrictive when it comes to unsigned drivers, and you need to do some extra work just to install the driver, so that the hardware shows up in MAX, so that you can start communicating with it using raw VISA.  This isn't that big of a deal for one machine, but multiply this manual process by the 20+ machines I'd want to do this on and it becomes an issue.  Here is the process NI suggests.

    At the time I did find some DLLs posted on the NI forums that wrapped communication with HID devices so that the driver step wasn't needed.  The code was incomplete, and my knowledge of DLL calls and C is limited, so I never found success with this method.  I don't remember the exact code or examples I used, but here is a thread that came up when I was searching.  If you do get anything working, feel free to post it.

    EDIT: Oh, I just saw a command line tool that might help.  I'm not in the office to test it with.
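    For reference, the wrapped-library idea looks something like the sketch below outside of LabVIEW.  This is a minimal Python example assuming the cross-platform hidapi bindings (pip install hidapi); the vendor/product IDs are placeholders and the report layout is model-specific, so treat it as an illustration of raw HID access rather than a working UPS driver.

        import hid

        # list attached HID devices so the UPS's vendor/product IDs can be found
        for info in hid.enumerate():
            print(hex(info['vendor_id']), hex(info['product_id']), info['product_string'])

        dev = hid.device()
        dev.open(0x09AE, 0x0001)      # placeholder VID/PID -- use the values enumerate() reported
        dev.set_nonblocking(False)
        report = dev.read(64, 1000)   # one raw HID report (64 bytes max, 1 s timeout)
        print(report)                 # how to decode it depends on the UPS model
        dev.close()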

  2. There are lots of examples of this posted on NI's forums.  Here's one I've been using for a while:

    https://forums.ni.com/t5/LabVIEW/crc-8/m-p/580831#M272003

    Note that if you are using this for the CRC calculation in an automotive CAN frame, you may also want to add the ability to skip the CRC byte location, as I've seen CAN-FD frames where the CRC is not the last byte in the payload.  In some cases the calculation stops once it hits the CRC byte, and in other cases I've seen the calculation skip that byte and continue with the rest of the payload.

    Also, if you are doing this calculation and you happen to be using XNet hardware, you can actually have the hardware perform the CRC calculation for you.  Here is the idea exchange on adding this feature, and here is a blog post talking about the unofficial way to get the hardware to do this today.
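    If it helps, here is what the skip-a-byte variation can look like in a text language.  This is a minimal Python sketch using the SAE J1850 CRC-8 parameters (polynomial 0x1D, init 0xFF, XOR-out 0xFF) as an assumption; the linked LabVIEW example and your network may use different parameters, so match them to your spec.

        def crc8(data, skip_index=None, poly=0x1D, init=0xFF, xor_out=0xFF):
            """Bitwise CRC-8 over a byte sequence, optionally skipping the byte
            that holds the CRC itself (some CAN-FD layouts put it mid-payload)."""
            crc = init
            for i, byte in enumerate(data):
                if i == skip_index:
                    continue                      # leave the CRC byte out of the calculation
                crc ^= byte
                for _ in range(8):
                    if crc & 0x80:
                        crc = ((crc << 1) ^ poly) & 0xFF
                    else:
                        crc = (crc << 1) & 0xFF
            return crc ^ xor_out

        payload = bytes([0x12, 0x34, 0x00, 0x56])   # byte 2 reserved for the CRC
        print(hex(crc8(payload, skip_index=2)))     # CRC computed over the remaining bytes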

  3. Currently NI's package manager is less suited to distributing LabVIEW reuse in NXG when compared to VIPM in LabVIEW 20xx.  This is partially because in NIPM a package must be built for a specific version of NXG.  So a package made for version 3.0 can't be installed in version 5.0 until the creator of the package updates and rebuilds it for version 5.0.  For reuse to be more widely adopted in NXG, this limitation needs to be removed.

  4. Thanks for the info-labview quote; the decline of Silverlight and NI's inability to pivot is a shame, especially when posts by NI on the forums hinted that HTML5 features would replace the Silverlight tools.  I interpreted that as meaning the transition would be transparent to the users of their systems: update to the newest version of MAX and all the tools that used Silverlight would be replaced with their HTML5 equivalents.  We are now 5 years after that post, and it is a shame that remote front panels still rely on a technology that is so unsupported.

    Also, you do know those last two links are April Fools' jokes, right?

  5. In the past I've brought up issues like "You aren't listening to our feedback on the UI."  And someone at NI reminded me that NXG started out looking very different.  In fact (I hope I can say this) the UI actually had a ribbon interface for an alpha release or two, similar to Office products.  NI claims they listened to our feedback and started over with the UI that contextually pops in on the right.  In my opinion, NI would have moved away from the ribbon interface on their own, just because it had technical limitations and didn't scale well.  Still, this is an example of users complaining a lot and NI changing things for the better.

    EDIT: Oh, and @Mads made a point about how much harder it is to get new customers than to retain the ones you have.  I'm not a Linux or Mac user.  I probably will never install LabVIEW in either of those environments.  But current LabVIEW has some users that do, and zero of them would be supported in NXG.  From NI's perspective, what is the effort needed to support them, and what percentage of those users can migrate to Windows?  I actually lost a bet (pretty badly) with Michael about this.  I made a bet with him that one year after NXG 1.0 was released, NI would have a Mac or Linux version.  We had both been part of the Alpha/Beta of NXG, and I figured they were just prioritizing Windows until a stable release.

  6. Using NXG I don't feel like I am starting over, but I do feel like I am programming with one metaphorical hand behind my back.  I have been part of some of the hackathons NI has hosted at NI Week.  There it was important for NI to see developers using NXG for the first time, to see what things they could figure out on their own, and to get feedback.  There was definitely more than one moment where I would call someone over and be like "Look, I can't do this thing I always do in current LabVIEW," and they would write things down and, without giving the answer, hint that it was possible.  After a few seconds of looking through menus and right-clicking several times, I would discover it was possible, just in a different way.

    I don't want to sound like I am defending NXG.  I've used the web technology on a grand total of one project, and that's it, so my experience level is quite low.  And there were several times I would tell someone at NI "Look, I can't do this," and they would look confused and ask why I would want to do that.  I'd then give several real-world reasons I need to do that or I can't migrate to NXG.  Which is probably why my usage of NXG on real projects is low.  Keep the feedback coming.  I want to hear what struggles other people are having, just like I assume NI does.

  7. (Thank Michael.)  I do see this as a slight issue.  For now the new LINX is only in the Community Edition, so this one subforum will support both sets of topics.  In the future LINX will be its own updated package on the Tools Network, and won't necessarily be part of the Community Edition.  That being said, I don't think we will be making another subforum just for LINX stuff.  I expect the majority of Community Edition topics will be related to LINX, and splitting LINX into Community and non-Community subforums would only split the conversation up.  For now the Community Edition subforum has pretty icons showing the Pi, Arduino, and Beagleboard.  This will hopefully drive people wanting to make topics about this hardware into that subforum.  Co-mingle away.

  8. Very good thread, keep up the discussion.  I just wanted to chime in on one thing with my opinion.

    16 hours ago, pawhan11 said:

    Some of my thoughts after working 6 years as 'Labview Developer'

    • LV did not change much over last 6 years, things that I remember were maps, sets, vims and independent runtime engine

    I think I understand what you are trying to say with this, but it seems that with every version of LabVIEW, NI focused on having at least a couple of important bullet points per release.  They likely have to split resources between the current and NXG flavors, but just for my own categorization I made a list of the features that seem important to me from each release.  Looking over the release notes of each version of LabVIEW, it is clear that a lot of work goes into each release every year.  It's just that some years are more packed with features I care about than others.

    2020 - Interfaces, 2019 - Maps and Sets, 2018 - CLI and Python support, 2017 - VIMs, 2016 - Channel Wires, 2015 - Right click Framework, 2014 - 64bit support in Linux and Mac, 2013 - Linux RT OS for RT targets, 2012 - Concatenating and conditional tunnels, 2011 - Silver Controls and Asynchronous Call by Reference,  2010 -  VI Analyzer and PPLs, 2009 - New Icon editor and Snippet, 8.6 - QuickDrop

    Fundamentally, dataflow concepts don't change, which is a good thing.  A LabVIEW developer who started on LabVIEW 8.0 could probably program in 2020 just fine.  They will discover all kinds of things they never had, but it won't feel like starting over to them.

  9. I'm not sure what goes on in the subVI, but I'm pretty sure that method is slower than the 4 primitives, and speed is the major benefit of that approach.  Yes, there is the drawback of having so many cases, but they are buried in a subVI never to be seen, and what is the likelihood of having a cluster with more than 256 top-level elements?  I'd personally still prefer the VIM, given the trade-off for the performance benefit.  But thanks for the alternative.

  10. Yup, looks like what I expected.  I assume you mean avoiding non-NI XNodes in your code, because there are several that people use all the time.  That's fair, and when all things are equal I do prefer a VIM over an under(un)supported technology.  I do also agree that the number of files is large, even for something simple.  Anyway, thanks for sharing.

  11. Thanks, great insight, and great suggestions.  So far that means I see 4 possible solutions:

    1) Use a non-strict VI reference and do the To Variant-To Strict VI reference dance I showed in the first post (this does work BTW).
    2) Have the terminals be variants, then use variant to data with my class in the VI, and then in the VI with the Wait On Asynchronous Call.
    3) Use a queue or some other reference to get the same data, without using the Wait On Asynchronous.
    4) Tim's solution.

    I've gone with solution 2, the variant terminals.  The calling and closing of the asynchronous VI is controlled by two private VIs in that class and should always return no error.  Had I realized from the start that this wouldn't work, I would have gone with solution 3, probably using a DVR, but I already had the VIs written and just needed to add some variant to data calls.

  12. Okay, so I have a normal class.  In that class is a private VI that I open a reference to and run asynchronously using a Static VI Reference, an Open VI Reference by name, and a Start Asynchronous Call.  All normal and all good.  I realized I might want to capture the response from this VI once it finally returns in the Close, so I keep the reference to this VI in the class.

    [Image: Start Helper.png]

    Now, for me to be able to get the response using the Wait On Asynchronous Call, I need the VI reference to be strict, which includes things like the terminals used.  Notice I have a coercion dot in the image above because the Helper Reference in the class is a normal, non-strict VI reference.  As soon as I change this to a strict reference, my private data control has the error "VI Refnum 'Helper Reference': Private data control of this class uses an illegal value for its default data."

    So for now I have a non-strict VI reference going to a variant, then variant to data with the type I want, and things seem to work.

    [Image: Close.png]

    Is this just some kind of recursion issue that LabVIEW can't resolve?  Is there a proper way of doing this?  I also just thought of another solution: the terminals in the VI could be variants, and then I just convert to and from the class data.  Is this okay?  Thoughts?  Thanks.

  13. Okay, so I wrote some code back in the 2011 era for doing some graph stuff and never used it.  As a result there are a few places where the code could take advantage of modern features (limited events, conditional and concatenating array tunnels, VIMs, even Sets and Maps), but in any case I have it here for others to take a look at and use as they want.  I don't intend to update this further.

    It all started when I found the built-in graph controls to be limiting in terms of signal selection and control.  I wanted a way for a user to select the signals they want and then show them on a graph with a shared time scale.  The problem was that, at the time, the checkbox selector on a graph had a scrollbar that couldn't be controlled.  So I started with a single-column listbox showing all signals and allowing multiple to be selected.  I wanted to see the current values, so I added that.  Scope creep kept going until I was left with this thing that isn't done, but isn't terrible.

    In this demo there is a subpanel mode, independent windows, pause and resume, the normal graph palette controls, independent Y-axis scaling, coloring, buffer size control, visible signal selection and values, and a few other things.  It was intended for places where speed and exact values aren't critical; it is more or less a way for all the signals of a system to be watched slowly.  It uses a few things I've posted on LAVA before: my Variant Repository, Array VIMs, and Circular Buffer.  Here is a video.

     

    Circular Graph.vipc
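    The Circular Buffer piece mentioned above is the easiest part to describe outside of LabVIEW: each signal keeps only its newest N samples for display.  Here is a rough Python sketch of that concept (not the LAVA Circular Buffer package itself, just an illustration), assuming a fixed per-signal history size.

        from collections import deque

        class SignalBuffer:
            """Fixed-size history for one signal; old samples fall off automatically."""
            def __init__(self, size):
                self.samples = deque(maxlen=size)

            def append(self, value):
                self.samples.append(value)

            def snapshot(self):
                return list(self.samples)      # what the graph would display

        buf = SignalBuffer(1000)
        for v in range(5000):
            buf.append(v)
        print(len(buf.snapshot()))             # 1000 -- only the newest samples are kept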

  14. There are two major schemes in SCC: Lock-Commit and Merge.  It seems most text-based languages don't bother with Lock-Commit, since an intelligent text merge can be done pretty easily.  Because LabVIEW's VIs are binary, a merge can't really happen at the file level.  The Compare and Diff tools NI has made do help, but I've found them messy at times.  The more rigid approach is Lock-Commit.  This works best when an application has been broken up into sub-modules, most often libraries or classes.  Then multiple developers can work on separate files at the same time, but lock them so that each file can only be edited by one person at a time.  This does take the cooperation of all developers; rules like "Don't lock the whole project" need to be something everyone follows, otherwise you will be calling someone up to tell them to unlock the files they have.  I've gotten used to this over the years, but if you are new to LabVIEW and you have one or two huge VIs, then locking them to one user will cause problems.

    As mentioned, NXG uses XML as the file format (with a few binary blobs when needed), and merging it as text has varying levels of success.  With a relatively complicated XML file, a merge might not always do what the developer expects.

  15. Very neat.  So I wanted to update this to return all monitors, and all window and panel sizes.  I also saw that this was using scripting, which means it won't be available in the run-time engine.  Attached is an updated version (saved in 2018) that I believe does this.  I also added a feedback node to return the previously read data if it is called again, with an optional input for a full refresh.  I did this since I believe changing monitor positions and resolutions after an application has started is rare.  Still, if you do this often you can just wire a True to that input.  Another option would be to use the Elapsed Time, and maybe do a full refresh once every couple of seconds.

    One thing I also removed was passing in a VI reference to get the application instance to use.  I wasn't sure why this was being done, since regardless of the application instance the monitor and panel bounds will be the same.  I realize AQ and Darren often work in private instances; it's just that in this case I didn't think it would matter.  Please correct me if I'm wrong.  I also left in the VI description stating it is thread safe, but I'm unsure if it still is.

     

    Compute Maximum Desktop Bounds Hooovahh Edit.vi
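    As a text-language analogue of the caching approach above (a feedback node that returns the previous result unless a full refresh is requested), here is a minimal Python sketch.  It assumes the third-party screeninfo package for the monitor query, and the function name is made up for illustration; it is not the attached VI.

        from screeninfo import get_monitors

        _cached_bounds = None

        def desktop_bounds(full_refresh=False):
            """Return (x, y, width, height) for each monitor, re-querying only on demand."""
            global _cached_bounds
            if _cached_bounds is None or full_refresh:
                # the expensive query, analogous to the scripting/property reads in the VI
                _cached_bounds = [(m.x, m.y, m.width, m.height) for m in get_monitors()]
            return _cached_bounds

        print(desktop_bounds())        # queried once
        print(desktop_bounds())        # cached result returned
        print(desktop_bounds(True))    # forced full refresh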

  16. Thanks for your contribution.  A couple of things.  Using polymorphic VIs for this type of thing can become a pain pretty quickly.  I had a similar tool for reading and writing variant attributes and used scripting to generate the 60+ data types I supported, but even then there were times when a data type wasn't supported.  It also added 120+ extra VIs (read/write), adding to loading overhead.  The more modern way of doing this is with a VIM that adapts to the data type provided.  Your VIs were saved in 2015, when VIMs weren't an official thing, but you say you use 2018, where they are.  Back then I would have done this data type adaptation with XNodes instead.  Posted here is my Variant Repository, which does a similar read/write of anything, including type def'd enums and clusters.

    Putting these in a global space that any VI can read from and write to is pretty trivial.  All that is needed is a VIG, a functional global variable, or even a global variable in a pinch.  This will keep the data in memory as long as the VI is still in memory and reserved to run.  It has the other benefit of making it easy to load and save the data to a file, since it is all contained in a single place.  Also, with this technique there are no references being opened or closed, so there are no memory-leak concerns.  Performance-wise, I also suspect your method has some room for improvement.  If I am writing 10 variables 10 times each, looking at your code that will mean 100 calls to Obtain Notifier, 100 calls to Send Notification, and 100 calls to Release Notifier.  I suspect reading a variant and then calling Set Variant Attribute 100 times will take less time and processing power.
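    The variant-attribute repository idea maps roughly onto one shared keyed store.  Here is a minimal Python sketch of that concept (the names are made up; this is not the Variant Repository package, just an illustration of why one shared lookup beats an obtain/send/release cycle per write):

        # one module-level store, playing the role of the variant held in a VIG/FGV
        _repo = {}

        def write_variable(name, value):
            # like Set Variant Attribute: keyed write into the shared store
            _repo[name] = value

        def read_variable(name, default=None):
            # like Get Variant Attribute: keyed read, with a default if missing
            return _repo.get(name, default)

        write_variable("Voltage", 12.3)
        print(read_variable("Voltage"))   # 12.3, readable from any caller, no references to leak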

  17. 1 hour ago, Taylorh140 said:

    Let me know if you find a perfect way to handle scroll synchronization. :)

    Far from perfect, but what I have at the moment is this: on a Mouse Move, Mouse Down, or Mouse Wheel event (with the event limit set to 1, just like you), read the Top Left Visible Cell.  This gives the Tag and Column Number.  Using a second property node, write the Active Item Tag and Active Column Number to what was just read, and then read the Active Cell Position Left.  I then use a feedback node to see if the Top Left Tag, Column, or Cell Position Left has changed since the last time the event fired.  If it hasn't, do nothing.  If it has, then go do what it takes to either shift the current image up and down, or left and right.  Left and right are property nodes on the controls to move them, and the value can stay the same.  As for shifting the image up and down, I use Norm's low-level code found here, which is pretty fast, but I did have to add a couple of op codes to it since his post.  Alternatively, you might be able to set the image origin to get the same effect, but that uses a property node, as opposed to the value of the image control.
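    Since that logic is spread across several property nodes, here is a compact Python sketch of just the change-detection flow.  All of the names are invented stand-ins for the property nodes described above, not a real API, so treat it as pseudocode that happens to run.

        class TreeProps:
            """Invented stand-in for the tree control's property nodes."""
            def __init__(self):
                self.top_left_visible_cell = ("Signal 1", 0)   # (tag, column number)
                self.active_item_tag = ""
                self.active_column_number = 0
                self.active_cell_position_left = 0

        _last_state = None        # plays the role of the feedback node

        def on_scroll_event(tree):
            """Run on Mouse Move / Mouse Down / Mouse Wheel (event limit 1)."""
            global _last_state
            tag, column = tree.top_left_visible_cell       # what the user scrolled to
            tree.active_item_tag = tag                     # re-target the active cell
            tree.active_column_number = column
            left = tree.active_cell_position_left
            state = (tag, column, left)
            if state == _last_state:                       # nothing actually moved: do nothing
                return
            _last_state = state
            print("shift graph image to match", state)     # stand-in for the image-shifting code

        on_scroll_event(TreeProps())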

     
