
Neil Pate



Everything posted by Neil Pate

  1. Everyone is in for a wild ride with NXG. For extra programming efficiency, speed, and consistency, the data has shown it is actually better to have two totally different context menus that appear at the same time. Even better, some have pictures, some have text. Take that, brain training! Really looking forward to NXG 6.0; we might have 3 different menus!
  2. It's a real shame that more of this stuff is not open to viewing. I often wonder how something is done and try to peek inside the VI. Often it is just a bunch of CLN calls, which is fine; I poke around to see what other functions are exported in the DLL, but that is about it. I get bored after a short while and have bills to pay, so I have to move on to real work. We know if we screw with undocumented stuff it will break. No API is perfect, but at least let us see!
  3. The more I think about hooovahh's idea the more I like it. Maybe I am thinking about this all wrong and should just embrace the dynamic IP address issue. As long as my cRIO can talk to my cloud then it can store its own IP address somewhere in there. Is this how the IoT hubs work? Like azure IoT?
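Sketched below is what that phone-home idea could look like: the cRIO periodically publishes a small record of its current address to a cloud store, and the dev machine looks it up there instead of needing a static IP. Everything here (field names, device IDs, the upload step) is just my own illustration, not any vendor's API:

```python
import json
import time

def build_registration(device_id, public_ip):
    """Build the record a device would periodically push to the cloud.

    Field names are illustrative; an IoT hub would typically carry this
    as a device twin / shadow property update instead.
    """
    return {
        "device_id": device_id,
        "ip": public_ip,
        "timestamp": int(time.time()),
    }

# 203.0.113.x is a documentation-only IP range, i.e. made up on purpose.
record = build_registration("crio-01", "203.0.113.17")
payload = json.dumps(record)
# In a real system this payload would be POSTed to the cloud endpoint;
# the dev PC then queries the same store to discover the device's address.
```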
  4. Thanks everyone for the info. This sounds like the kind of thing that is easy to screw up and I cannot really afford that. I have got one chance to get my system right; it will be deployed far, far away. Does anyone have any recommendations for companies that offer this kind of advice as a (paid-for) consultancy? I can do all the LabVIEW development myself, but I need good solid advice on choice of hardware and basically IoT-related stuff like MQTT and the pros and cons of the different IoT platform cloud vendors.
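For what it's worth, here is the kind of MQTT topic/payload scheme I have in mind for the cRIO; the topic layout and field names are purely my own illustration, not any vendor's convention (Azure IoT Hub, for example, imposes its own fixed topic structure for device-to-cloud messages):

```python
import json

# Illustrative site prefix for the MQTT topic hierarchy (an assumption).
SITE = "site42"

def telemetry_topic(device_id):
    """Topic a device publishes telemetry on, e.g. 'site42/crio-01/telemetry'."""
    return f"{SITE}/{device_id}/telemetry"

def telemetry_payload(temperature_c, pressure_kpa):
    """JSON payload for one telemetry sample; field names are made up."""
    return json.dumps({"temp_c": temperature_c, "press_kpa": pressure_kpa})

topic = telemetry_topic("crio-01")        # "site42/crio-01/telemetry"
payload = telemetry_payload(21.5, 101.3)
# A client library (e.g. paho-mqtt) would then publish(topic, payload)
# to whichever broker / IoT hub is chosen.
```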
  5. Wow that answer is incredible, so much to digest. Thank you so much for your insight, I clearly have a huge amount to wrap my head around.
  6. Hi all, I wonder if anyone can share some advice for me. I am working on a new project that is the pretty standard cRIO + Windows PC combination. The network topology is not super complicated and I have tried to diagram it. The use case I am trying to solve is this: How can I connect to the cRIO and the Windows PC from my dev PC which is connected via the internet and a mobile phone network? Phrased another way, how do I assign static, internet facing IP addresses to the cRIO and Windows PC? In my diagram all the IP addresses are totally made up, but are just to prove a point. I
  7. I also use the Android emulator, but normally for nothing more complicated than multiplication 🙄 (tragically this is the life of a 40 year old engineer...) I still love RPN though, even if it is just for addition and multiplication! My wife actually has the next one up, I think it is the HP49G. It has the rubber keys and is terrible! I love the solid clicky hinged keys on mine. One of the most terrifying moments of my life was when I was taking a leak before an exam in 4th year EE and I dropped my calculator which bounced off the concrete floor. I had to finish my pee, then slowl
  8. Anybody remember how to use one of these? About 23 years ago I was pretty handy with mine, and today I could not even figure out how to convert degrees to radians! Has there been a better RPN-compatible calculator since then?
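For anyone whose RPN is as rusty as mine, here is a toy evaluator sketch: operands push onto a stack and operators pop their arguments off it. The token names, including a `deg` token for degrees-to-radians, are just my own invention, not any HP keystroke:

```python
import math
import operator

# Binary operators: pop two operands, push the result.
OPS = {
    "+": operator.add,
    "-": operator.sub,
    "*": operator.mul,
    "/": operator.truediv,
}

def rpn(tokens):
    """Evaluate a list of RPN tokens, e.g. ['3', '4', '+', '2', '*']."""
    stack = []
    for tok in tokens:
        if tok in OPS:
            b = stack.pop()
            a = stack.pop()
            stack.append(OPS[tok](a, b))
        elif tok == "deg":          # unary: degrees -> radians
            stack.append(math.radians(stack.pop()))
        else:
            stack.append(float(tok))
    return stack.pop()

# "180 deg" -> pi; "3 4 + 2 *" -> 14.0
```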
  9. @Thoric I have seen similar behaviour before. There are two scenarios I have seen. 1. It turned out that some code I had naughtily left in a diagram disable structure was "out of date". If I recall it was a class method that was broken or perhaps the class had "stale" data in its history. Resetting the class mutation history sometimes fixed this. 2. Are you sure you don't have any broken methods (like perhaps a test or prototype VI in a class somewhere that you never got around to updating) in the project? This kind of error is hard to track down as everything will run fine in L
  10. I feel your pain. I did a long private session about two years ago with some people inside NI where I showed them my current workflow and how I did not think it was going to work in NXG. Nothing at all changed as a result of that feedback.
  11. A pack of hungry wolves could not hold me back. I am just waiting for the thread Jeff said he was going to make.
  12. It was not done under a work-for-hire arrangement; the product is completely my own IP (apart from the third-party toolkits I use, some of which are closed-source). Thanks for all the advice everyone. I do want to release this now, I just need to find time to tidy up and strip out stuff I cannot distribute.
  13. Bullet 1: yes, that is fine. I really don't expect anyone to get rich from it; if they do I would hope they would feel bad enough to give me something back (but if they didn't this would not bother me too much). Bullet 2: that would be nice, but given the nature of this community I would be quite surprised if it happened enough to worry about. Bullet 3: no, that I don't want. BSD-3 sounds like a good compromise. My understanding of the industry after being in it for approaching 20 years is that nobody gets rich quickly and without a lot of hard work. I doubt any of the toolki
  14. Thanks Jeff. All of us here just want NXG to be awesome, so anything we can do to help get it right (in our opinion) I am sure we will do.
  15. Never heard of that license. It pretty accurately sums it up though. Thanks!
  16. I don't really want to hand-edit 700 VIs. As I said, if I do release it I really don't mind what others do to it. Sorry my question was a bit vague; I actually wanted to ask if I *have to* put a license on every VI. I would be more than happy just putting the "do whatever you want" license (probably the MIT License) on the GitHub page if that was sufficient for me to not get sued if someone's heart-lung machine stops working due to my code.
  17. Hi gang, I have been working on an application recently and as an experiment I am thinking about open sourcing the whole shebang. Highlights of the application are as follows: multiple-camera vision inspection and analysis; analysis of images using traditional NI Vision and also GPU-accelerated TensorFlow; storing of detected objects in a SQL DB; serving up access to this DB via a simple WebService; an NXG WebVI implementation of a simple data dashboard; uses my home-grown lightweight actor framework (not NI AF) and includes a simple subpanel-based GUI a
  18. Yes but the last two links are what make this tragically ironic...
  19. Yeah as others have pointed out, the nodelay technique is for getting millisecond type response times. Something else is going on in your system.
  20. I am pretty sure it is on a per socket basis. Setting that should have no effect on any other applications.
  21. In order to resize an array you have to click on the tiny dot in the middle of the drag handle that only appears if your mouse is leaving the control and is some weird distance away from the border. In other words, in this picture, in order to get the resize drag handle to appear I have to move my mouse over the control, then out a little bit into the middle of space with no visual indication of how far out to move it until the horizontal bar appears, and then find the tiny dot in the middle and then drag that. 🤮 How could this UX make it through 8 years of development?
  22. As seen in the other thread, I am pretty negative about NXG. However I would like to temper this with some things I have seen in NXG which I do like. Vector-based GUI: I don't like the choice of colours, fonts, widgets or pretty much any of the iconography, but I do think the switch to vectors is an absolute necessity for a modern application running in high DPI settings. Zoom! I was pleasantly surprised when I accidentally zoomed on the Front Panel. In current gen, to get pixel-perfect GUIs sometimes I find myself resorting to the Windows magnifier or dropping down to 1024x768, so b
  23. I do not know the details of this messaging protocol, but I can say that if the TCP/IP transmit packet sizes are small, Windows will by default wait a bit to see if there is any more data to be sent so it can transmit it a bit more efficiently (this is Nagle's algorithm). See here. This is usually more annoying than helpful. Happily it can be turned off, on a per-socket basis. Take a look at the TCP_NoDelay VI in this library. I did not write it and I am not entirely sure where I got it from, but it works really nicely. Win32 Util.zip
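To illustrate what that TCP_NoDelay VI is doing under the hood, here is a minimal sketch (in Python rather than LabVIEW, just to show the socket option). Note that TCP_NODELAY is a per-socket option, so setting it here affects only this socket, not other connections or applications:

```python
import socket

# Create a TCP socket and disable Nagle's algorithm on it, so small
# writes go out immediately instead of being coalesced by the OS.
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)

# Read the option back; a non-zero value means Nagle is now off
# for this socket only.
nodelay = s.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY)
s.close()
```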
  24. Fair point. I will do a mass compile and re-run the experiment. (Sorry ShaunR, I guess you were also right)
