
Neil Pate

Members
  • Content Count

    855
  • Joined

  • Last visited

  • Days Won

    62

Neil Pate last won the day on August 4

Neil Pate had the most liked content!

Community Reputation

284

6 Followers

About Neil Pate

  • Rank
    The 500 club

Profile Information

  • Gender
    Not Telling

LabVIEW Information

  • Version
    LabVIEW 2013
  • Since
    2004


  1. And it is working! Thanks @Darren
  2. @gb119 I am trying to use your HMAC-SHA256 to generate a token to allow me access to an Azure IoT Hub. I have a snippet of C# code that works fine and I am trying to translate it into LabVIEW, but without much luck. I don't really understand the HMAC stuff properly. Are all SHA256 HMAC implementations the same? The LabVIEW code above is mostly correct; the decoding of the URI is not quite the same, but I manually change this to match the C# code. I have verified that stringToSign is identical in both implementations, so something is going wrong after that point. The string coming out of the HMAC-SHA256 is completely different from the C# output and does not look like a Base64 string at all. My knowledge of this kind of stuff is not good, so I am probably missing something super obvious. Any tips? PS: apologies for resurrecting this old thread, I should certainly have started a new one.
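To answer the question in the post above: yes, all correct HMAC-SHA256 implementations produce identical bytes for an identical key and message, so a mismatch with the C# sample is almost always an encoding problem. The two usual culprits are forgetting to Base64-decode the shared access key before using it as the HMAC key, and Base64-encoding a hex string instead of the raw digest. A minimal Python sketch of the standard Azure IoT Hub SAS token recipe (function name and parameters are illustrative, not NI's or Microsoft's API):

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri, b64_key, policy_name=None, ttl_secs=3600):
    """Build an Azure-style SAS token. b64_key is the Base64-encoded shared
    access key as copied from the portal."""
    expiry = int(time.time()) + ttl_secs
    # stringToSign = url-encoded resource URI, newline, expiry timestamp
    string_to_sign = urllib.parse.quote_plus(resource_uri) + "\n" + str(expiry)
    # Key must be Base64-DECODED before signing; sign the UTF-8 bytes.
    digest = hmac.new(base64.b64decode(b64_key),
                      string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    # Base64-encode the RAW 32-byte digest (not a hex rendering of it).
    signature = urllib.parse.quote(base64.b64encode(digest), safe="")
    token = "SharedAccessSignature sr={}&sig={}&se={}".format(
        urllib.parse.quote_plus(resource_uri), signature, expiry)
    if policy_name:
        token += "&skn=" + policy_name
    return token
```

If the LabVIEW output is twice as long as expected and contains only 0-9/a-f, it is a hex string and needs converting to raw bytes before the Base64 step.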
  3. Another weird thing. When I close a project it often does not close the open VIs that are part of the project! Maybe my installation is just busticated.
  4. Unfortunately I really need the TLS feature of TCP/IP connections so am going to have to lump it. I agree though 2020 feels like one of the "skip" releases.
  5. Can anyone shed some light for me on the best practices for the FIFO Acquire Read Region technique? I have never used this before; I have always just done the usual trick of reading zero elements to get the number of elements in the buffer and then reading only if there are enough elements for my liking. To my knowledge this was a good technique, and I have used it quite a few times with no actual worries (including in some VST code with a ridiculous data rate). This screenshot is taken from here. Is this code really more efficient? Does the Read Region call block with high CPU usage like the Read FIFO method does? (I don't want that.) Has anyone used this "new" technique successfully? For reference, this is my current technique:
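The zero-element-read trick described above can be sketched generically. The `HostFifo` class below is a toy stand-in for the host-side DMA FIFO (it is not NI's API); the point is the pattern: a zero-element read is a pure size query that never blocks, and the follow-up read only asks for what is already buffered:

```python
from collections import deque

class HostFifo:
    """Toy stand-in for a host-side DMA FIFO, purely to illustrate the
    polling pattern. Real FPGA host code would use the NI FIFO nodes."""
    def __init__(self, items=()):
        self._buf = deque(items)

    def read(self, n):
        """Return (elements, elements_remaining), loosely mirroring the
        Read FIFO node. Reading zero elements just reports the count."""
        taken = [self._buf.popleft() for _ in range(min(n, len(self._buf)))]
        return taken, len(self._buf)

def poll_and_read(fifo, wanted):
    # Zero-element read: learn how much is buffered without blocking.
    _, available = fifo.read(0)
    if available >= wanted:
        data, _ = fifo.read(wanted)  # satisfied immediately, no waiting
        return data
    return None  # not enough yet; poll again on the next loop iteration
```

The Acquire Read Region method differs in that it hands back a view into the DMA buffer rather than copying elements out, which is where any efficiency gain would come from; the polling structure around it can stay the same.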
  6. Following up from some of the information here. This is the bug I am seeing regularly in LV2020. When I close a project I never want to defer these changes; I am used to this option being "don't save changes". See the video, it shows clearly what is happening. As can be seen from the video, if I re-open the same project from the Getting Started Window it opens instantaneously, which is further proof that it is not actually closed. I really hope this is not a new feature; this is really dangerous behaviour, as you think the project is closed and so go and commit files or whatever. This has been reported to NI, no CAR as yet. Anyone else seeing behaviour like this? 2020-07-25 18-46-58.mkv
  7. Never seen this kind of thing running on bare metal across many versions of LV. I do often lose my right Control key, though, if I use VirtualBox, as I think that key is reserved for other stuff.
  8. I am about to reimplement my System Status actor from scratch. This time, though, I am staying far away from the RT Get CPU Load and am going to try to read it from the Linux command line (maybe "top" or similar). Urgh...
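On a Linux RT target, one alternative to shelling out to "top" is to read `/proc/stat` directly and compute load from the delta between two samples. A hedged sketch (assumes a Linux kernel; the function name is illustrative):

```python
import time

def cpu_load(sample_secs=0.25):
    """Overall CPU load as a 0.0-1.0 fraction, from two /proc/stat samples.
    Aggregate "cpu" line fields: user nice system idle iowait irq softirq ..."""
    def snapshot():
        with open("/proc/stat") as f:
            fields = [int(x) for x in f.readline().split()[1:]]
        idle = fields[3] + (fields[4] if len(fields) > 4 else 0)  # idle + iowait
        return idle, sum(fields)

    idle1, total1 = snapshot()
    time.sleep(sample_secs)
    idle2, total2 = snapshot()
    dt = total2 - total1
    return 0.0 if dt == 0 else 1.0 - (idle2 - idle1) / dt
```

This could run via System Exec or a small helper script on the target; per-core figures work the same way using the `cpu0`, `cpu1`, ... lines.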
  9. How frequently? I must say I cannot recall seeing this, certainly not regularly enough for it to be a nuisance. Can you post a video?
  10. Sorry, you are right. Bug report submitted.
  11. More weirdness here. When I close a project it offers the option to defer, and now only prompts me to save when I actually close the Getting Started Window. What I normally want is just to close the project immediately and ignore any changes (you know, like every other piece of software on the planet). NI, is this really a new "feature"?
  12. Seriously hate the class mutation history. What a terrible feature.
  13. Currently the only code I have in the My Computer target is the helper stuff I use to reset the class mutation history. I did not think I could run that from the RT context. It is so weird, as I have VIs that have nothing to do with the RT Load VI, yet they try to load it when I save them. Like somehow it might be in the mutation history (which it might be, as I did clone an actor that had some stuff like that).
