Posts posted by crossrulz

  1. 33 minutes ago, Rolf Kalbermatter said:

    Definitely agree that it should have been done like that many moons ago already. But the old way was in some sense easier, as it needed limited testing, and maybe there was also a bit of intention behind it: if you wanted to use new hardware you had to install the new driver, which also made a new IDE version necessary if the previous one was too old. So a bit of a hidden upgrade nudge. Now, with the new NI structure where the different software departments are very much decoupled and independent from each other, it proved to be an unmanageable modus operandi. It's not feasible anymore to have to pester the DAQmx (and cRIO and System Control and, and, and) folks to release a new version of their software simply because there is a new LabVIEW version.

    In my opinion, part of the issue was getting confidence in the "separate from compiled" feature.  I would have to dig into some history to figure out when NI started using it for vi.lib, but shortly after that it would have made a lot of sense to push it into the drivers.  So if you really want to play the hindsight game here, it is something that should have been feasible around 5 years ago.  But NI was putting all of its marbles into NXG at that time.

    Regardless, I am happy about this feature, and I look forward to being able to avoid updating drivers purely for "new LabVIEW version support" with no other changes.

  2. We got some more clarification on the "decoupling driver versions" on the public beta forum: https://forums.ni.com/t5/LabVIEW-2022-Public-Beta/LabVIEW-2022-Beta-New-Features/m-p/4234011#M49

     

    In short, there will be a new common location where drivers install their LabVIEW support, and LabVIEW will look there.  They will depend on the "Separate From Compiled" setting to allow the VIs to be used by multiple versions of LabVIEW.

  3. 4 hours ago, Rolf Kalbermatter said:

    T&M was good for steady revenue, but nothing that would stand out in the yearly shareholders' report. It was unsexy for non-technicals and rather boring. That was one of the big reasons to separate HP into multiple companies: an attempt to create smaller entities that targeted a specific market segment and could be fine-tuned in their sales and marketing efforts to the needs of that market.

    HP spun off its T&M business into Agilent, which grew into the life sciences industry.  Agilent then spun off T&M again into Keysight.

    Personally, I'm expecting NI to spin off the T&M part of their business into a new company in the next 5-10 years.  Whether or not LabVIEW is part of that, I have no clue.

  4. Your DAQ sample rate is 1 sample per second, so you need to make sure you are pulling the data off at least at that rate.

    1. Again, change your Timed Loop to a normal While Loop.  It is hurting you more than helping.

    2. Change your DAQmx Read to be N Channel 1 Sample.  This will simplify things a little as you will no longer need the 5 functions just to get a single value out.  You will just need to use Index Array (expand to 2 elements so you can get both channels).

    3. You need to clear the task once the loop is done.  (A rough sketch of the whole flow is shown below.)
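
    For anyone following along outside of LabVIEW, here is a rough sketch of that overall flow using NI's nidaqmx Python API as a text stand-in for the block diagram.  The device name ("Dev1"), the channel range, and the fixed iteration count are assumptions for illustration only:

      import nidaqmx
      from nidaqmx.constants import AcquisitionType

      with nidaqmx.Task() as task:                            # task is cleared when the block exits (step 3)
          task.ai_channels.add_ai_voltage_chan("Dev1/ai0:1")  # two AI channels (assumed device/channel names)
          task.timing.cfg_samp_clk_timing(rate=1.0,           # 1 sample per second, as in the post
                                          sample_mode=AcquisitionType.CONTINUOUS)
          task.start()

          for _ in range(60):                                 # a plain loop, not a Timed Loop (step 1)
              ch0, ch1 = task.read()                          # N Channels 1 Sample: one value per channel (step 2)
              print(ch0, ch1)

    The "with" block is what takes care of step 3: the task is stopped and cleared automatically once the loop finishes.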

  5. 23 minutes ago, Neil Pate said:

    Yes I know. But this is LabVIEW, which does not have null-terminated strings. That is what makes this strange.

    Yeah, but you are talking about pasting into other applications.  How are they interfacing with the clipboard?  We have no control over that.  You will see this same issue if you try to interact with a DLL that uses C-style strings (data will be cut off at the NULL character).

    I should also state that the NULL is copied over if you paste into another LabVIEW string control (yes, I just tested it).
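
    To make the cut-off concrete, here is a minimal Python/ctypes sketch (the byte values are made up) of how any consumer that treats the same bytes as a C-style string stops at the first NULL:

      import ctypes

      data = b"before\x00after"                           # a string with an embedded NULL
      buf = ctypes.create_string_buffer(data, len(data))  # raw bytes, no extra terminator added

      as_c_string = ctypes.cast(buf, ctypes.c_char_p).value
      print(as_c_string)   # b'before'          -- a NULL-terminated consumer loses the rest
      print(data)          # b'before\x00after' -- a length-aware consumer keeps it all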

  6. 11 hours ago, ShaunR said:

    I wouldn't put 99.999% of any code written today on a satellite, and that includes my own.

    Go back 20 years and I would swear that any code that was written had zero bugs; otherwise it would not have been published.

    The reliance today on "updates" and "beta" versions is a detriment to all software. It's like psychologists creating models of human behaviour and expecting them to model real life... with updates.

    Call me cynical but I think software robustness is a far-cry from what it used to be. But that's progress, right? 

    Code is also orders of magnitude more complicated now than back then: more computational power is available, so more features are possible.  More code just increases the probability of bugs.

     

    About 15 years ago, I was peripherally involved in testing the code for a $50k FPGA in a satellite.  The amount of testing that code went through was absolutely insane, including days of just Reed-Solomon test cases.

  7. 3 hours ago, pawhan11 said:

    - No significant new features and roadmaps, unknown future of the language. They killed NXG and announced it in a forum post!...  Hope this new NI Connect will change my mind.

    What do you mean "no significant new features"?  2017: Malleable VIs, 2019: Maps and Sets, 2020: Interfaces

    I'll give you the rest though.  I have about as much hope in NI Connect as I did during NI's marketing event last year.  And NI's communication lately has been lacking at best.

  8. 4 hours ago, Jordan Kuehn said:

    Yes. 
     

    edit// my apologies. I’m realizing now that I was combining that excellent event with the CLA event in my mind. Perhaps it could happen again in parallel?

     

    I think you are referring to the GLA Summit that happened in November.  VI Week was in May.  Yes, everything changed last year...

  9. 1 hour ago, Jordan Kuehn said:

    Hmm. Well the stuff that made last year’s great was it was largely community driven with some NI involvement and almost no fluff or upward trending graphs without axis labels/scales...

    I'm left to assume you are referring to VI Week, which was put together in a week or two, completely by the community.

  10. What I typically do in this situation is use a QMH that manages the hardware.  The control loop sends a message to the device loop using a queue.  The device loop then does whatever needs to be done and sends a reply back to the control loop.  If no messages are coming in (the queue read has a timeout), the device loop can collect data.  This isolates the device to a single loop, so only that loop needs the object.

    The alternative is to use a Data Value Reference (DVR) to store the object.  The In Place Element Structure then acts as the boundary so that two operations cannot happen at the same time.  (A sketch of the queue-based approach follows below.)
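
    Here is a very rough, text-only sketch of that queue-based pattern.  The Instrument class, the command names, and the 100 ms timeout are all invented for illustration:

      import queue
      import threading

      def device_loop(commands, replies, instrument):
          """Only this loop ever touches the instrument object."""
          while True:
              try:
                  msg = commands.get(timeout=0.1)        # wait briefly for a command
              except queue.Empty:
                  instrument.collect_data()              # timeout case: just keep acquiring
                  continue
              if msg == "stop":
                  break
              if msg == "read":
                  replies.put(instrument.read_value())   # reply back to the control loop

      # Control-loop side (hypothetical Instrument object):
      # commands, replies = queue.Queue(), queue.Queue()
      # threading.Thread(target=device_loop, args=(commands, replies, Instrument())).start()
      # commands.put("read"); value = replies.get()
      # commands.put("stop")

    The important part is that the instrument reference never leaves device_loop; everything else talks to it only through the queues, which is what prevents two operations from colliding.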
