Posts posted by hooovahh

  1. I'd like to see some probes just to make sure those are the actual values going in, since the data might not be what you think it is.  Assuming that's fine, I'd check whether it works on a new file rather than one that already exists with a different data type for that group/channel pair.  Also, the reference is obviously valid, but was it opened as "Open Read-Only"?  If this is a subVI, I'd unit test it by writing a quick test that creates a file and writes to the channel.  Nothing obvious is wrong with the code you've shown, which makes me think the issue is somewhere else.  Oh, and is your timestamp array empty?  Or is an error being generated from that waveform function?  I think the TDMS function should just pass the error through, but I'd probe it to make sure.
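    The unit-test idea above (always start from a brand-new file so leftover data types can't interfere) can be sketched in Python as an analogy; this is not the TDMS API, and the writer/reader here are made-up stand-ins:

```python
import os
import tempfile

def write_channel(path, group, channel, values):
    """Hypothetical stand-in for a TDMS channel write:
    appends one line per value under a group/channel key."""
    with open(path, "a") as f:
        for v in values:
            f.write(f"{group}/{channel},{v}\n")

def read_channel(path, group, channel):
    """Read back the values written for one group/channel pair."""
    key = f"{group}/{channel},"
    with open(path) as f:
        return [float(line[len(key):]) for line in f if line.startswith(key)]

# Unit test: create a fresh file in a temp directory, write, read back.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "log.txt")
    write_channel(path, "Group", "Chan0", [1.0, 2.0, 3.0])
    result = read_channel(path, "Group", "Chan0")
```

The point is the test harness, not the file format: a fresh file per test run rules out the "existing channel with a different data type" failure mode.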

  2. Well, for me it would be a bit more complicated since I have Pre/Post Install and Uninstall VIs, and some of those VIs depend on VIs in some of the packages.  So install order matters: dependencies would need to be checked so the order is right, the Pre/Post Install steps run in the right order, and the right inputs (Package Name, Install Files, etc.) get fed into the variant input of those VIs.  Still possible, but it adds some complications.  Which is why, to date, I still just install VIPM and let it handle it.
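    The install-ordering problem described above is a topological sort; Python's standard library has one in `graphlib` (the package names and dependency map here are made up for illustration):

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical dependency map: package -> set of packages it needs first.
deps = {
    "tdms_lib": {"array_tools"},
    "array_tools": set(),
    "installer_support": {"tdms_lib", "array_tools"},
}

# static_order() yields each package only after all of its
# dependencies, which is exactly the install order we want.
install_order = list(TopologicalSorter(deps).static_order())
```

A real installer would also detect cycles (`TopologicalSorter` raises `CycleError`) and resolve version constraints, which is a big part of what VIPM does for you.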

  3. Yeah, sorry for the confusion.  I don't have an official release process in place for this community stuff, and I really should.  I did make a newer version of this for the TDMS package and just included it in it, because I wanted to make consuming my code as easy as possible for other developers.  But with the efforts of VIPM and GCentral, I need to evaluate the best way to share, and hold myself to that process to avoid confusion in the future.

  4. I think we're at the point where, to help you better, you need to post the actual VIs.  I suspect there are lots of coding patterns indicative of a beginner LabVIEW programmer, and I can't make blanket statements like "Don't do this" without being able to see more of what you are trying to do.

    Like, my initial reaction is that timed loops shouldn't be used except on RT or FPGA targets.  I also see lots of booleans whose value changes I would suggest capturing with an event structure.  Maybe putting them all into a single cluster and capturing that cluster's value change would be better.

    Other suggestions are to use a state machine or a queued message handler.  These usually carry a cluster of data between cases, which can be thought of as something akin to variables.
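    The queued-message-handler pattern above can be sketched in Python as an analogy (a dict stands in for the state cluster, and the message names are made up):

```python
from queue import Queue

# "Cluster" of state carried between cases, modeled as a dict.
state = {"count": 0, "log": []}

# Messages would normally come from an event structure / producer loop.
q = Queue()
for msg in ["Init", "Increment", "Increment", "Exit"]:
    q.put(msg)

# Consumer loop: one case per message, shared state "wire" throughout.
while True:
    msg = q.get()
    if msg == "Init":
        state["count"] = 0
    elif msg == "Increment":
        state["count"] += 1
    elif msg == "Exit":
        break
    state["log"].append(msg)
```

In G the queue, the case structure, and the state cluster on a shift register play exactly these three roles.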

  5. Years ago, as a proof of concept, I made a LabVIEW program that installed packages without needing VIPM installed.  As Michael said, the files are zips with a specific structure: a spec file defining where the files should be placed, and optional pre/post VIs to run.  Obviously VIPM can do more, like track down dependent packages from the network, resolve version information, and a bunch of other things.  But if you just have a VIPC with all the packages you need to install, it probably wouldn't be too hard to make a LabVIEW program that installs it.
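    The "zip plus spec file" structure described above can be sketched in Python; the spec format here is invented for illustration (real VIPM package specs are much richer):

```python
import os
import tempfile
import zipfile

with tempfile.TemporaryDirectory() as root:
    # Build a toy "package": a zip holding a payload file plus a spec
    # line saying where the payload should land.
    pkg = os.path.join(root, "demo.vip")
    with zipfile.ZipFile(pkg, "w") as z:
        z.writestr("spec", "payload.vi=user.lib/demo/payload.vi\n")
        z.writestr("payload.vi", "fake VI contents")

    # Minimal "installer": read the spec, place each file accordingly.
    with zipfile.ZipFile(pkg) as z:
        for line in z.read("spec").decode().strip().splitlines():
            src, dst = line.split("=")
            target = os.path.join(root, dst)
            os.makedirs(os.path.dirname(target), exist_ok=True)
            with open(target, "wb") as f:
                f.write(z.read(src))
        installed = os.path.exists(
            os.path.join(root, "user.lib", "demo", "payload.vi"))
```

Pre/post-install VIs would run before and after the extraction loop; dependency resolution would wrap the whole thing.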

  6. 2 hours ago, Antoine Chalons said:

    I was wondering if anyone has seen similar situations?

    I haven't, and my initial reaction is that I would quit a job like the one you described.  The closest I came was when my boss sat me down to show me his 5-year plan for the group, since he had heard I was planning on quitting soon and hoped his new road map would inspire me to stick around.  I thanked him profusely for showing me his plans, because it meant I knew I was making the right decision by leaving.  I quit, he started steering toward an iceberg, and he was fired about a year later.  I was contacted by a headhunter saying a job opportunity was a perfect fit for me.  I had to inform him I'd already had that job, and listed the reasons I wouldn't go back.  Someone else who was still there tried getting me to come back, only to realize they were one of the reasons I left.  Things have not sounded all that great since.


  7. So there are plenty of improvements I can suggest for readability, but overall I think the code isn't bad.  A few comments would help, and I'd prefer fewer feedback nodes in favor of code that calls it in a loop, but to be fair that might just have been what you did to demo it.  I think all that is needed is to try not to use the dynamic data type.  It can do unexpected things, and can hide some of what is going on inside those functions that you need to open the configure dialog for, which makes the code less readable.  So I'd get rid of the To Dynamic Data, and instead use the Array Subset function on the 2D array of doubles.  This should let you get just the two columns you want and write those to the TDMS file.  You may have to mess with the decimation, and possibly transpose the 2D array.  I can never remember how those functions work, so I'm often writing test code to see if the columns end up the way I want.  This will mean, of course, that you also use Array Subset on the 1D array of strings holding the channel names.
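    The column-slicing and transposing steps above can be sketched in Python as an analogy (lists of lists stand in for the 2D array; the channel names are made up):

```python
# 2D data: rows are samples, columns are channels.
data = [
    [0.0, 10.0, 20.0, 30.0],
    [1.0, 11.0, 21.0, 31.0],
    [2.0, 12.0, 22.0, 32.0],
]
names = ["Time", "Ch A", "Ch B", "Ch C"]

# Keep only columns 1 and 2 (the "Array Subset" step), and take
# the matching slice of the channel-name array.
wanted = slice(1, 3)
subset = [row[wanted] for row in data]
subset_names = names[wanted]

# TDMS wants one 1D array per channel, so "transpose" the subset
# from row-major samples into per-channel arrays.
channels = list(map(list, zip(*subset)))
```

Writing a tiny test like this is exactly the "test code to see if the columns end up the way I want" step: assert on a known input before wiring it to the real file.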

    As for the time entry, it looks like it works, but might I suggest a different string format?  Here is a post that shows how to use Format Into String with a timestamp input to produce an ISO standard string.  It's more standard, and if a log is moved from one timezone to another, it is still clear when the log was made.
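    The same idea in Python, for comparison: an ISO 8601 string with an explicit UTC offset stays unambiguous no matter where the log file travels.

```python
from datetime import datetime, timezone

# A fixed, timezone-aware timestamp (example value).
ts = datetime(2020, 7, 1, 13, 30, 5, tzinfo=timezone.utc)

# isoformat() emits the ISO 8601 form, offset included.
iso = ts.isoformat()
```

A plain "MM/DD/YY HH:MM" string carries no offset, so the same log read in another timezone is ambiguous; the ISO form is also sortable as text.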

  8. Yeah, targeting upper management would likely result in more sales.  But it may also result in fewer engineers willing to use NI and their products, having been force-fed the wrong, or less appropriate, tool for the job.  I worked with a group that had Veristand 2009 forced on them by upper management.  NI came in with a dazzling presentation and convinced them to move their dyno control systems to it and the PXI platform.  They did so without consulting the engineers, and as a result there was a daily battle where I would repeat "Rewrite Veristand" whenever some task that was trivial in LabVIEW couldn't be done in it.  Veristand 2009 ended up being somewhat of a flop, and from what I remember nothing ported to 2010 since NI made some major rewrites of it.  We eventually got an apology from NI which basically amounted to: sorry we gave you the impression this product was more usable than it actually was.

    Luckily my current boss knows NI and their offerings, but trusts my opinion over theirs.  They need to convince me something is a good fit before they will convince him.  They have some new test platforms that directly align with our business (battery test in automotive), and so far I'm unimpressed except when it comes to the hardware.

  9. 4 hours ago, drjdpowell said:

    So... did anyone have a 1-on-1 interview with NI? 

    I too had one.  The day of the meeting I installed NXG, went through all the old issues I had with it years ago, and tried evaluating which things have changed and which haven't.  I made a bunch of notes and mentioned them in the meeting.  I've since started formalizing these lists and giving feedback to NI, most of the time linking back to where the complaints were first mentioned.


  10. I do remember discovering the Window Monitor and Ned windows semi-independently.  Someone posted a way to show the Heap Peek window with CTRL+Shift+D+H, and I figured that was likely D for Debug and H for Heap.  So why not try D+<every letter combination>?  At the time I think only H, N, and W did anything.  When I want a window's HWND, I usually use a private function that lets me do what I want, and I've never needed the HWND of any other window, so the Window Monitor wasn't all that useful to me.  The Ned windows unlocked more useful information, but really just for curiosity's sake.  Also, I find it really interesting that Heap Peek has been in LabVIEW for such a long time with seemingly little or no updates.


  11. As I mentioned in that thread, tunnels out of a loop and case selector terminals are the major issues I see with it.  I'd estimate I've wanted to create a constant, control, or indicator from those terminals maybe 5% of the time.  The other 95% I want to change the tunnel options or make that data type the selector input.  I was told in the beta that I was wrong, and that users want to create one of those things more than anything else, in all situations.  I mentioned this in that thread, and Darren at least acknowledged that in some cases the developer likely doesn't want to create anything, but that being consistent is important.  If being consistent is important, then I'd say that's a vote to get rid of the Auto-tool.  Hover here to get this tool... no, not like that, click here first.  Or just tab and get what I want.  I'm still primarily in 2018 but consistently have issues in 2020.

  12. You'd have to ask NI, but I suspect they don't want to support users calling the functions directly; they only want to support calls made through the VIs they create to expose those functions.  It gives them tighter control, and likely means less unit testing if they know a function will only ever be called a certain way.  For instance, Write PNG File has no compression input.  The underlying function does have one, and setting it does create a more compressed PNG image.  But it's possible that compression isn't tested or supported on all platforms.  So they can either support compression on some targets and not others, either making duplicate VIs or erroring out on the unsupported targets, or they can just take that ability away.  Well, that or someone just didn't realize the input would be requested and left it off the connector pane.

  13. Yes, I'm sure.  I just opened the block diagrams of the Add File to Zip and Unzip functions in 2020, and I believe they are unchanged from 2018, at least when calling functions from within LabVIEW.  NI is calling System Exec to touch the extracted file, setting the attribute that I believe is the file creation date.  Obviously that date will be "right now" since the file was just extracted, so after the file is uncompressed, NI touches it so it carries the creation date stored for it within the zip.  This can be seen in the non-password-protected VI Set Unzip File Date Time.  The VIM I linked to earlier shows how these Call Library nodes create the zip.
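    The "touch the extracted file afterward" step above can be sketched in Python; `os.utime` covers access/modification times, while true creation-time changes are platform-specific (the stored timestamp here is a made-up example):

```python
import os
import tempfile

# Pretend this epoch timestamp was stored for the file inside the zip.
stored_mtime = 1_500_000_000.0

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "extracted.vi")
    with open(path, "w") as f:
        f.write("contents")          # "extraction": mtime is now "right now"

    # Rewind the (access, modification) times to the stored values.
    # Creation date needs OS-specific calls beyond what os.utime offers.
    os.utime(path, (stored_mtime, stored_mtime))
    mtime = os.path.getmtime(path)
```

This is why the extra post-extract step exists at all: writing the file inevitably stamps it with the current time, so the archive's timestamp has to be restored afterward.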

  14. 1 hour ago, X___ said:

    The NI_Unzip.lvlib VIs are all password protected (probably because underneath, they are using some .NET calls they don't want people to mess up with). So yes we don't know what they do, although the help tells us what they can't do:

    It is a series of Call Library nodes.  I think if it were .NET, NI would have a hard time getting it to run on other targets like Linux, Pharlap, and VxWorks, plus there would be ARM/x86 issues.  In the two examples I linked to, I show how these are called to compress an array of bytes.  Still, a pure G implementation would be nice; just looking over the source and documentation is quite daunting given my lack of understanding of compression techniques, but those links do help.
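    Compressing an in-memory array of bytes, as described above, looks like this in Python's `zlib` (the same DEFLATE family of algorithms that zip libraries wrap; the payload is just an example):

```python
import zlib

# A repetitive byte array, so it compresses well.
payload = b"LabVIEW " * 100

# Deflate the bytes in memory; level 9 trades speed for size.
packed = zlib.compress(payload, level=9)

# Inflate back and confirm the round trip is lossless.
unpacked = zlib.decompress(packed)
```

Calling into a shared zip library from G is conceptually the same: hand it a byte array, get a compressed byte array back, no files involved.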
