Posts posted by X___

  1. 17 hours ago, brian said:

    The other day, I wrote up a lengthy response to a thread about NXG on the LabVIEW Champions Forum.  Fortunately, my computer blue-screened before I could post it--I kind of had the feeling that I was saying too much and it was turning into a "drunk history of NXG" story.  Buy me some drinks at the next in-person NIWeek/CLA/GLA Summit/GDevCon, and I'll tell you what I really think!

    First, I'll say that abandoning NXG was a brave move and I laud NI for making it.  I'm biased; I've been advocating this position for many years, well before I left NI in 2014.  I called it "The Brian Powell Plan".  :) 

    I'm hopeful, but realistic, about LabVIEW's future.  I think it will take a year or more for NI to figure out how to unify the R&D worlds of CurrentGen and NXG--how to modify teams, product planning, code fragments, and everything else.  I believe the CurrentGen team was successful because it was small and people left them alone (for the most part).  Will the "new world without NXG" return us to the days of a giant software team where everyone inside NI has an opinion about every feature and how it's on the hook for creating 20-40% revenue growth?  I sure hope not.  That's what I think will take some time for NI to figure out.

    Will the best of NXG show up easily in CurrentGen?  No!  But I think the CurrentGen LabVIEW R&D team might finally get the resources to improve some of the big architectural challenges inside the codebase.  Also, NI has enormously better product planning capability than they did when I left.

    I am optimistic about LabVIEW's future.

    This is material for the PhD thesis on the NXG fiasco that I feared (in another related thread on this forum) would never be written... and it is now lost to the microcosm of the LabVIEW Champions forum (I understand knuckleheads like us were not intended to be privy to this juicy stuff).

    It seems that one way to give LabVIEW a future (or to definitively bury any hope of one, at the sight of the quagmire) would be to open-source it.

  2. 23 hours ago, X___ said:

    The official statement (by two top brasses, not a mere webmaster) is not particularly crystal clear as to what is forthcoming. Coming right after the Green New Deal campaign and in the midst of the pandemic, it might just as well be a forewarning of major "reorganization" within NI... The surprise expressed by AQ in a different thread would seem to support this hypothesis.

    Maybe that is relevant: https://www.cmlviz.com/recent-development/2020/10/29/NATI/national-instruments---announced-workforce-reduction-plan-intended-to-accelerate-its-growth-strategy-and-further-optimize-operations-cost-structure

    Another variant: https://www.honestaustin.com/2020/10/30/mass-layoffs-at-national-instruments-remedy/ ("Thursday, he said that sales staff and administrative staff would be most affected", so maybe unrelated to NXG's demise after all...)

    And from the top dog's mouth: https://www.reddit.com/r/Austin/comments/jkyid6/ni_is_laying_off_9_of_its_global_workforce/

  3. 27 minutes ago, ShaunR said:

    Well. Aren't we a ray of sunshine nowadays :D

    Isn't this serious stuff though? Most of us may have had disagreements with NI over some of the paths they chose to go down, and were lamenting the crawling pace of their progress, but at least we all agreed that the current LV IDE/UI development toolset was outdated.

    The official statement (by two top brasses, not a mere webmaster) is not particularly crystal clear as to what is forthcoming. Coming right after the Green New Deal campaign and in the midst of the pandemic, it might just as well be a forewarning of major "reorganization" within NI... The surprise expressed by AQ in a different thread would seem to support this hypothesis.

    At least it gives me something to speak about in my next "how do we use LabVIEW in the lab" intro session...

  4. I have been using Parallels for a long time. It works, but it also hurts (more on this below). As for subscription vs. one-off license, there used to be limitations in the one-off version on the number of cores and maximum RAM (this might have changed, but I don't think so). This is a problem when you have a multicore machine and end up limited to only a few cores. There are some additional perks that come with the subscription version (including updates and, à la LabVIEW, the corresponding bug fixes).

    Be warned that their support is nowhere close to NI's, which is a problem when something fails badly (and it will, potentially). Thus, do not upgrade to the newest version until after careful monitoring of their forums, as my experience has been that it can sometimes completely fail and corrupt your VM.

    Which brings me to the most important piece of advice (independent of the upgrade step): do not store critical data - e.g. VIs - inside your VM. The VM comes as a humongous multi-GB file that you will probably want to exclude from your Time Machine backup routine and only save every now and then - typically before an upgrade. Instead, take advantage of the ability to share data between the VM and the Mac. This way, if your VM becomes corrupted, it's just a matter of restoring an old copy of it (with only the system and apps, which typically don't change that often or can easily be updated if needed). Your data files will have been on the Mac side of things all along, unaffected by the Parallels fiasco. It has happened to me several times.
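    A quick way to audit that habit is to check whether any working files still live inside the VM bundle. A minimal Python sketch, assuming hypothetical paths (the `.pvm` bundle location and the file list are made up for illustration):

```python
from pathlib import PurePosixPath

# Hypothetical location of the Parallels VM bundle (varies per install).
VM_BUNDLE = PurePosixPath("/Users/me/Parallels/Win10.pvm")

def files_at_risk(paths, vm_bundle=VM_BUNDLE):
    """Return the files that live inside the VM bundle: they would be lost
    if the VM image became corrupted, and they bloat every backup of it."""
    at_risk = []
    for p in map(PurePosixPath, paths):
        # A file is "inside" the bundle if the bundle is one of its parents.
        if vm_bundle in p.parents:
            at_risk.append(str(p))
    return at_risk

if __name__ == "__main__":
    files = [
        "/Users/me/Parallels/Win10.pvm/disk/C/projects/main.vi",  # bad: inside the VM
        "/Users/me/Shared/projects/main.vi",                      # good: on the Mac side
    ]
    print(files_at_risk(files))
```

    On macOS, the exclusion itself can then be done with `tmutil addexclusion /path/to/YourVM.pvm`, which keeps Time Machine from copying the bundle on every run.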

    As usual, YMMV.

    As a note, I am not sure this is the best time to switch to a Mac, given the Apple silicon transition, which will at best emulate Intel CPUs; as far as I know, Parallels' adjustment to it is still in the works. I am personally switching to a ThinkPad... and, in the long term, Python 🙂

  5. A word of caution (seriously): before you send people into space on a device simulated/tested with LabVIEW, keep in mind that a non-negligible fraction (I am not saying a large fraction, but one bug can be enough) of its algorithms/numerical codes are buggy, and since they are closed source (for the most part), the only way to find out is to run into inconsistencies or unexpected results. It takes an OCD scientific mind to discover them, the hard way - especially in the "uncommon" regimes, those which Dr. Murphy likes to invoke at the worst time...

    As users, we've done our best to let NI know, but at best it has taken years for actual bugs to be fixed, when they have been fixed... Maybe time to check those X_Bug_Report tagged posts on forums.ni.com...
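    For what it's worth, the only practical defense against closed-source numerical bugs is the one described above: cross-check results against an independent implementation, especially in the "uncommon" regimes. A minimal Python illustration (not LabVIEW code) of how a textbook one-pass variance formula silently breaks in such a regime, while a two-pass version agrees with the standard-library reference:

```python
import statistics

def variance_naive(xs):
    # Textbook one-pass formula: E[x^2] - E[x]^2.
    # Numerically fragile: catastrophic cancellation when the data
    # sits on a large common offset.
    n = len(xs)
    return sum(x * x for x in xs) / n - (sum(xs) / n) ** 2

def variance_two_pass(xs):
    # Two-pass formula: compute the mean first, then squared deviations.
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# An "uncommon" regime: small spread on top of a huge offset.
data = [1e9 + i for i in range(10)]
reference = statistics.pvariance(data)  # independent implementation to check against

print(variance_two_pass(data))  # agrees with the reference (8.25)
print(variance_naive(data))     # visibly wrong: cancellation destroyed the result
```

    The point is not this particular formula but the method: two independent implementations that disagree in some regime are how hidden numerical bugs get caught.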

    Have fun.

  6. On 9/8/2020 at 8:45 AM, Stagg54 said:

    So we had a discussion during Virtual Coffee about malicious packages and vetting.

    How do we verify that code is not malicious?

    What might flag something as malicious?

    We can use VI analyzer to check for certain things such as:

    • PW protected VIs
    • Removed BDs
    • Call Library Node
    • .NET nodes
    • Network nodes (TCP, UDP, etc)
    • File I/O nodes
    • Shell commands
    • Run on open
    • SubVIs with no icon or hidden under other objects

    LabVIEW itself would be highly suspicious, according to these criteria. But wait...
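    For illustration, a checklist like the one quoted above can be turned into a simple triage score. A hedged Python sketch, assuming a hypothetical report format and made-up severity weights - this is not the actual VI Analyzer API:

```python
# Hypothetical severity weights for the checklist items above (assumptions).
SEVERITY = {
    "password_protected_vi": 3,
    "removed_block_diagram": 3,
    "call_library_node": 2,
    "dotnet_node": 2,
    "network_node": 2,
    "file_io_node": 1,
    "shell_command": 3,
    "run_on_open": 3,
    "hidden_or_blank_subvi": 2,
}

def triage(findings, threshold=5):
    """findings: mapping of VI name -> list of checklist hits.
    Returns the VIs whose cumulative severity meets the review threshold."""
    flagged = {}
    for vi, hits in findings.items():
        score = sum(SEVERITY.get(h, 0) for h in hits)
        if score >= threshold:
            flagged[vi] = score
    return flagged

report = {
    "installer_helper.vi": ["shell_command", "run_on_open"],  # 6: needs review
    "read_config.vi": ["file_io_node"],                       # 1: probably fine
}
print(triage(report))  # {'installer_helper.vi': 6}
```

    A scoring pass like this only prioritizes human review; as the remark above about LabVIEW itself shows, these criteria flag plenty of legitimate code too.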

  7. Once the HTML is generated with the provided VIs on the source computer (JSON files, PNGs, and some text description files, plus the boilerplate JS installed with FINALE), that HTML hierarchy can be uploaded to a website (e.g. GitHub). The only thing needed is an HTTP server serving the webpages and running the different JS scripts (in the current FINALE distribution, one is launched locally at http://127.0.0.1:8081). That could be done by cloud servers at minimal cost.
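    For reference, such a static server needs nothing beyond the Python standard library; a minimal sketch (the `finale_export` directory name is an assumption, not part of FINALE):

```python
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

def make_server(directory, host="127.0.0.1", port=8081):
    """Serve a static HTML hierarchy (e.g. a FINALE export) over HTTP."""
    handler = partial(SimpleHTTPRequestHandler, directory=directory)
    return HTTPServer((host, port), handler)

if __name__ == "__main__":
    server = make_server("finale_export")  # hypothetical export folder
    print("Serving on http://127.0.0.1:8081")
    server.serve_forever()
```

    From a shell, `python -m http.server 8081 --directory finale_export` achieves the same thing.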

    This could be provided together with the source code in repositories for those who don't have access to LV.

    While LV is not a text language, well-designed LV code should be roughly decipherable by a non-LV expert, especially when implementing an algorithm (UI and code mechanics are another matter). This would go a long way toward establishing LV as an acceptable programming language for scientific applications. Currently it is not, because most scientists using Matlab, Python, C, etc. can't even look at the code.

     

    Here is an example of what this looks like:

    [Two screenshots of the generated HTML pages]

    Note that vi.lib and other pre-installed VIs are not accessible, and I don't see the connector pane of the other VIs...

  8. On 8/1/2020 at 5:20 AM, Rolf Kalbermatter said:

    Interesting theory. Except that I don't think LabVIEW ever got below 35 in that list! 😀

    Are you reading between the lines? LV is not present among the first 50 listed...

    In case it helps shed some more light on reality, here is yet another recent list, essentially saying that LV is nowhere to be seen on GitHub or Stack Overflow:

    https://redmonk.com/sogrady/2020/07/27/language-rankings-6-20/

    and I am not talking about the list itself: the graph shown there (lang-rank-q320-wm.png) doesn't list LV, as far as I can tell.

    Now I am sure that if you limited this kind of analysis to NI's website or LAVAG, LV would hit top of the list in both cases...

     

     

  9. That looks insidious indeed. It definitely doesn't work this way in 2019 SP1. This is the kind of mental note (close LV and restart to clear up memory) I would certainly forget every now and then.

    I am refraining from upgrading without pressing reasons, since chances are I would lose support for old NI hardware I haven't replaced yet (with non-NI hardware, of course!). I realized too late that I should have stuck with 2018 SP1 for hardware reasons (and, as I described recently in the thread you linked to, because of a newly introduced project explorer bug, among others).

    2020 seems to be one of those years we will try our best to forget about ASAP... 

  10. Parallels has a few bugs up its sleeve too, so I am willing to lean toward that conclusion as well, considering that no one else seems to have observed this.

    Is anyone using Parallels Desktop in "Coherence" mode and observing this phenomenon? There is no Windows desktop real estate per se in this mode; each Windows app's window appears as a regular macOS one (with the Windows style, though), so I can't really imagine the OSes mislocating the cursor... but this is all beyond my technical knowledge to really ponder.

    Thanks for looking into it though.

  11. Here is the video illustrating the symptoms.

    Of course, the demo effect being what it is, the first time I tried to drop the subVI the cursor looked fine (I had just verified on a brand-new VI that the symptom appeared right away, but after I went through the recording hoops I had to "work" for it to show up); still, a brief excursion of the cursor over the project window (from which the subVI came, so indeed I could not drop it back in there) was enough for the cursor to turn back into the "forbidden" icon.

    I am not sure it is very visible in the video, but there is a bit of "stuttering": you can usually, very briefly, see the arrow+ cursor alternating with the forbidden icon.

    The bottom line, though, is that I cannot drop the subVI in most of the virgin diagram and have to actively search for an allowed drop zone.

    I have come to get used to it now, but it is definitely getting old and a productivity hog.

     

  12. The title is made up, but it describes what I have been experiencing for years, hoping it would be fixed in the next version. It never has been, so I am starting to suspect that nobody cares, or possibly that nobody has even noticed it.

    Anyway, the symptoms are:

    when I drag a VI (from either a palette or the icon of an open VI) into a target VI's diagram, I frequently encounter this odd and annoying 🚫 symbol at my cursor (it's not red and it is slanted the other way, but this is the closest emoji to the real thing I could find), instead of the "androgynous" cursor (a mix of ♀️ and ♂️) that tells me I am about to copy that object where the cursor is.

    I move the cursor around, seeing a 🚫 wherever I go, until I fleetingly glimpse the cursor with the + index (the "androgynous" one) over some random location; then I painstakingly try to return to that region to find the sweet spot (a pixel, really) where I am allowed to drop the VI. Of course, once it is dropped on the diagram, I can move the VI anywhere I was forbidden to drop it during my initial attempts.

    That's got to be the most annoying bug in a graphical programming environment ever...

    Am I the only one to experience this?

    I am using 2019 SP1 64-bit, but the problem has been around for several versions already.

