
Michael Aivaliotis

Everything posted by Michael Aivaliotis

  1. Don't forget cRIO has RT, so you can implement a lot of programming on the RT side for various scenarios. As a master of the LabVIEW language and NI hardware, I would prefer a platform that allows me to use the language I already know and love. I can provide a PC application that implements anything the customer desires, then configure a cRIO and implement ECAT, serial, DeviceNet, DAQ, DIO or whatever they want using the same language and skills. If NI does not support the hardware I need to talk to, they offer many other options, like DLL calls and other ways of interfacing; I have yet to hit a specific limitation that I was not able to work around to get the job done and still make the customer happy. Yes, there are other languages and opportunities. Anyone who has worked with ECAT has heard of Beckhoff: they pretty much came up with the standard and are pushing it across the industrial world. But it's just another communications standard, and NI and LabVIEW are at a whole other level above and beyond that. Can NI improve the tools they provide for ECAT, DeviceNet and others? Yes! There are some features of ECAT that the NI tools simply cannot access or configure. A lot of what NI implements is the basics of the industry standard; they rarely go above and beyond unless customers push for it. They maintain a checklist of industry compatibilities so the marketing looks good, so they can still do better. Recently they started adopting TSN, which is very powerful and allows synchronized DAQ across cRIO and cDAQ chassis. Technology is constantly evolving, and I commend NI for always trying to keep LabVIEW at the forefront by providing hardware that keeps up with today's requirements. So, as you can obviously tell from this post, I am not going to convert to TwinCAT any time soon.
However, competition is always healthy and keeps companies like NI on their toes, making sure they are always providing value to their customers so the customers don't start wandering off to other solutions.
  2. What I would like to see is a way to minimize a Map or Set constant by double-clicking on it, like you can with cluster constants. Hand-editing Maps is such a small use-case, but useful for some, I guess; you usually work with these things programmatically. Thanks for the tool.
  3. I get it. Sometimes you gotta patch and move on. In parallel, though, I'd open a ticket with NI to get them to spend some resources on worrying about your problem too.
  4. I made a workaround: instead of typedefing the entire Map, I just typedef the datatype inside the Map. This is probably the best way to do it anyway, similar to typedefing an array element vs. the entire array. Regardless, this is still a bug. Thanks for looking into it.
  5. OK, this is so random how I found this, but I was able to reproduce it, and I'm hoping others on here can reproduce it as well. If this is a known issue, please link to the source for reference. I've attached code that reproduces it, but basically, to reproduce it: place a Map datatype in the private class data of a class, and make the Map datatype a typedef. lava.zip
  6. I'm noticing that when I probe LabVIEW class wires in LabVIEW 2019 SP1, the probe won't display any data. In fact, it appears as if that part of the code never executed. I just can't figure out how to reproduce it, and I'm wondering if anyone else has noticed this behavior. I can clearly see that I have multiple probes on one diagram, and the probe attached to a class wire shows "Not Executed" in the probe watch window, even though "Retain Wire Values" is enabled. Restarting LabVIEW does not fix it. classprobe.mp4
  7. It was not developed in a bubble. There was already a close relationship between the VIPM team and NI, so they knew the requirements and the need. It's been over 5 years since the release of NIPM, though, and not much movement... That might have been the first step, but hardly the long-term plan. In reality, I think the main reason for the lack of development on NIPM is that people got shuffled around, new management came in, and the original development plans for NIPM got put on a shelf. I can tell you that NI had the goal of full VIPM replacement. There is much churn internally at NI on where to take NIPM next. They are back to wanting to add features to facilitate better reuse-library installation for current LabVIEW (how to achieve this is not clear). For sure, however, this is clearly the case with NXG. I suggest you watch this video:
  8. NI caused this problem themselves. NIPM should have provided all the features of VIPM from the start and then added GPM features. Lack of investment. Now they're playing catch-up, but technology is moving on.
  9. Well, it seems to be more of a back end than a front end: GitHub is becoming the package repo, but GitHub is not making a front-end manager. Agreed, this would be hard or impossible to achieve. OK, yes, this is a more attainable goal and a possible path forward. But GPM is not widely adopted and has limitations. NIPM will win by default, because it is made and fully supported by NI. However, it has a long way to go to support all the features we (as developers) need. It's a dumb installer and has no smarts related to the LabVIEW environment: for example, you cannot target LabVIEW by version and cannot allow up-compiling of code. NI is investing resources to make it better over time, and we need to keep pushing them to add the features we need. However, NI moves like a big company does: very slowly. There are currently 3 package formats: VIPM, NIPM and GPM. It's a mess, and reading this thread, it seems people are open to ditching packages altogether.
  10. I feel this is somehow related. GitHub supports packages as described here: https://help.github.com/en/github/managing-packages-with-github-packages Do you think this is something that we can utilize for LabVIEW package distribution?
  11. GPM extracts code into your project, BTW. VIPM, NIPM and GPM provide built products of reusable code, among other things. I believe what you are saying is that you consider existing, cloud-based source code control tools such as Git a viable option for reusable code distribution. This shouldn't be limited to packaged stuff.
  12. I agree. I haven't stored those types of files in the repo for a while now. My specific comment was about how to transition from one large-files extension in Mercurial to Git. I use Bitbucket Downloads. However, I'm finding that different file-sharing tools can be useful for different use-cases; there's really no one-size-fits-all.
  13. I sign all my built EXEs, for all my customers. It's trivial to do and doesn't cost much. It also lets me know whether an application I'm asked to support was built by my company or the customer rebuilt it themselves.
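For reference, signing can be scripted into the build process. Below is a minimal sketch using Microsoft's signtool from the Windows SDK; the certificate file, password variable, application name and timestamp URL are all placeholders, not details from the post above.

```shell
rem Sign a built EXE with signtool (ships with the Windows SDK).
rem mycert.pfx, CERT_PASSWORD and MyApp.exe are placeholder names;
rem the timestamp URL is one common public RFC 3161 timestamping service.
signtool sign /fd SHA256 /f mycert.pfx /p "%CERT_PASSWORD%" ^
    /tr http://timestamp.digicert.com /td SHA256 MyApp.exe

rem Verify the signature afterwards (also a quick way to check
rem whether a customer rebuilt the EXE themselves):
signtool verify /pa MyApp.exe
```

Timestamping matters because it keeps the signature valid after the certificate itself expires.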
 14. What's the reason for moving to GitHub? I was using Kiln to host my code, which used Mercurial, and I switched to Bitbucket with Git. That meant migrating many projects from Mercurial to Git, which I did using the hg-git extension: https://hg-git.github.io The main problem I encountered was the "largefiles" Mercurial extension: the automatic import tools provided by GitHub and others don't like that extension and barf. Otherwise you can use the web migration tool provided by GitHub. My workaround for the largefiles problem was to accept that I would lose the revision history on those files. Not a huge deal, since most large files I had were installers and binaries that didn't have revisions per se. So after the migration I did a diff on the project folder, identified the large files that were missing from the new Git project, copied them over and pushed them up to the Git repo.
Don't forget that core Git functionality is the same regardless of service provider. So let's say you found a way to import your repo to GitHub: you can easily turn around and move it to GitLab, Bitbucket or whatever. You're not locked in. It might be an issue, though, if you are using an extension that only one service provider supports. The future is Git. I made the jump over a year ago and haven't looked back.
The service provider you choose should give you the tools you need to do your job. The reason I picked Bitbucket was that I liked and used all the other products that Atlassian provides: JIRA, Confluence, etc. The tools from GitHub are a bit weak for my business. I also like companies that continuously improve and invest in their products, among other things. GitHub seems to be the popular choice for open source, since that's how it got started. Now that Microsoft owns them, perhaps they don't have to worry about generating revenue (I don't know if that's good or bad). But I don't see features that compare to what Atlassian offers.
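For anyone attempting the same migration, here is a minimal sketch of the hg-git workflow described above; the repo path and remote URL are placeholders, not the poster's actual projects.

```shell
# Migrate a Mercurial repo to Git with the hg-git extension
# (https://hg-git.github.io). Enable it first in ~/.hgrc:
#   [extensions]
#   hggit =

cd my-hg-project            # placeholder path

# Map Mercurial's "default" branch onto Git's "master" branch,
# since hg-git converts bookmarks to Git branches.
hg bookmark -r default master

# Push the full converted history to a Git remote (placeholder URL).
hg push git+ssh://git@github.com/me/my-project.git
```

Note this conversion is exactly where the largefiles extension causes trouble: revisions stored as largefiles stubs don't convert, which is why the post falls back to copying those files in manually afterwards.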
  15. Published 10/29/2019 See here: https://finance.yahoo.com/news/national-instruments-announces-plan-ceo-200200768.html AUSTIN, Texas--(BUSINESS WIRE)-- NI (NATI) today announced that Alex Davern will step down as Chief Executive Officer of NI, effective January 31, 2020. The NI Board of Directors has appointed current President and COO, Eric Starkloff, as NI President and CEO, effective February 1, 2020. Davern will take up a teaching position at the University of Texas McCombs School of Business starting in the Fall of 2020. Davern will remain on staff at NI as strategic advisor to the CEO through May and will continue to serve on the NI Board of Directors. Board Chairman Michael McGrath said, “The board appointed Alex as CEO in 2016 to lead the transition from our founder, Dr. James Truchard. Over the past three years, he led NI and shaped a new core strategic vision, expanded our strategy to provide more complete systems for our customers, aligned the company to focus on growth industries and delivered record results. The board’s intention, after a successful transition from the founder, was to appoint the next CEO to lead the company to achieve this new vision. After considering alternatives, we unanimously selected Eric as our next CEO to lead NI into a very promising future. Alex will leave NI stronger, with experienced leaders and a clear strategy. We are excited to have Eric as our new leader as he has proven to the board that he is the most qualified person to take NI to the next level.” Davern, CEO said, “I have thoroughly enjoyed being part of NI’s incredible success since joining in 1994, one year before the IPO, and I am confident the company is well positioned to deliver on its growth strategy. I am proud of the progress our employees made in significantly improving our operating results and we have developed a team of highly experienced leaders. 
I have worked with Eric for many years and have great confidence that as CEO, he will continue to take NI forward to realize the company’s long-term potential.” Starkloff, President and COO said, “It has been an honor to work alongside Alex for the past 22 years and I want to thank him for his mentorship and his significant contributions to NI. I am confident in our strategy and our team, and I believe we are in a position of strength to deliver on our goals. I look forward to taking on the responsibility of CEO, as we connect our deep engineering experience and software-connected systems with our incredible customers who are taking on the complex challenges shaping humanity.” This leadership transition will be discussed during the Q3 2019 earnings call today at 4:00 p.m. CDT.
  16. Does this mean the Actor Framework functionality will now be represented graphically in NXG, as opposed to a list of hundreds of class VIs in a tree in the project?
  17. Just watched this presentation by Richard Feldman, called "Why Isn't Functional Programming the Norm?" As I was watching it, many ideas came to mind about how LabVIEW stacks up in the various areas of the presentation, and I wanted to hear what the community thinks about this. We can all agree that LabVIEW is NOT a popular language (as defined in the video), and it probably will not end up in any presentation like the one in this video (I'd like that to change, though). However, I think the discussion about FP vs. OO is currently taking place in the community: I know people who do not use OO in LabVIEW and many who swear by it. So I think this is a fitting discussion. The core question of the presentation, as put by Richard, is "Do OO features make a language popular?" His argument is no. I don't think OO by itself will make LabVIEW popular, but where does LabVIEW end up on the reasons for popularity as presented? Or better yet, what can make LabVIEW more popular? And is that something anyone should care about?
  18. @Jim Kring, it seems to me that the export of the code has gotten a positive response from the community. However, I may be wrong; if anyone has an opinion either way, please come forward. As you can see in this thread, it appears the community has rallied around this effort, which is why I emailed you to come here and share your thoughts. In the past, OpenG was a great venue to showcase how a bunch of passionate LabVIEW users can come together and collaborate on something useful. The passion is clearly still there, as shown by the numerous discussions here. The general coding community has moved to Git, with GitHub being the hub, so this seems like the logical next step. Who knows what this initiative will lead to? However, I'm expecting that placing OpenG in a neutral GitHub repo will provide the spark and the tools to facilitate open collaboration, and then the community can drive the future. The community is full of smart people who have a desire for clean, tested code. And if issues come up, LAVA discussions (or GitHub issues) are there to hash things out. When LAVA offered to host all OpenG discussions back in 2011, it was clear that the community wanted to help, and when @jgcode put his standards together for how code should be discussed, it was an exciting time. Since then, many people have come forward with offers to add new code into OpenG and fix bugs. For example, @drjdpowell offered his awesome SQLite toolkit for inclusion into OpenG; he got no response either way. It's a shame to have a platform and forums that allow people to post and discuss OpenG code and then ignore it. Whatever your ideas on the future of OpenG are, I'm hoping it's to be more transparent and inclusive. Providing the tools, resources and some safety checks along the way is the best way to help passionate individuals dive in. Do you think keeping the status quo of the past 10 years makes sense? It seems to me that the community disagrees.
What do you think?
  19. I can see why you asked that question; it seemed to come out of left field. The intent is not to fork OpenG or create a separate development branch. The intent is to facilitate community involvement and incorporate some of the ideas floating around here on LAVA into future builds of OpenG. JKI has taken some of the libraries, not most: I counted 5 on JKI, and there are around 23 that I found and migrated. Granted, some of them could be merged or cleaned up, but that can be decided by the community. I think having a location that the OpenG community can call their own is important, including documentation on how to contribute and an open, inclusive, transparent development and deployment process. Having them on GCentral was my idea; I didn't get official approval from GCentral. I've since moved (and renamed) the repos from where I had them to a dedicated OpenG organization, so it would truly be separate. If you want me to add you as a maintainer, let me know. If there are any contributions to OpenG that JKI has made on the JKI branch, we should merge them into the new repos. New location: https://github.com/Open-G
  20. I'm glad to hear that you are welcoming this. I think this initiative and the positive response show that the community is willing to take on some of these tasks and responsibilities, as long as there's an open and inclusive process in place. As a start, it would be great to revive the openg.org domain, which has been dead for many years, and at least point it to the LAVA OpenG root forums. That's where it was pointing before, as indicated by this post. In the future, it could point to a GitHub landing page.
  21. It seems like OpenG needs a landing page. Perhaps we could use GitHub Pages for the GitHub repos, as proposed here. This could be the new URL to include in licensing.
  22. Good points. You can't create a wiki for an organization; I've set up an OpenG team, but even there, no wiki support. Limitations... However, each repo has its own wiki, which is good for content related to that specific repo. With separate repos, it's now possible for each one to have its own LabVIEW version and build/release process if needed. If not, each wiki can simply link to some global page that covers the overarching policies. This could be on LAVA, but I think the LabVIEW Wiki could provide support here: there could be a single page there covering everything, which is also community-editable. Creating issues can be done by anyone, so if you want to do that, go for it; if they are invalid, they can be closed later. We should also search here on LAVA and create issues for stuff reported by people in the past.