
Michael Aivaliotis

Administrators
  • Content Count: 6,086
  • Joined
  • Last visited
  • Days Won: 84

Michael Aivaliotis last won the day on February 10

Michael Aivaliotis had the most liked content!

Community Reputation: 325

1 Follower

About Michael Aivaliotis

  • Rank: MindFreak
  • Birthday: 04/02/1968

Profile Information

  • Gender: Male

LabVIEW Information

  • Version: LabVIEW 2018
  • Since: 1995


  1. I made a workaround: instead of typedefing the entire Map, I just typedef the datatype inside the Map. This is probably the best way to do it anyway, similar to typedefing an array element vs. the entire array. Regardless, this is still a bug. Thanks for looking into it.
  2. OK, this is so random how I found this, but I was able to reproduce it. I'm hoping others here can reproduce it as well. If this is a known issue, please link to the source for reference. I've attached code that reproduces it, but the basic steps are: place a Map datatype in the private class data of a class, then make the Map datatype a typedef. lava.zip
  3. I'm noticing that when I probe LabVIEW class wires in LabVIEW 2019 SP1, the probe won't display any data. In fact, it appears as if that part of the code never executed. I just can't figure out how to reproduce it, and I'm wondering if anyone else has noticed this behavior. I can clearly see that I have multiple probes on one diagram, yet the probe attached to a class wire shows "Not Executed" in the probe watch window, even though "Retain Wire Values" is enabled. Restarting LabVIEW does not fix it. classprobe.mp4
  4. It was not developed in a bubble. There was already a close relationship between the VIPM team and NI, so they knew the requirements and the need. It's been over 5 years since the release of NIPM, though, and there hasn't been much movement. That might have been the first step, but hardly the long-term plan. In reality, I think the main reason for the lack of development on NIPM is that people got shuffled around, new management came in, and the original development plans for NIPM got put on a shelf. I can tell you that NI had the goal of full VIPM replacement. There is much churn internally at NI on where to take NIPM next. They are back to wanting to add features to facilitate better reuse-library installation for current LabVIEW (how to achieve this is not clear). For NXG, however, this is clearly the case. I suggest you watch this video:
  5. NI caused this problem themselves. NIPM should have provided all the features of VIPM from the start and then added GPM features. Lack of investment. Now they're playing catch-up, but technology is moving on.
  6. Well, it seems to be more of a back end than a front end. GitHub is becoming the package repo; GitHub is not making a front-end manager. Agreed, this would be hard or impossible to achieve. OK, yes, this is a more attainable goal and a possible path forward. But GPM is not widely adopted and has limitations. NIPM will win by default, because it is made and fully supported by NI. However, it has a long way to go to support all the features we (as developers) need. It's a dumb installer with no smarts related to the LabVIEW environment: for example, you cannot target LabVIEW by version and cannot allow up-compiling of code. NI is investing resources to make it better over time, and we need to keep pushing them to add the features we need. However, NI moves like a big company does: very slowly. There are currently 3 package formats: VIPM, NIPM and GPM. It's a mess, and reading this thread, it seems people are open to ditching packages altogether.
  7. I feel this is somehow related. GitHub supports packages as described here: https://help.github.com/en/github/managing-packages-with-github-packages Do you think this is something that we can utilize for LabVIEW package distribution?
  8. GPM extracts code into your project, BTW. VIPM, NIPM and GPM provide built products of reusable code, among other things. I believe what you are saying is that you consider existing cloud-based source code control tools such as Git a viable option for reusable-code distribution. This shouldn't be limited to packaged stuff.
  9. I agree. I haven't stored those types of files in the repo for a while now. My specific comment was about how to transition from the largefiles extension in Mercurial to Git. I use Bitbucket Downloads. However, I'm finding that different file-sharing tools can be useful for different use cases. There's really no one-size-fits-all.
  10. I sign all my built EXEs, for all my customers. It's trivial to do and doesn't cost much. This also allows me to know if the application that I'm asked to support was built by my company or the customer did the rebuild themselves.
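Signing a Windows EXE is normally done with Microsoft's signtool and a code-signing certificate, which can't be demonstrated portably here. As a language-neutral sketch of the underlying idea in the post above (attach a signature to a binary so you can later tell whether your own key produced it, or whether the customer rebuilt it), here is a toy keyed-hash signer in Python; the key and byte strings are hypothetical stand-ins, and this is not a substitute for real Authenticode signing:

```python
import hmac
import hashlib

# Stand-in for the private code-signing key (hypothetical).
SIGNING_KEY = b"company-secret-key"

def sign(binary: bytes) -> bytes:
    """Produce a detached signature for a binary."""
    return hmac.new(SIGNING_KEY, binary, hashlib.sha256).digest()

def built_by_us(binary: bytes, signature: bytes) -> bool:
    """Check whether the binary was signed with our key."""
    return hmac.compare_digest(sign(binary), signature)

exe = b"MZ...pretend EXE bytes..."
sig = sign(exe)
print(built_by_us(exe, sig))                # our build -> True
print(built_by_us(exe + b"patched", sig))   # customer rebuild -> False
```

The real-world equivalent of `built_by_us` is checking the Authenticode signature on the EXE's properties page (or with signtool's verify command) and seeing your company's certificate rather than no signature at all.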
  11. What's the reason for moving to GitHub? I was using Kiln to host my code, which used Mercurial. I switched to Bitbucket with Git, so I had to migrate many projects from Mercurial to Git. I did the migration using the HG-Git extension: https://hg-git.github.io The main problem I encountered was with the "largefiles" Mercurial extension: the automatic import tools provided by GitHub and others don't like that extension and barf. Otherwise you can use the web migration tool provided by GitHub. My workaround to the largefiles problem was to accept that I would lose the revision history on those files. Not a huge deal, since most large files I had were installers and binaries that didn't have revisions per se. So after the migration I did a diff on the project folder, identified the large files that were missing from the new Git project, copied them over, and pushed them up to the Git repo. Don't forget that core Git functionality is the same regardless of service provider. So let's say you found a way to import your repo to GitHub: you can easily turn around and move it to GitLab, Bitbucket or whatever. You're not locked in. But it might be an issue if you are using an extension that only one service provider supports. The future is Git. I made the jump over a year ago and haven't looked back. The service provider you choose should give you the tools you need to do your job. The reason I picked Bitbucket was that I liked and used all the other products that Atlassian provides: Jira, Confluence, etc. The tools from GitHub are a bit weak for my business. I also like companies that continuously improve and invest in their products, among other things. GitHub seems to be the popular choice for open source, since that's how it got started. Now that Microsoft owns them, perhaps they don't have to worry about generating revenue (I don't know if that's good or bad). But I don't see features that compare to what Atlassian offers.
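The post-migration diff step described above can be sketched as a quick shell check. The directory names here are hypothetical stand-ins for the old Mercurial working copy and the freshly converted Git working copy; the demo creates throwaway directories so the commands are self-contained:

```shell
# Throwaway directories standing in for the two checkouts (hypothetical names).
mkdir -p old-hg-checkout new-git-checkout
# Pretend this installer was tracked by the Mercurial largefiles extension
# and was therefore dropped by the hg-to-git conversion.
echo "binary payload" > old-hg-checkout/setup-1.0.exe

# List files present only in the old checkout -- these are the candidates
# to copy into the Git working copy and push up manually.
diff -rq old-hg-checkout new-git-checkout | grep '^Only in old-hg-checkout'
# -> Only in old-hg-checkout: setup-1.0.exe
```

In a real migration you would run the `diff -rq` line against the two actual project folders, then `git add` and push whatever it flags.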
  12. Published 10/29/2019 See here: https://finance.yahoo.com/news/national-instruments-announces-plan-ceo-200200768.html AUSTIN, Texas--(BUSINESS WIRE)-- NI (NATI) today announced that Alex Davern will step down as Chief Executive Officer of NI, effective January 31, 2020. The NI Board of Directors has appointed current President and COO, Eric Starkloff, as NI President and CEO, effective February 1, 2020. Davern will take up a teaching position at the University of Texas McCombs School of Business starting in the Fall of 2020. Davern will remain on staff at NI as strategic advisor to the CEO through May and will continue to serve on the NI Board of Directors. Board Chairman Michael McGrath said, “The board appointed Alex as CEO in 2016 to lead the transition from our founder, Dr. James Truchard. Over the past three years, he led NI and shaped a new core strategic vision, expanded our strategy to provide more complete systems for our customers, aligned the company to focus on growth industries and delivered record results. The board’s intention, after a successful transition from the founder, was to appoint the next CEO to lead the company to achieve this new vision. After considering alternatives, we unanimously selected Eric as our next CEO to lead NI into a very promising future. Alex will leave NI stronger, with experienced leaders and a clear strategy. We are excited to have Eric as our new leader as he has proven to the board that he is the most qualified person to take NI to the next level.” Davern, CEO said, “I have thoroughly enjoyed being part of NI’s incredible success since joining in 1994, one year before the IPO, and I am confident the company is well positioned to deliver on its growth strategy. I am proud of the progress our employees made in significantly improving our operating results and we have developed a team of highly experienced leaders. 
I have worked with Eric for many years and have great confidence that as CEO, he will continue to take NI forward to realize the company’s long-term potential.” Starkloff, President and COO said, “It has been an honor to work alongside Alex for the past 22 years and I want to thank him for his mentorship and his significant contributions to NI. I am confident in our strategy and our team, and I believe we are in a position of strength to deliver on our goals. I look forward to taking on the responsibility of CEO, as we connect our deep engineering experience and software-connected systems with our incredible customers who are taking on the complex challenges shaping humanity.” This leadership transition will be discussed during the Q3 2019 earnings call today at 4:00 p.m. CDT.
  13. Does this mean the Actor Framework functionality will now be represented graphically in NXG, as opposed to a list of hundreds of class VIs in a tree in the project?
  14. Just watched this presentation by Richard Feldman called "Why Isn't Functional Programming the Norm?" As I was watching it, many ideas came to mind about how LabVIEW stacks up in the various areas of the presentation, and I wanted to hear what the community thinks about this. We can all agree that LabVIEW is NOT a popular language (as defined in the video) and it probably will not end up in any presentation like the one in this video (I desire for this to change, though). However, I think the discussion in the community about FP vs. OO is currently taking place. I know people who do not use OO in LabVIEW and many who swear by it, so I think this is a fitting discussion. The core question of the presentation, as put by Richard, is "Do OO features make a language popular?" His argument is no. I don't think OO by itself will make LabVIEW popular, but where does LabVIEW end up on the reasons for popularity as presented? Or better yet, what can make LabVIEW more popular? Is that something anyone should care about?
  15. @Jim Kring, it seems to me that the export of the code has gotten a positive response from the community. However, I may be wrong; if anyone has an opinion either way, please come forward. As you can see in this thread, it appears the community has rallied around this effort. This is why I emailed you to come here and share your thoughts. In the past, OpenG was a great venue to showcase how a bunch of passionate LabVIEW users can come together and collaborate on something useful. The passion is clearly still there, as shown by the numerous discussions here. The general coding community has moved to Git, with GitHub being the hub, so this seems like the logical next step. Who knows what this initiative will lead to. However, I'm expecting that placing OpenG in a neutral GitHub repo will provide the spark and the tools to facilitate open collaboration; then the community can drive the future. The community is full of smart people who have a desire for clean, tested code. And if issues come up, LAVA discussions (or GitHub issues) are there to hash things out. When LAVA offered to host all OpenG discussions back in 2011, it was clear that the community wanted to help. When @jgcode put his standards together for how code should be discussed at that time, it was an exciting time. Since then, many people have come forward with offers to add new code to OpenG and fix bugs. For example, @drjdpowell offered his awesome SQLite toolkit for inclusion in OpenG. He got no response either way. It's a shame to have a platform and forums that allow people to post and discuss OpenG code and then ignore it. If you have ideas on what the future of OpenG is, I'm hoping it's to be more transparent and inclusive. Providing the tools, resources and some safety checks along the way is the best way to facilitate passionate individuals diving in. Do you think keeping the status quo of the past 10 years makes sense? It seems to me that the community disagrees.
What do you think?