
This may have been asked/answered/discussed before, but here it is.

 

 

Is there any VIPM alternative out there?

 

If not, it would be very interesting to start a discussion on whether an open-source version is feasible and how many members of the community would be willing to work on it.

 

I know JKI already has a great product, but it seems like VIPM development has come to a halt.

 

Any comments?

 

 


The VIPM file format is somewhat open.  VIP, VIPC, and OGP files are just zip files; if you rename one to .zip you can see the file structure.  Inside is a plain-text spec file that describes the version, the files, where to put them, dependencies, etc.  Also in the zip are the actual files, organized by group, and an icon.  Long ago I wrote a tool for installing VIP and OGP files without VIPM, but honestly I came to see it as a wasted effort.  VIPM is pretty solid, and it is installed with LabVIEW in most situations.  Besides, VIPM does other configuration-management things like installing dependencies and checking for compatibility.
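The structure described above can be poked at with a few lines of Python. The assumption that the description lives in an archive entry whose name ends in "spec" is mine; inspect a real package to confirm its exact layout.

```python
import zipfile

# A .vip/.ogp package is an ordinary zip archive. The "spec" entry name
# used here is an assumption for illustration, not a documented contract.
def read_spec(package_path):
    with zipfile.ZipFile(package_path) as zf:
        spec_name = next(n for n in zf.namelist() if n.lower().endswith("spec"))
        return zf.read(spec_name).decode("utf-8", errors="replace")
```

Renaming the file to .zip and opening it in any archive tool shows the same structure by hand.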

 

I agree that VIPM development seems to have slowed down, but what more do you want from it?  I'm not a super user of it, but it seems to have all the features I've wanted, and it has been stable for several versions.


I've considered writing one. However, I have a different philosophy on dependencies: I prefer that all dependencies not native to LabVIEW live underneath the main project file instead of being shared across projects within user.lib and vi.lib.  Palettes would be stored in the user-libraries section of the palette viewer, and all palettes would be rewritten to point back to the project-specific dependencies directory instead of user.lib and vi.lib.  Something feels really wrong about having to rewrite directories under Program Files to maintain dependencies.

 

I've also considered using npm, the Node.js package manager, to maintain LabVIEW packages instead of VIPM, in combination with the approach above. This is a little more involved, as I'd need to write a script to handle different types of packages, and it requires packages to contain a package.json instead of a "spec" file.  So there would be additional work with that.
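As a rough illustration of that translation step, an ini-style spec could be mapped onto an npm-style package.json like this. The section and key names ("Package", "Name", "Dependencies") are hypothetical stand-ins, not the real VIPM schema:

```python
import configparser
import json

# Hedged sketch: map a spec-style ini onto a package.json-shaped document.
# Section/key names are assumptions; check a real spec file first.
def spec_to_package_json(spec_text):
    cp = configparser.ConfigParser()
    cp.read_string(spec_text)
    pkg = {
        "name": cp.get("Package", "Name", fallback="unknown"),
        "version": cp.get("Package", "Version", fallback="0.0.0"),
        "dependencies": dict(cp.items("Dependencies"))
                        if cp.has_section("Dependencies") else {},
    }
    return json.dumps(pkg, indent=2)
```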

 

Anyway, I haven't started anything yet, but I'm also curious what else is out there.



Something feels really wrong about having to rewrite directories under Program Files to maintain dependencies.

 

I've also considered using npm, the Node.js package manager, to maintain LabVIEW packages instead of VIPM, in combination with the approach above. This is a little more involved, as I'd need to write a script to handle different types of packages, and it requires packages to contain a package.json instead of a "spec" file.  So there would be additional work with that.

 

I saw a conversation (argument) about that at some point but I don't recall the conclusion. I think there are arguments for a global install as well as a local install, depending on the tool. For example I'm assuming nobody is arguing this should be project-specific.

 

On the other part (npm), I thought it was an interesting idea and did some looking around. Chocolatey (despite its stupid name) seems to be relatively popular, and its spec file is fundamentally the same as the VIPM spec file with some added features (like wildcards):

https://github.com/chocolatey/chocolateytemplates/blob/master/_templates/chocolatey/__NAME__.nuspec
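For a feel of how close the formats are, here is a hedged sketch of pulling the basic fields out of a nuspec manifest. The tag names follow the template linked above, but check a real file before relying on them:

```python
import xml.etree.ElementTree as ET

# Minimal nuspec reader: package id, version, and dependency ids.
# Real nuspec files carry an XML namespace, stripped here for simplicity.
def read_nuspec(xml_text):
    root = ET.fromstring(xml_text)
    for el in root.iter():
        el.tag = el.tag.split("}")[-1]   # drop any namespace prefix
    meta = root.find("metadata")
    deps = [d.get("id") for d in meta.iter("dependency")]
    return meta.findtext("id"), meta.findtext("version"), deps
```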

 

And then I stumbled across this thing called "OneGet", which is actually included in Windows 10 (under a different name, PackageManagement). It's basically a manager of other package managers and includes (a) a Chocolatey implementation and (b) an API for adding other package managers. It would require a little bit of C# coding, but in theory you could wrap up the VIPM API and call it from OneGet.

 

I don't have any real conclusion, except that there seem to be plenty of options out there. The challenge is some of the LabVIEW idiosyncrasies, like palettes and linker issues within the files, which is what VIPM handles for you (most of the time)...


I saw a conversation (argument) about that at some point but I don't recall the conclusion. I think there are arguments for a global install as well as a local install, depending on the tool. For example I'm assuming nobody is arguing this should be project-specific.

 

I believe that the "wrongness" that @odoylerules mentioned isn't so much about having a global install, but rather having packages added to C:\Program Files\, which goes against current Windows security principles.

 

C:\Program Files\ is a place of restricted security, intended mainly for an official installer to place the files required to run a program, like .exe files, DLLs, and official resources. User-modifiable and third-party files (e.g. config files, example code, third-party development libraries) would ideally go somewhere like C:\ProgramData\National Instruments\LabVIEW 2015\user.lib\


 

 

I believe that the "wrongness" that @odoylerules mentioned isn't so much about having a global install, but rather having packages added to C:\Program Files\, which goes against current Windows security principles.
 

 

Well, partly yes, I do have an issue with this.  However, I also have an issue with global dependencies, and honestly there may not be a way around that, given how LabVIEW was designed.  I don't have as much experience with LabVIEW as a lot of people here, so I'm sure there are lots of use cases where it might be needed.

 

However, I think there are two types of LabVIEW "packages".  One would be IDE extensions, such as the G-code manager that you linked.  These types of packages work well with VIPM and are good for extending the functionality of the LabVIEW IDE.  However, I think project-specific dependencies, such as OpenG, should live in a subfolder of the project.

 

This would basically mean having a separate copy of OpenG for every project you are working on.  Personally, I think this makes sharing code a lot easier than running a VIPC file every time you change projects.  It compartmentalizes project-specific dependencies away from the IDE/Program Files location and into a specific place.  It also prevents one project from touching the dependencies of another.

 

There may be more downsides than not, I'm not sure, but I think it's a better approach than global dependencies.

Edited by odoylerules


 

This would basically mean having a separate copy of OpenG for every project you are working on.  Personally, I think this makes sharing code a lot easier than running a VIPC file every time you change projects.  It compartmentalizes project-specific dependencies away from the IDE/Program Files location and into a specific place.  It also prevents one project from touching the dependencies of another.

 

There may be more downsides than not, I'm not sure, but I think it's a better approach than global dependencies.

 

I like the idea of having separate project-based configurations stored on the local machine and being able to switch quickly between them.  Would there be a way to "trick" VIPM/LabVIEW into pointing to project-specific user.lib/vi.lib directories using symbolic links?
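A minimal sketch of that symlink trick, with all paths as examples only. Note that on Windows, creating directory symlinks typically requires elevated privileges (a directory junction made with mklink /J is an alternative):

```python
import os

# Point a user.lib path at a project-specific dependency folder via a
# symbolic link, so switching projects just means switching the link.
def link_user_lib(user_lib_path, project_deps):
    if os.path.islink(user_lib_path):
        os.remove(user_lib_path)        # unhook the previous project's link
    elif os.path.isdir(user_lib_path):
        raise RuntimeError("user.lib is a real directory; move it aside first")
    os.symlink(project_deps, user_lib_path, target_is_directory=True)
```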


Well, partly yes, I do have an issue with this.  However, I also have an issue with global dependencies, and honestly there may not be a way around that, given how LabVIEW was designed.  I don't have as much experience with LabVIEW as a lot of people here, so I'm sure there are lots of use cases where it might be needed.

 

However, I think there are two types of LabVIEW "packages".  One would be IDE extensions, such as the G-code manager that you linked.  These types of packages work well with VIPM and are good for extending the functionality of the LabVIEW IDE.  However, I think project-specific dependencies, such as OpenG, should live in a subfolder of the project.

 

This would basically mean having a separate copy of OpenG for every project you are working on.  Personally, I think this makes sharing code a lot easier than running a VIPC file every time you change projects.  It compartmentalizes project-specific dependencies away from the IDE/Program Files location and into a specific place.  It also prevents one project from touching the dependencies of another.

 

There may be more downsides than not, I'm not sure, but I think it's a better approach than global dependencies.

This is really a source code control issue rather than an installer issue. The two are related, in that an installer usually installs a particular dependency version, but the goals are different. Given NI's refusal to give us a proper source control system that is fit for purpose, VIPM is the best of a bad situation for simple version control, but it isn't really a version control tool any more than an RPM package installer (on which it is based) is.


This is really a source code control issue rather than an installer issue. The two are related, in that an installer usually installs a particular dependency version, but the goals are different. Given NI's refusal to give us a proper source control system that is fit for purpose, VIPM is the best of a bad situation for simple version control, but it isn't really a version control tool any more than an RPM package installer (on which it is based) is.

 

Having been one of the initial co-developers of the OGP file format and spec file: I wasn't really looking at RPM closely enough to say that it was based on it, but we did take inspiration from the general RPM idea and tried to come up with something that worked for LabVIEW. The format itself was in some ways based more on the old application builder configuration file than on any specific package manager format.

 

As for creating an alternative to VIPM, I would consider it almost an effort in vain. The OpenG Commander was a good thing back in the pre-LabVIEW 8 days and worked fairly well for that situation, but the new project format, LabVIEW libraries and classes, and the various other file formats introduced with LabVIEW 8.0 and later really are a very different breed in many ways. Also, VIPM really is a combination of the OpenG Commander, OpenG Package Builder, and OpenG Application Builder, severely enhanced to handle the new file types too, which is a very complicated process and requires quite a few undocumented VI Server methods, some of which changed between versions, so it's hard to support more than two or three LabVIEW versions at all.

 

As for saying that NI doesn't provide a good source code control solution, that is going in the wrong direction. I haven't seen many software developer tools that come with source code control from the same manufacturer and actually work well in any way. Microsoft couldn't pull off that trick, and there is nothing that makes me believe NI has even nearly as many resources available as MS.

 

There are ways to use source code control with LabVIEW. Not ideal ones, but there aren't any ideal solutions as far as I'm concerned, even outside of LabVIEW. LabVIEW has a few extra pain points, as some of its files are big nasty binary blobs that none of the source code control tools can handle efficiently, although all of the modern ones can at least handle them. The more modern XML-based files, while text-based, have the problem that just about every source code control system will fail to handle them consistently, since simple text-based merging is not enough. And context-based merging is still in its infancy; it doesn't even work well for other XML-based files with a fully known schema. But turning around and requiring LabVIEW files to be in a format that can be easily merged automatically is also not really realistic.


As for creating an alternative to VIPM, I would consider it almost an effort in vain. The OpenG Commander was a good thing back in the pre-LabVIEW 8 days and worked fairly well for that situation, but the new project format, LabVIEW libraries and classes, and the various other file formats introduced with LabVIEW 8.0 and later really are a very different breed in many ways. Also, VIPM really is a combination of the OpenG Commander, OpenG Package Builder, and OpenG Application Builder, severely enhanced to handle the new file types too, which is a very complicated process and requires quite a few undocumented VI Server methods, some of which changed between versions, so it's hard to support more than two or three LabVIEW versions at all.

I only half agree with this. To replicate VIPM, yes. To make an installer, no. (Are you still using the modified OpenG Packager?)

 

As you are probably aware, I use a wizard-style installer for SQLite API for LabVIEW because VIPM can't handle selective 32/64-bit installs. It can also integrate into a global "package manager", which is SQLite-based rather than driven by the "spec" ini file. In fact, it was the reason I added the capability to import an ini file to the SQLite API for LabVIEW: so as to import the VIPM spec files ;) The downside to the wizard is the size of the deployment, but it suffices for isolated installs, and it can be retrospectively integrated into the global manager. The upside, however, is that, in addition to the standard pages, you can add custom pages for your install, and that makes it much more flexible, if a little more effort.

 

So the wizard has all the bits and pieces to install (palette menus, recompiling, etc.), and I did start to make a Wizard Creator where the installer was an example. Just one of my many half-finished projects.

 

The global manager (which is the other side of VIPM) would take a lot to productionise, and I'm not currently interested in stepping on anyone's toes. It works great for me, and I don't have to produce help documentation, which takes me longer than writing the software :P. It will ultimately also handle my own licensing scheme (which we have discussed previously), but even then it will be limited to only my products, as there are a few now.

 

So yes. It is totally doable (and has been done) with enough effort, but I would do it slightly differently to VIPM.


I only half agree with this. To replicate VIPM, yes. To make an installer, no. (Are you still using the modified OpenG Packager?)

 

VIPM can't handle selective 32/64-bit installs. It can also integrate into a global "package manager", which is SQLite-based rather than driven by the "spec" ini file.

 

Yes, I'm using a somewhat enhanced OpenG Packager version for my own work. It doesn't support many things I would like, but I haven't really needed them so far, and it allows simple support for selective installs. It doesn't support relocating, relinking, and password-protecting VIs, as that was really part of what the OpenG Application Builder was about; only that part is very much out of date, as it doesn't support 8.0 and newer file types.


I suggest changing your workflow.

 

No thanks. That's putting the cart before the horse. Software works *for* me, to make *my* workflow easier/simpler/more productive. :D If some software doesn't work for me then I'll write some that does - even in another language if the fancy takes me :P

 

Are you really suggesting that every project should be in its own VM? :o That would mean I would have 523 VMs (and counting) just for Windows, without even considering Linux, Mac, RT, VxWorks, or test machines. :throwpc:


13 hours ago, ShaunR said:

Are you really suggesting that every project should be in its own VM? :o That would mean I would have 523 VMs (and counting) just for Windows, without even considering Linux, Mac, RT, VxWorks, or test machines. :throwpc:

It's the workflow I use and it works for me. If I were juggling 523 projects then perhaps my workflow would be different. I currently have about 10 active projects, but maybe 3-4 hot ones. Lots of hard drive space, most of it SSDs, so it's fine. As for test VMs, you just need one per OS. It's a judgement call. Perhaps you have one VM that you use for several projects that share the same LV version and reuse libraries - who knows. How do you handle the different NI hardware driver versions that come out every 6 months?

We can agree that LabVIEW sucks in this regard. No news there. Using VMs is "Putting the cart before the horse". Sure, but I'm done with creating large scale infrastructure tools to fix some other company's inactivity. Some folks younger than me with more energy to spare can do that - I'm good. I'd rather spend the development effort on my customers, which gives me greater reward and returns. Both emotionally and financially.

Don't get me wrong. I still build tools and reuse libraries. But most likely not a VIPM-type of tool or add-on.



Michael, one of the biggest problems for me with using a VM per project is: how do you manage physical offsite backup of VMs?

6 minutes ago, Neil Pate said:

Michael, one of the biggest problems for me with using a VM per project is: how do you manage physical offsite backup of VMs?

How do you manage physical offsite backup of your PC? With VMs it's actually much easier, since they're just files. On a Mac, which I use, each VM looks like a single file. I even run VMs on external hard drives, which makes them much easier to move around and back up.

Do you use a cloud backup service?

This is probably way off-topic.


Off-topic indeed, but I would love to chat about it ;)

I generally don't back up my "PC", i.e. the OS etc. I know I should; I just never get around to it. All my dev stuff is in Bitbucket (so offsite) and on a NAS on site. I try not to go more than a day without pushing my changes out to these backup sites. I use VMs for old versions of LabVIEW and for testing purposes.

(Even more off-topic: I use CrashPlan offsite backup for my personal stuff like photos and videos.)

In London I was pretty happy with 150 Mbps broadband, and still that is way too slow to regularly upload multi-gig VMs. I never got sophisticated enough to do things like rsync with my VMs, so they just ended up being stored locally on a big mechanical HDD. Now that I am living somewhere the broadband is not so good (1 Mbps), I really, really cannot sync VMs.

My question is just this: how do you sync your VMs offsite? Is there some clever stuff going on in OS X that diffs a VM container so it only uploads the small changes?

10 hours ago, Michael Aivaliotis said:

It's the workflow I use and it works for me. If I were juggling 523 projects then perhaps my workflow would be different. I currently have about 10 active projects, but maybe 3-4 hot ones. Lots of hard drive space, most of it SSDs, so it's fine. As for test VMs, you just need one per OS. It's a judgement call. Perhaps you have one VM that you use for several projects that share the same LV version and reuse libraries - who knows. How do you handle the different NI hardware driver versions that come out every 6 months?

We can agree that LabVIEW sucks in this regard. No news there. Using VMs is "Putting the cart before the horse". Sure, but I'm done with creating large scale infrastructure tools to fix some other company's inactivity. Some folks younger than me with more energy to spare can do that - I'm good. I'd rather spend the development effort on my customers, which gives me greater reward and returns. Both emotionally and financially.

Don't get me wrong. I still build tools and reuse libraries. But most likely not a VIPM-type of tool or add-on.

You are really talking about per-customer projects (AKA per order), rather than projects in general. I envisaged a VM for every LabVIEW project file, however trivial, so even something like the Spell Check that I recently created would have had a VM, which would be silly. A VM per customer project doesn't resolve versioning updates. It just compartmentalises and freezes the customer software as a deployment. When you up-issue a component in the customer's software or fix a bug, do you then create a new VM? I doubt it. You probably use the snapshot feature as version control for that customer's entire system.

Whilst a VM is a good method for snapshotting, VIPM is an installer, and mainly for toolkits rather than entire projects. So it's not really solving the same problems. When you create a VM you use the NI installer for LabVIEW, right? After that, how do you get all your toolkits on there? VIPM, right? Once they're on there, you write your solution and then snapshot the VM so you can reconstitute it as a whole, right down to the specialist instrument drivers that were written in China by a blind man with a lisp. :) But that VM can only be used for that customer's project; otherwise you get into huge problems with licensing and copyright.

Most of the software toolkits and helper apps that I have written were not originally intended as such. I have a lot of stuff written over the years - snippets, scripts, and miscellaneous programs - and much of what you see on my website is either the productionising of something I wrote an age ago or a grouping of already-written snippets into something more friendly. This is the case with the Project Probe and the Install Manager: I had all the bits and pieces but no easy interface for them, and I got tired of opening 10 projects and running them manually from the dev environment. The Wizard was a little different, in that it started as a cross-platform installer when VIPM was Windows-only. So that was a definite need, but today, for this thread, it is already written, with many of the bits and pieces that would be necessary.

Edited by ShaunR


What we have settled on is moving non-native libraries into our project directory (which is under source code control).  I've just worked on too many legacy projects with missing dependencies.

I had one recently where the original dev company was out of business and the company that wrote the library (for their hardware) was also out of business.  I wound up having to redo a communication module for a different piece of hardware, over about 40 hours, when it should have taken less than 2 to reinstall the source and compile for a new machine.  More business for me, I guess, but there have been enough other occasions where I, or some colleague, have lost hours to dependency hunting that I find it more than worth it to keep a project as encapsulated and portable as possible.  And we don't even do that much multi-developer work; I imagine it would be even more important then.  I would love to have a "project.lib" directory to support palette menus out of local directories, but at the end of the day it's not a great loss, since I'm using Quick Drop 95% of the time anyway.
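A sketch of that vendoring idea: copy a reuse library under the project so it travels with the source. The "project.lib" folder name is illustrative, not a LabVIEW convention:

```python
import os
import shutil

# Copy a reuse library into a project-local "project.lib" folder so the
# project carries its own dependencies under source control.
def vendor_library(library_dir, project_dir, lib_name):
    dest = os.path.join(project_dir, "project.lib", lib_name)
    if os.path.isdir(dest):
        shutil.rmtree(dest)             # replace any stale vendored copy
    shutil.copytree(library_dir, dest)
    return dest
```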

I tried using VMs a couple of years ago, especially for older legacy code from LV 6 and 7, but the performance wound up being abysmal.  Maybe it was my setup - we had sprung for VMware and had CPUs with the VT feature and tons of RAM - but it was still just appallingly slow. This was before I had an SSD, so maybe that would have helped; maybe I'll revisit it if any of our old legacy projects wake back up soon.  Oh, and OS licensing: my reading at the time led me to believe that if you weren't a Microsoft Certified Partner (who get unlimited installs based on their license), you were technically supposed to pay for an OS license for each separate VM.  It looks like that might have changed with Windows 10 "Software Assurance" licensing, but I haven't done any real research on that.

Mike


Even as a Microsoft Certified Partner you do (did?) not have an unlimited license allowance. After new versions of software are out, you are supposed to upgrade to the newest version within one year. The licenses for older versions then become invalid, and with them any VM image using them, even if it is just a backup.

Edited by rolfk


Can anyone give a link to Amazon or CDW to purchase the Microsoft Windows 7 license they run on their virtual machines (for a single developer/machine)? :frusty: You will be my hero and I will owe you multiple rounds of beers.  Or do people develop on Win7 VMs in the cloud, on Rackspace or AWS?

Or a tutorial on how to accomplish developer VMs/snapshots with a Windows 7 license?  Added bonus: LabVIEW version snapshotting/licensing.

I gave up on Windows VMs after getting stuck in an infinite loop in the Microsoft licensing vortex :throwpc:.

 


Hi bbean,

The only cost-effective and legitimate way we've found to run Windows on development VMs is through an MSDN account.  Purchasing info:

https://www.visualstudio.com/products/msdn-platforms-vs

That page also says:

"Visual Studio and MSDN licensing
For an overview of the Visual Studio 2013 product line, including MSDN subscriptions, and the licensing requirements for those products in common deployment scenarios, download the Visual Studio 2013 and MSDN Licensing white paper."

And, according to that white paper (i.e. "Visual Studio 2013 and MSDN Licensing Whitepaper - January-2015", page 14):

"User Licensing
Licensed for Design, Development, Testing, and Demonstrating Your Programs
All MSDN subscriptions and Visual Studio Professional are licensed on a per-user basis. Each licensed user may install and use the software on any number of devices to design, develop, test, and demonstrate their programs. MSDN subscriptions also allow the licensed user to evaluate the software and to simulate customer environments in order to diagnose issues related to your programs. Each additional person who uses the software in this way must also have a license.
What Software is Included and Downgrade Rights
For MSDN subscriptions, the software that is included is defined as any software that is available to the subscriber via MSDN Subscriber Downloads"

 



Bean,

Thanks for the response.  I clicked your link and then went down the MS black hole again, trying to find a reseller for the license.  A Google search turned up this possible purchase link:

Microsoft MSDN Platforms License & Software Assurance 1

Is that a similar license to what you are using?  

I may be hijacking the thread at this point.


Bean,

Yesterday, when I checked the "Find a Reseller" list, SWI was listed but the link was dead.  Today they're not in the list.  Maybe because SWI has been acquired by Crayon:

http://www.crayon.com/en/news-and-resources/crayon-acquires-software-wholesale-international/

According to an internal purchasing note, we bought the "MICROSOFT MSDN OPERATING SYSTEM LICENSE AND SOFTWARE" (not sure if that's what SWI called it) from SWI:

https://www.software-intl.com

Maybe give them or Crayon a call?  The unit price at the time was $450, which I'm thinking is per person per year.

-----------

Aside:  I'd like to know from others who advocate VMs for development whether they are paying the annual fees (ultimately to Microsoft) to be legally compliant, and if so, through what means (e.g. MSDN)?

-Bean

