vivante

Why does LV take a long time to build executables when many classes are used?


Hi,

I posted this question in the OOP forum, but maybe the right place is here... :)

Have you experienced LabVIEW taking a very long time to build executables when you use a certain number of classes? I have noticed that when my app uses many classes (e.g. 30 classes), the LabVIEW IDE slows down (the "busy pointer" appears very often during VI editing) and compiling executables takes more than half an hour.

It's not due to my machine: larger projects build in less time if they don't contain so many classes. Are there build properties that can speed things up?


This has become commonplace in my larger application, which also uses classes extensively. I'm not sure if it's the actual use of classes, the proliferation of VIs as you accumulate read/write accessors, or perhaps the dependency trees that get mapped out between the various classes. Regardless, any simple edit of a VI is followed by a second or so of "busy" cursor during which the IDE locks up for recompiling. I've been dealing with this so long I hardly notice anymore; editing VIs just has a built-in cadence where I make a change, wait, make a change, wait...

Build times for this application used to be about 45 minutes, but since 2011 SP1 they are down to 15 minutes or so. I have not noticed a large change in the "wait" duration from SP1, though.


Hmm... We see neither excessive build times nor long delays when editing class methods (except under special circumstances, such as making the first edit to just-deployed real-time code, but that has nothing to do with classes). The class properties dialog does still take too long to populate.

For the record, I just looked at one of the simpler RT controllers and it has more than 60 classes in the hierarchy. We are using LabVIEW 2011 presently.

(A very special case where we did have problems that did have lots of classes was really an issue with the DSC Module during the build process and we have a workaround for that. I think this is not an issue in 2011 anyway.)

One thought: we use many interfaces and a few implementations of the Factory Method pattern, which together dramatically reduce source-code interdependencies. Before we did this, deployments to RT from the development environment very often didn't even complete. Now those deployment issues are a thing of the past.
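For readers less familiar with the pattern: the point of the Factory Method here is that callers only ever reference an abstract interface, and a single factory function maps a name to a concrete class, so the dependency tree stays shallow. A minimal sketch in Python (LabVIEW code is graphical, so this is only a text-language analogue; all class and function names are illustrative, not from any real project):

```python
from abc import ABC, abstractmethod

# Abstract interface: callers depend only on this, never on a concrete class.
class Instrument(ABC):
    @abstractmethod
    def read(self) -> float: ...

class SimulatedDMM(Instrument):
    def read(self) -> float:
        return 0.0  # canned value for testing without hardware

class SerialDMM(Instrument):
    def read(self) -> float:
        # real hardware I/O would go here
        return 1.0

# Factory method: the only place that knows about concrete implementations.
def make_instrument(kind: str) -> Instrument:
    registry = {"sim": SimulatedDMM, "serial": SerialDMM}
    return registry[kind]()

dmm = make_instrument("sim")
print(dmm.read())  # -> 0.0
```

Adding a new implementation then only touches the new class and the registry, not every caller, which is the interdependency reduction described above.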


One thought: we use many interfaces and a few implementations of the Factory Method pattern, which together dramatically reduce source-code interdependencies.

I tend to agree with this. In the past I experienced the editing delay mje mentioned; however, it hasn't been an issue for me for quite some time. I like to think it's because my coding has gotten better. :lol:


I agree with both of you that a properly engineered application can grow quite large without running into trouble; I have other applications approaching the size of this "larger" application of mine that show no signs of this delay while I edit them. However, that doesn't change the fact that dealing with this older code base is a pain, and I'm not about to re-engineer it from the ground up given the time constraints I'm under.

To be quite honest, if developing large applications in LabVIEW requires a near-perfect architecture for the IDE to behave well, the IDE has issues. I'm not saying my application is without fault (far from it), but in no way should the blame for poor IDE performance fall on a particular code base's architecture. The fact that IDE slowdowns can be avoided with full adherence to good programming practices is irrelevant, because you can't always force such practice onto a project. The slowdowns arise from core design decisions made long ago by NI, and it is their job to address them.

Realize this isn't a defensive post: I'm not taking offense at your statements or trying to provoke a flame war. I'm wholeheartedly on board with large-application development in LabVIEW, but as long as the slowdown issue is common, it will be noticed by programmers and will continue to contribute to a stigma against LabVIEW for large applications.

Couldn't have said it better myself :)

Applications that I work on are typically in the region of 5,000-10,000 VIs (no classes generally, though a couple have crept in now and again). Compile times are at most an hour, and the IDE only slows down on XNodes (like regex). I think if I were to do the same with classes, you could quadruple the number of VIs and the compile times (anecdotally, compile time "seems" to correlate fairly linearly with the number of VIs).

Large applications? Yup. Machine control is about as large as it gets.


The slowdowns arise from core design decisions made long ago...

My impression is that the edit-time slowdowns are tied more to the desired user experience than to LabVIEW's source-code design decisions. I can't speak with authority, though... it's just my impression. In theory we could give up instantly recompiled VIs and a host of other conveniences in exchange for snappier editing regardless of source-code size or structure. There have been many times when I was ready to do just that. Still, I doubt NI is willing to make that trade-off.


My observation is that when there are many classes in a project, the LabVIEW IDE shows the busy cursor more and compilation time increases, much more than in projects that have the same number of VIs (or more) but no classes. My supposition is that OOP was grafted onto LabVIEW relatively recently, and the IDE behaves differently when we use it. What is your opinion?


I came across another kind of slowdown a few years back. I inherited a machine-control application that consisted of one main VI over 10 MB in size and a whole bunch of subVIs doing stupidly little. The main VI consisted of several stacked sequence structures with up to 100 frames, and almost every frame contained a case structure that was only enabled based on a global string that contained, in fact, the next state to execute. Basically it was a state machine with the sequence frames constituting groupings of states, and to get to one particular state LabVIEW had to run through every sequence frame, most of them doing nothing since the case structure inside did not react to the current state. Talk about a state machine turned inside out and upside down, and then again some.

Selecting a wire, node, or subVI in that diagram was slooooooooooow; each of these actions caused a delay of several seconds. Moving a node was equally slow, as each move caused a similar delay. It was a total pain to do anything in that application, and it needed work, as it had several bugs.

I invested several days restructuring the architecture into a real state-machine design, placing several logical sub-state-machines into their own subVIs and removing about 99% of the several thousand globals. Once I had done that, editing got snappy again. The main VI had been reduced to about 1 MB, and the different sub-state-machine VIs together took maybe another 4 MB of disk space. Replacing all the globals with a few shift registers had slashed the disk footprint of the VIs almost in half. I did make the driver VIs a bit smarter by moving some of the logic into them instead of copying the same operation repeatedly in the main VI, but getting rid of the enormous number of sequence frames, as well as the many globals that were mostly just a workaround for the missing shift registers, both got the disk size down a lot and made editing a useful operation again instead of simply an annoyance.

Surprisingly enough, the runtime performance of the original application wasn't really bad, but the redesigned system hardly ever got over 5% CPU usage, even when it was busy controlling all the serial devices and motion systems.

How the original programmer was ever able to finish his system with such an architecture and the horrible edit delays on almost every single mouse move, I can't imagine. He surely wasn't a LabVIEW programmer, and I later learned he had only done Visual Basic applications before.


What is your opinion about it?

My opinion is that I am completely unqualified to make any kind of assertion as to the cause of the slowdown; I don't have sufficient knowledge of how LabVIEW is implemented under the hood. I personally don't believe it is entirely related to using LVOOP. As I recall, the one project I worked on many years ago that had the most problems with IDE slowdowns didn't use LVOOP; it was developed using procedural methods and implemented as a QSM. I believe the slowdown is mostly related to the user's source code.

To expand a bit on my earlier post, I disagree with mje's argument that the ability to prevent IDE delays with good architecture is "irrelevant because you can't always force such practice into a project." It essentially makes NI solely responsible for fixing or preventing problems created by the end user's software implementation. NI is limited in the problems it can "fix" with LabVIEW source-code changes by finite computing resources and its desired user experience. I don't think it is fair to burden NI with the task of keeping the IDE snappy in all situations regardless of the source code we are working on. (And I suspect the restrictions placed on developers to ensure IDE responsiveness would be met with howls of outrage.)

To my eyes the issue of IDE responsiveness is mainly one of knowledge. There are one or more constraints being violated that cause the IDE delays. Obviously some of those constraints are not widely known to developers, and some may not even be known to NI. When the constraints are not known, we cannot design around them, and we end up frustrated by IDE delays that appear with no apparent cause. On the other hand, we develop code within known constraints all the time without demanding fixes from NI. Most developers understand and accept that there will be IDE issues with very large VIs like the one Rolf described. That limitation is a natural consequence of a user experience that requires on-the-fly compilation. We may not like the delay, but we accept it as a result of poorly written code, not a flawed IDE.

I agree it is NI's responsibility to understand the constraints and to tell us which ones we are violating when we experience IDE delays. If the constraint is caused by a bug they can fix, great! If it's because we are unintentionally placing unreasonable expectations on the IDE, then it's up to us to adopt practices that work within the constraints (or move to another platform without them).


I agree it is NI's responsibility to understand the constraints and to tell us which ones we are violating when we experience IDE delays. If the constraint is caused by a bug they can fix, great! If it's because we are unintentionally placing unreasonable expectations on the IDE, then it's up to us to adopt practices that work within the constraints (or move to another platform without them).

[sarcasm]

I recommend assembler for that! It has no IDE-enforced edit-time delays, since it only needs the simplest text editor you can envision. Never mind the rest of the experience. :D

[/sarcasm]


It has no IDE-enforced edit-time delays...

Sure, if you can manage to get the IDE to load the code in the first place. There was this one time I was trying to work on a 3 GB assembly source code file... ;)


This has become commonplace in my larger application, which also uses classes extensively. [...] Build times for this application used to be about 45 minutes, but since 2011 SP1 they are down to 15 minutes or so.

Are you using the property-node implementation of the read/write accessors? If so, then I believe this slowdown is what we found and reported in CAR 313044, which is published in the known issues. This was a specific bug, not a problem with the overall architecture.

I'll give a bit more information on what Paul commented on about build time with classes (CAR 316145). With DSC installed, we give the user the option of including custom I/O servers in a build. Right now the only way to see whether there are custom I/O servers is to load every library in the project and check whether it contains one. We were previously loading all libraries (e.g. classes and XControls) to check, only to unload them again. One quick workaround (if possible) is to deactivate DSC, since we won't check for custom I/O servers if it isn't activated.


I have also noticed that building with LV 2010 (32-bit) on Windows 7 64-bit takes a huge amount of time; if I move the project to an older PC with Windows 7 32-bit, the build time is many times shorter. Have you had the same experience?


Are the installations exactly the same on both machines? As I mentioned before, if you have DSC activated we do an additional check that requires loading libraries. There could be other things I'm not aware of, but that is definitely one possibility.

Another thing to look at is the single-core processing power of the machines. Some older machines have a higher CPU speed with only one core, while a newer machine may have multiple, lower-clocked cores. Building an application is single-threaded because of the way our compiler works, so if your older machine has a higher speed on one CPU, that could make a difference, although I wouldn't expect it to be huge.

How much shorter is your build?


Hi Jon,

  • Windows 7 32-bit on a three-year-old notebook (dual-core Centrino): about 9-12 minutes
  • Windows 7 64-bit on a DELL Precision T5500 (Xeon X650, 12 GB RAM): 28 minutes

There are other (CPU-intensive) applications on both systems, and their performance is as expected on these machines.

Only LabVIEW is extremely slow. (I notice the difference when I open projects, do "Save all", and use other commands that involve many files, like building executables.)


Hmm... "commands which involve many files" screams virus-protection software to me. Are all of the problem areas you see doing disk I/O? I've seen cases where building a large project in Visual Studio is slowed down by a factor of 2 or 3 with virus protection on. Can you try disabling it, or any file-indexing programs you may have on these machines? If it is virus protection, you could try adding exceptions for the folders you commonly build to.


I've disabled file indexing and virus protection: nothing changed. At this point I'll wait for LV 2012; perhaps it performs better on a 64-bit OS. Thanks to everybody for your patience!


Editor performance while doing saves or folder moves is very slow. IDE/dependency management in LabVIEW with classes could use some templates or guidelines.

Editor performance while doing saves or folder moves is very slow. IDE/dependency management in LabVIEW with classes could use some templates or guidelines.

 

One thing that hits LabVIEW saves and loads quite badly is certain virus scanners. They seem to want to intercept every single LabVIEW disk access, and that can add up to really long delays and totally maxed-out cores. Some virus scanners are worse than others, but all of them have some influence.

 

Another reason for long delays when trying to open a file can be network paths in the Explorer workspace (either mapped to a drive letter or simply as network paths). If some of these paths are not currently available, Windows tends to run into timeout delays in various file-access routines, including opening the file dialog box. For some reason Windows seems to enumerate the whole desktop workspace repeatedly on such actions and queries all those paths, and if the network server behind them is slow to respond or unavailable, that enumeration can take several seconds or more.

One thing that hits LabVIEW saves and loads quite badly is certain virus scanners. They seem to want to intercept every single LabVIEW disk access, and that can add up to really long delays and totally maxed-out cores. Some virus scanners are worse than others, but all of them have some influence.

 

In the same vein, services like Dropbox can also prove problematic, though not quite as intrusive. I sync many projects to my laptop via Dropbox, and during builds LV generates files, then wants to delete and rename them. The problem is that my machine/internet/Dropbox service is quick enough that about 75% of the time it accesses a file during the build process, which generates an error or just slows things down in general.

In the same vein, services like Dropbox can also prove problematic, though not quite as intrusive. I sync many projects to my laptop via Dropbox, and during builds LV generates files, then wants to delete and rename them. The problem is that my machine/internet/Dropbox service is quick enough that about 75% of the time it accesses a file during the build process, which generates an error or just slows things down in general.

I had so many problems with Dropbox (and SugarSync) trying to sync source/builds across machines that I gave up completely and now just rely on my VCS.

