jgcode Posted August 3, 2010
"I don't particularly like how that option is worded, but that's my understanding of what it's supposed to mean."
Damn, I was way off with my interpretation.
crossrulz Posted August 3, 2010
"Damn, I was way off with my interpretation."
So was I. Luckily Christina set me straight in her blog post.
Daklu Posted August 3, 2010
Thanks Darren and Crossrulz for clearing that up.
SuperS_5 Posted July 28, 2011
Every time I have seen this problem, the computer did not have enough RAM and the swap file was heavily used. Upgrading the system RAM solved the problem. If you cannot upgrade the system RAM and have Windows 7, ReadyBoost can be used to augment the system RAM. I currently have 8 GB of RAM (I was running out at 6 GB).
Thomas Robertson Posted yesterday at 01:11 AM
Just wanted to chime in on this zombie thread and say I have all of these problems and it's driving me crazy. LV2023, roughly 18,000 VIs in the project.
Neil Pate Posted yesterday at 08:04 AM
@Thomas Robertson 18k VIs? OK, that is quite a big project. Have you tried splitting things up, maybe introducing a few PPLs?
hooovahh Posted yesterday at 12:50 PM
11 hours ago, Thomas Robertson said: "Just wanted to chime in on this zombie thread and say I have all of these problems and it's driving me crazy. LV2023, roughly 18,000 VIs in the project."
My projects can be on that order of size, and editing can be a real pain. I pointed out the difficulties in QD responsiveness to Darren, and he suggested looking for and removing circular dependencies in libraries and classes. I think it helped, but not by much. Going to PPLs isn't really an option since so many of the VIs are in reuse packages, and those packages are intended to be used across multiple targets, Windows and RT. This has a cascading effect: linking to things means they need to be PPLs built for that specific target, and then the functions palette needs to be target specific to pull in the right edition of the dependencies. AQ mentioned a few techniques for target-specific VI loading, but I couldn't get it to work properly for the full project.
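For anyone wanting to hunt for the circular dependencies Darren suggested removing, the check itself is just cycle detection on a call graph. The sketch below is not the tool he referred to; it is a hypothetical Python helper that assumes you have already exported a library-level caller/callee listing (for example with a small VI Server scraping VI) into a CSV named library_calls.csv, where each row is "caller_library,callee_library".

import csv
from collections import defaultdict

def load_edges(path):
    # Build a directed graph: library -> set of libraries it calls into.
    edges = defaultdict(set)
    with open(path, newline="") as f:
        for caller, callee in csv.reader(f):
            if caller != callee:  # calls inside one library are not cross-library cycles
                edges[caller].add(callee)
    return edges

def find_cycle(edges):
    # Depth-first search; returns one library-level cycle if any exists, else None.
    visiting, done = set(), set()
    def visit(node, stack):
        visiting.add(node)
        stack.append(node)
        for nxt in edges.get(node, ()):
            if nxt in visiting:  # back-edge: we have walked into a cycle
                return stack[stack.index(nxt):] + [nxt]
            if nxt not in done:
                found = visit(nxt, stack)
                if found:
                    return found
        visiting.discard(node)
        done.add(node)
        stack.pop()
        return None
    for start in list(edges):
        if start not in done:
            found = visit(start, [])
            if found:
                return found
    return None

if __name__ == "__main__":
    cycle = find_cycle(load_edges("library_calls.csv"))
    print(" -> ".join(cycle) if cycle else "No library-level cycles found")

Any cycle it prints (for example "Messaging.lvlib -> Logging.lvlib -> Messaging.lvlib", names purely illustrative) is a candidate for breaking apart, either by moving the shared VIs into a third library or by replacing the back-reference with an interface or messages.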
Softball Posted 21 hours ago (edited)
Hi,
NI has always tried to 'optimize' the compiler so the code runs faster. In LabVIEW 2009 they introduced a version where the compiler would do extra work to try to inline whatever could be inlined. 2009 was a catastrophe, with the compiler running out of memory on my complex code, and NI only saved their reputation by introducing the hybrid compiler in 2010 SP1. Overall it was smooth sailing thereafter, up to and including 2018 SP1. NI changed something in 2015, but its effect could be ignored if this token was included in the LabVIEW.ini file: EnableLegacyCompilerFallback=TRUE.
In LabVIEW 2019 NI again decided to do something new: they ditched the hybrid compiler. It was too complex to maintain, they argued. 2019 reminded me somewhat of the 2009 version, except that the compiler now did not run out of memory; instead, editing code was painfully slow and sometimes LabVIEW simply crashed. NI improved on things in the following versions, but it has yet to be snappy (~ useful) with my complex code.
Regards
Edited 21 hours ago by Softball (clarification)
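For reference, and going purely by Softball's description, the token goes into the LabVIEW.ini file that sits next to LabVIEW.exe, which uses plain key=value entries under a [LabVIEW] section, so the entry would look like this:

[LabVIEW]
EnableLegacyCompilerFallback=TRUE

Per the post above, it only made a difference in the versions that still shipped the hybrid compiler (roughly 2015 through 2018 SP1).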
ShaunR Posted 20 hours ago
58 minutes ago, Softball said: "NI has always tried to 'optimize' the compiler so the code runs faster. [...]"
I still use 2009 - by far the best version. Fast, stable and quick to compile. 2011 was the worst and 2012 not much better. If they had implemented a benevolent JSON primitive instead of the strict one we got, I would have upgraded to 2013.
Softball Posted 3 hours ago
Hi again,
The topic has been discussed in the past and is even mentioned in the manual:
Discussion (even AQ was active then, in 2016): https://forums.ni.com/t5/LabVIEW/Compiler-optimisations-and-IPE/m-p/3302614
Manual: https://www.ni.com/docs/en-US/bundle/labview/page/choosing-between-editor-responsiveness-and-vi-execution-speed.html
My story about a slow editing experience and what was done over time may help others either avoid creating large systems or stick with LabVIEW versions that can handle them.
Regards
Thomas Robertson Posted 1 hour ago
After consulting some co-workers, I reverted recent changes to my project and now everything is fast again. They reported similar observations when they had similar problems. Our theory is that as things change, something in the project gets out of whack - maybe its sense of dependencies, I don't know - and gets saved into the project file. Anyway, I'm back to normal-speed editing.
As for breaking apart the 18k-VI project, we've discussed it for a decade now. We have plans to try to decouple one part of the code from the rest, because it seems to be a major culprit for why everything relies on everything else. But this decoupling effort feels like trying to boil the ocean. We're dealing with a 30-year-old code base, and lots of the communication and subVI calling was defined long before there were libraries and public/private concepts. If we can ever get them more isolated, we might consider PPLs. I feel like those had issues with cross-platform compatibility (Windows/Mac), though.
crossrulz Posted 1 hour ago
7 minutes ago, Thomas Robertson said: "As for breaking apart the 18k-VI project, we've discussed it for a decade now. We have plans to try to decouple one part of the code from the rest, because it seems to be a major culprit for why everything relies on everything else. But this decoupling effort feels like trying to boil the ocean."
Yes, decoupling is a long process. I have been trying to do that with the Icon Editor - not right now, as other priorities in that project are more important. Do the process slowly and make deliberate efforts. You will eventually see the benefits.