Popular Content

Showing content with the highest reputation since 11/15/2019 in Posts

  1. 6 points
    View File Hooovahh's Tremendous TDMS Toolkit This toolkit combines a variety of TDMS functions and tools into a single package. The initial release has a variety of features: - Classes for Circular, Periodic, Size, and Time of Day TDMS generation, with examples of using each - Reading and writing clusters into TDMS channels - XLSX conversion example - File operations for combining files, renaming, moving, and saving in memory to zip - Basic function for splitting a TDMS file into segments (useful for corrupt files) - Reorder TDMS Channel with demo There is plenty of room for improvement, but I wanted to get this out there and gauge interest. The variety of classes, along with VIMs and class adaptation, makes them easier to use. If I get time I plan on making some blog posts explaining some of the benefits of TDMS, along with best practices. Submitter hooovahh Submitted 12/12/2019 Category *Uncertified* LabVIEW Version 2018 License Type BSD (Most common)
  2. 6 points
    All of the presentations are now on the LabVIEW Wiki. You can find them at: https://labviewwiki.org/wiki/Americas_CLA_Summit_2019 Thanks to Kevin Shirey and Mark Balla for producing the videos, and to all those who volunteered to run the cameras. This is an awesome resource for re-watching and reviewing these great presentations, and for those who couldn't join us in person to view them as well.
  3. 3 points
    No, classic LabVIEW doesn't, and it never will. It assumes a string to be in whatever encoding the current user session has. For most LabVIEW installations out there, that's codepage 1252 (over 90% of LabVIEW installations run on Windows, and most of them on Western Windows installations). When classic LabVIEW was developed (around the end of the 1980s), codepages were the best thing out there that could be used for different installations, and Unicode didn't even exist. The first Unicode proposal is from 1988 and proposed a 16-bit Unicode alphabet. Microsoft was in fact an early adopter and implemented it for its Windows NT system as a 16-bit encoding based on this standard. Only in 1996 was Unicode 2.0 released, which extended the Unicode character space to 21 bits. LabVIEW does support so-called multibyte character encodings as used for many Asian codepages, and on systems like Linux, where nowadays UTF-8 (in principle also simply a multibyte encoding) is the standard user encoding, it supports that too, as this is transparent in the underlying C runtime. Windows doesn't let you set your ANSI codepage to UTF-8, however; otherwise LabVIEW would use that too (although I would expect that there could be some artefacts somewhere from assumptions LabVIEW makes when calling certain Windows APIs that might not match how Microsoft would have implemented the UTF-8 emulation for its ANSI codepage). By the time the Unicode standard was mature and the various implementations on the different platforms were more or less working, LabVIEW's 8-bit character encoding based on the standard encoding was so deeply ingrained that full support for Unicode had turned into a major project of its own. There were several internal projects to work towards that, which eventually turned into a normally hidden Unicode feature that can be turned on through an INI token.
The big problem with that was that the necessary changes touched just about every piece of code in LabVIEW somehow, and hence this Unicode feature does not always produce consistent results for every code path. There are also many unsolved issues where the internal LabVIEW strings need to connect to external interfaces. Most instruments, for instance, won't understand UTF-8 in any way, although that problem is one of the smaller ones, as the used character set is usually strictly limited to 7-bit ASCII, and there the UTF-8 standard is basically byte-for-byte compatible. So you can dig up the INI key and turn Unicode on in LabVIEW. It will add extra properties for all control elements to set them to use Unicode text interpretation for almost all text (sub)elements, but the support doesn't, for instance, extend to paths and many other internal facilities unless the underlying encoding is already set to UTF-8. Also, strings in VIs, while stored as UTF-8, are not flagged as such, since non-Unicode-enabled LabVIEW versions couldn't read them, creating the same problem you have with VIs stored on a non-Western-codepage system and then read on a system with a different encoding. If Unicode support is an important feature for you, you will want to start to use LabVIEW NXG. And exactly because of the existence of LabVIEW NXG, there will be no effort put into classic LabVIEW to improve its Unicode support. To make it really work you would have to substantially rewrite large parts of the LabVIEW code base, and that is exactly what one of the tasks for LabVIEW NXG was about.
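The ASCII compatibility point above is easy to demonstrate outside LabVIEW. Here is a minimal sketch in plain Python (not LabVIEW code, just an illustration of the encoding behavior) showing that 7-bit ASCII bytes are valid UTF-8 as-is, while non-ASCII text stored under codepage 1252 is not:

```python
# The instrument-communication case: a typical SCPI command is pure
# 7-bit ASCII, and ASCII bytes are identical whether encoded as ASCII
# or as UTF-8 - which is why UTF-8 is harmless for such instruments.
ascii_cmd = "*IDN?"
assert ascii_cmd.encode("ascii") == ascii_cmd.encode("utf-8")

# Non-ASCII text is a different story. "Müller" stored under Windows
# codepage 1252 becomes b'M\xfcller'...
cp1252_bytes = "Müller".encode("cp1252")

# ...which is not even valid UTF-8, so a strict UTF-8 reader errors out:
try:
    cp1252_bytes.decode("utf-8")
except UnicodeDecodeError:
    print("cp1252 bytes are not valid UTF-8")

# The reverse mismatch is worse - it silently produces mojibake:
utf8_bytes = "Müller".encode("utf-8")   # b'M\xc3\xbcller'
print(utf8_bytes.decode("cp1252"))      # prints "MÃ¼ller"
```

This is exactly the inconsistency described above: an un-flagged UTF-8 string read by a codepage-1252 system decodes without error, just to the wrong characters.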
  4. 3 points
    I am very excited about the potential for a platform that encourages open-source collaboration on LabVIEW code. My main experience of non-LabVIEW package managers is with NPM for Node.js. NPM is an organisation which provides two things: a tool which is the mechanism for managing what packages are used in a project, and a registry that allows anyone to publish their packages. I believe that NPM supports private packages for enterprise customers, but open-source packages are generally hosted on GitHub, and when a package is uploaded to the NPM registry it simply pulls in the README to provide the package documentation. The GitHub link is also provided on the NPM page so that users can easily see where the library comes from, and if they want to open issues or submit fixes they can do that on GitHub. I have not had much of a chance to look at it, but it appears that GPM would/does follow similar mechanics to NPM, and compared to VIPM and NIPM I am certainly most excited about the GPM model. I see GCentral as an organisation that could provide the registry for packages and ideally be the one place for open-source LabVIEW code (including NI-community-page-hosted code), with clear signposts as to where to find the source for issue raising and forking. One issue that many text-based languages don't have is that users with older versions of LabVIEW cannot even open code made with newer versions of LabVIEW, let alone run it - so maybe GCentral could provide some computing power (and licences) to automatically convert VIs to older versions - even if they didn't run, at least a user could open them.
  5. 3 points
  6. 3 points
    I assume you meant this video? There is this older video of Dr. T and Jeff K. introducing a LabVIEW Basics Interactive CD-ROM (~LabVIEW 4), but it's not as exciting as the LabVIEW 5 promo.
  7. 2 points
  8. 2 points
    I have a few things that might fit a user story - but also some more freeform feedback too. I'm excited to see some effort and thought going into this area - although early discussions sound like a lot of technology work, where I think there is a solid base already. To back up Joerg's point as well - I want to use GitHub/GitLab etc. for my open-source projects - leveraging what is already there makes it easier to find resources and help, and allows me to translate my experience between languages. There should really only be LabVIEW-specific elements where that is absolutely necessary, IMHO. Discoverability is kind of interesting - but when all the other package managers already have sites for this, it doesn't feel like very low-hanging fruit. Pushing collaboration feels like a great approach. Getting more people owning code and publishing it in a way that people can collaborate on feels like something that is lacking. There are so many projects that appear as forum posts or packages on the Tools Network that lack a public issue tracker, code repo and other things that can impact collaboration. Better still, this is mostly an education play, requiring less investment, and can leverage loads of great existing resources like opensource.guide. Having a non-gated repo would be of interest though - even if you can just link to a package, to simplify the infrastructure, that would be a great start.
  9. 2 points
    Merry Xmas: Symlink.llb
  10. 2 points
    Turns out there is a way to do it from within the IDE without mucking about with copying files etc. (100% not obvious though). I stumbled on this totally epic write-up/demo by Matthias Baudot. https://www.studiobods.com/en/niweek2019-ts170/
  11. 2 points
    Where can I sign the petition to hand over NI's social media accounts to you?
  12. 1 point
    To all the warriors out there, a heads-up if your computer is suddenly running at 100% CPU. In the Task Manager, the Task Manager process itself will gobble up any CPU available. I swore it was a virus. A 3rd-party tech support knew about this. Turns out a camera did not shut down correctly. To stop Windows from willy-nilly throttling a process in ways that screw up tasks such as frame-grabbing response, this vendor utility is used. Open PowerShell and run the following: "C:\Program Files\Point Grey Research\FlyCap2 Viewer\bin64\PGRIdleStateFix.exe enable"
  13. 1 point
    Hi, I'm having problems building a vipc from a vipb with files containing nested vims. Getting the following error from VIPM: ERROR: 7: VIPM API_vipm_api.lvlib:Parse Build Return Message_vipm_api.vi<ERR> Code:: 7 Source:: 0053C289D635723F5DC0A4F08297566A<ERR> The following source VIs or Libraries are missing. Please correct this problem before rebuilding. b39afad9-8321-4719-86a9-dddab325fc87.vi The following source VIs or Libraries are the callers of missing files BitsSetter.vim I created a zip with the vims and the vipb file. Any suggestions how to fix this? Opening the files shows no errors. Replacing the nested vim with its actual implementation fixes the problem but I don't want to give in just yet. I'm on LV 18.0.1f4 64bit with VIPM 2018.0.0f1 Cheers bits.zip
  14. 1 point
    It was not developed in a bubble. There was already a close relationship between the VIPM team and NI, so they knew the requirements and the need. It's been over 5 years since the release of NIPM, and not much movement, however... That might have been the first step, but hardly the long-term plan. In reality, I think the main reason for the lack of development on NIPM is that people got shuffled around, new management came in, and the original development plans for NIPM got put on a shelf. I can tell you that NI had the goal of full VIPM replacement. There is much churn internally at NI on where to take NIPM next. They are back to wanting to add features to facilitate better reuse-library installation for current LabVIEW (how to achieve this is not clear). For sure, however, this is clearly the case with NXG. I suggest you watch this video:
  15. 1 point
    NI caused this problem themselves. NIPM should have provided all the features of VIPM from the start and then added GPM features. Lack of investment. Now they're playing catchup, but technology is moving on.
  16. 1 point
    That's not as easy, even if you leave aside platforms other than Windows. In the old days, Windows did not come with support preinstalled for all possible codepages, and I'm not sure it does even nowadays. Without the necessary translation tables it doesn't help to know what codepage text is stored in; translation into something else is still not guaranteed to work. Also, the codepage support as implemented in Windows does not allow you to display text in a different codepage than what is currently active, and even if you could switch the current codepage on the fly, all text previously printed on screen in another codepage would suddenly look pretty crazy. While Microsoft initially had Unicode support only on the Windows NT platform (which wasn't initially supported by LabVIEW at all), they only added a Unicode shim to the Windows 9x versions (which were 32-bit like Windows NT, but with a somewhat Windows 3.1-compatible 16/32-bit kernel) around 2000, via a special library called Unicows (probably for "Unicode for Windows Subsystem") that you could install. Before that, Unicode was not even available on Windows 95, 98 and ME, which were the majority of platforms LabVIEW was used on once 3.1 was kind of dying. LabVIEW on Windows NT was hardly used, despite LabVIEW being technically the same binary as for the Windows 9x versions. But the hardware drivers needed were completely different, and most manufacturers other than NI were very slow to start supporting their hardware on Windows NT. Windows 2000 was the first NT version that saw a little LabVIEW use, and Windows XP was the version where most users definitely abandoned Windows 9x and ME for measurement and industrial applications. Your suggestion would only have worked if LabVIEW for Windows had internally used the UTF-16 API everywhere, which is the only Windows API that allows displaying any text on screen independent of codepage support, and this was exactly one of the difficult parts to get changed in LabVIEW.
LabVIEW is not a simple notepad editor where you can switch on the compile define UNICODE and suddenly everything is using the Unicode APIs. There are deeply ingrained assumptions that entered the code base in the initial porting effort, which used 32-bit DOS-extended Watcom C to target the 16-bit Windows 3.1 system that only had codepage support and no Unicode API whatsoever; neither did the parallel Unix port for SunOS, which was technically Unix SVR4 but with many special Sun modifications, adaptations and quirks built in. That port still eventually allowed a Linux version of LabVIEW to be released without having to write an entirely new platform layer, but even Linux didn't have working Unicode support initially. It took many years before that was sort of standard in Linux distributions, and many more years before it was stable enough that Linux distributions started to use UTF-8 as the standard encoding rather than the C runtime locales so nicely abbreviated as en_EN and similar, which had no direct mapping to codepages at all. But Unix, while not having any substantial Unicode support for a long time, eventually went a completely different path to support Unicode than what Microsoft had done. And the Mac port only gained useful Unicode support after Apple eventually switched to their BSD-based Mac OS X. And neither of them really knew anything about codepages at all, so a VI written on Windows and stored with the actual codepage inside would have been just as unintelligible for those non-Windows LabVIEW versions as it is now. Also, in true Unix (Linux) fashion, they couldn't of course agree on one implementation for a conversion API between different encodings; instead there were multiple competing ones, such as ICU and several others.
Eventually libc also implemented some limited conversion facility, although it does not allow you to convert between arbitrary encodings, only between widechar (usually 32-bit Unicode) and the currently active C locale. Sure, you can change the current C locale in your code, but that is process-global, so it also affects how libc treats text in other parts of your program, which can be a pretty bad thing in multithreading environments. Basically, your proposed codepage storing wouldn't work at all for non-Windows platforms, and even under Windows it only has, and certainly had in the past, very limited merit. Your reasoning is just as limited as NI's original choice was when they had to come up with a way to implement LabVIEW with what was available then. Nowadays the choice is obvious, and UTF-8 is THE standard for transferring text across platforms and over the whole world, but UTF-8 only became a viable and used feature (and because it was used, also a tested, tried and many-times-patched one that works as the standard intended) in the last 10 to 15 years. At that time NI was starting to work on a rewrite of LabVIEW, which eventually turned into LabVIEW NXG.
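The core ambiguity described above - the same bytes meaning different characters depending on which ANSI codepage the reading system happens to use - can be shown in a few lines. This is a hedged sketch in plain Python, not anything LabVIEW-specific:

```python
# The same raw bytes decode to different text under different codepages,
# which is why text stored without a reliable encoding tag cannot be
# interpreted correctly across differently configured systems.

greek = "αβγ"                      # text entered on a Greek Windows system
raw = greek.encode("cp1253")       # stored as raw ANSI bytes: b'\xe1\xe2\xe3'

# A Western system (codepage 1252) decodes the very same bytes as accented
# Latin letters - silent mojibake, with no error raised anywhere:
print(raw.decode("cp1252"))        # prints "áâã"

# UTF-8 removes the ambiguity: the byte sequence alone identifies the
# characters on every platform, which is why it won as the interchange
# encoding once it became universally supported.
assert greek.encode("utf-8").decode("utf-8") == greek
```

Note that the mismatched decode succeeds without any error, so even a codepage number stored alongside the text only helps if the reading system actually has that codepage's translation tables installed, which is the point made above.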
  17. 1 point
    This discussion on package manager capabilities definitely needs to happen, but just to summarize Chris' post: Therefore, we will use what we have and improve as we go.
  18. 1 point
    There has been a lot of discussion, which is great, but I feel the need to reiterate GCentral's vision and mission. GCentral envisions a LabVIEW community making the best version of itself by improving its capability through collaboration. GCentral is a non-profit organization:
    - for programmers who need to find, share or collaborate on G reusable code or software engineering tools
    - that provides a platform for G code packages and collaboration resources
    - that is independent and driven by community experts
    GCentral's mission is to enable LabVIEW programmers to collaborate by:
    - removing barriers to finding / using code designed for reuse (packaged code)
    - removing barriers to contributing code designed for reuse (packaged code)
    - removing barriers to co-developing code
    - using code with confidence
    GCentral is package-technology agnostic and SCC agnostic. GCentral does not endorse or encourage the use of one package manager over another, nor will we; each community member can package their code according to their preference. GCentral does not endorse or encourage the use of one source code control provider (local or cloud-based) over another, nor will we; each community member can use the SCC they prefer. GCentral will ease the pain we all feel when attempting to find and use packages:
    - by indexing the currently available public repositories (Tools Network, GPM, JKI Tools, NI Packages)
    - by indexing a new, un-gated, cloud-based storage location that can house any package type
    - by displaying the index results in a web page / APIs, etc. (see https://www.gcentral.org/ for the prototype)
    GCentral will ease the pain we all feel when attempting to contribute packages:
    - by creating a new, un-gated, cloud-based storage location that can house any package type (not source)
    - MAYBE by creating software to transport built packages from the build machine to the new cloud storage location
    GCentral will ease the pain we feel when attempting to co-develop code by creating template projects for each of the major online SCCs (GitHub, etc.), coming pre-configured to build the package type of your choice and upload it to the GCentral package server. GCentral will inspire confidence by:
    - making any submitted package always available: once submitted, a package cannot be deleted except by a GCentral administrator, so you can depend on a package without fear of it ever going missing
    - providing product pages per package, designed to educate on the package and author
    The above is a summary of the CLA Summit presentation I gave (https://sites.google.com/gcentral.org/website/about-gcentral). The advent of the GitHub Package Registry is very interesting. I've reached out to GitHub for clarity on how extensible their framework is. At time 29:44 in the presentation Michael linked above, the presenter says "We have a great extension framework for adding support for new registries, which will be opening up in the future". That MAY mean we can provide plugins for their registry to recognize NIPKGs, VIPs, GPKGs. And that may completely solve the "find/use" pain point I mention above... so long as the community is OK putting their packages in GitHub AND sacrificing confidence that the package will always be available to use or link against. In conclusion, GCentral's aim is to impose the least amount of infrastructure on a community member while enabling us to find/use, contribute, and co-develop packages designed for reuse. GCentral will use already existing technologies to accomplish its goal and create new technologies where needed.
  19. 1 point
    @joerghampel, GPM does allow local repos and GPM is open source. Link to MGI’s GPM Wiki: https://gitlab.com/mgi/gpm/gpm/wikis/How-Tos/Filesystem-Based-Registries Link to the GPM project: https://gitlab.com/mgi/gpm Even I have contributed to GPM along with some of the Composed Systems guys. @JamesMc86 I’m working on a palette editor now which can be used stand-alone but I’ll work with MGI to add it to GPM.
  20. 1 point
    My understanding is that, as yet, it doesn't really support code add-ons (at least not well). For example, I think it still lacks any support for palette editing.
  21. 1 point
    I feel this is somehow related. GitHub supports packages as described here: https://help.github.com/en/github/managing-packages-with-github-packages Do you think this is something that we can utilize for LabVIEW package distribution?
  22. 1 point
    Lately, I've been using a phrase "oh, that's a left-handed scissors feature." I've found it to be a useful concept, so I'm posting it here to the LV community generally. When software engineers create software, we aim to create software that is usable by our customers. To use the colloquialism: the software should delight the user. We sometimes miss. But just because the software isn't designed as the user expects does not mean that it wasn't designed somehow. Often, I find myself fighting my tools (Visual Studio, Perforce, web browser, operating system, etc.) because it just isn't doing what I expect, and I'm constantly working around some issue. I've realized that sometimes, if I stop and think about how the software is designed, I can use the system the way it was intended, not the way I want to use it. I'm still not happy, but I'm working a lot less and hurting less. I've begun referring to such designs as "left-handed scissors". Sure, 90% of humans are right-handed, and, since there's only one pair of scissors, the scissors should have been designed to be right-handed. Or the system should let you configure left-or-right. Something should be different! But it isn't different. So as a user, I have a choice -- to hold the scissors in my right hand and try cutting or to hold the scissors in my left-hand and try cutting. I can fight the system and stress my hands and make the software work, damnit, using my right-hand, or I can use the left-hand. Using my left hand, I'm still having to work harder than I'd like, but at least the edges aren't digging into my fingers. There are features in LabVIEW that are left-handed scissors. We -- LabVIEW R&D -- should have done something different. You know that; we know that (now). But the decision was made, and there hasn't been developer bandwidth to fix it. And it sucks. But it can suck a lot or it can suck a little, depending upon whether or not you acknowledge that it was designed for the wrong hand. 
Packed project libraries are left-handed scissors. XControls are left-handed scissors. There are others, but those are the two big ones that I most commonly have to help customers with. Both are great features if you use them as intended. But no one wants to use them as intended... we all want to use them in ways that they just don't want to be used. They can do the job that we want them to do, just not in the way we want them to do it. In the case of XControls, you have less pain if you just accept the frustrating event model that they're predicated on. In the case of packed project libraries, you have less pain if you just accept the limitations of their dependency paths and design build specs accordingly. I'm not going into either of those statements in this post... I and others have talked about both in various other forums. My point here is to coin a phrase that I've found useful. And if you hear me talking about left-handed scissors, now you know my intended meaning. 🙂
  23. 1 point
    You can add text labels to a dial. It does not turn it into an enum but sort of works like you would expect (change the data type to U8).
  24. 1 point
    Note: if you have a Tools Network package, you can request assigned ranges of error codes for just your library. For example, "SQLite Library" errors start at 402860
  25. 1 point
  26. 1 point
  27. 1 point
    Security in the age of cloud computing and IoT is a huge challenge. I do not think we should start on that discussion here. But you guys seem to assume that this means we can resort to the old ways of doing business - that NI and others should not leverage these things fully, but try to guide their users to the old ways by making sure the new ones are intentionally crippled. The fact is that most of us are way down the rabbit hole already, ignoring the risks because the benefits are too enticing or the business or societal pressure too high. If people can make a business of delivering services that are at the same level of risk as the customer is already taking in other areas (in fact, in the particular case that triggered my interest in this, the security would be improved compared to the current solution - imagine that), but NI is holding them back because they think the security challenges have to be 100% solved first... well... that is a recipe for a dwindling business. The starting point of my digression was something that the supplier is in fact already partly working on; they just have not gotten around to it yet. So it is not as if you are defending something that they themselves think is the holy grail of security limitations either. Arguing that the current solution is as good as it gets is never really a winning strategy.
  28. 1 point
    Old-fashioned it may be, but you seem to have a surprisingly high trust in Microsoft, Google, Amazon and Co. to not only be able to secure their infrastructure beyond any doubt, but also to never turn evil from inside and scan your files no matter what. Oh wait, Google does that already: the user rights document you have to agree to in order to use their cloud services specifically states that they are allowed to scan your documents for service improvement purposes. In terms of Google, that also means improving their ad business revenues; it's after all still the core business with which they grew to the multi-multi-billion-$$ business they are nowadays. Sure, they have other business diversifications that start to get important too, but the "Don't be evil" slogan from early-days Google has long been abolished. 😄 Microsoft, Amazon and all the rest are businesses too, and when they believe they can get away with something to improve the revenue numbers under the final accounting line, they will do it, as is their holy duty in the name of our modern-day golden calf called shareholder value! 😄 But the real danger is not only evil intent, it is simply neglect! There is not a single week in recent years in which some server out there is not discovered to be exposing very sensitive data to the broad, lovely and always friendly internet. Cloud services being the core business of those business units makes these companies of course try hard to avoid such issues, but errors often lie in very small corners and are so easily made. The SystemLink cloud service being in fact outsourced to a real cloud provider (NI doesn't want to go into running their own cloud service business anytime soon) doesn't make that less likely. It's simply one more intermediary between your client, you, NI and the cloud service that can add a critical error source.
  29. 1 point
    No. Those were built exactly to customer spec and do exactly what they promise to do. The fact that they’re a horrible idea doesn’t make them left-hand scissors. There isn’t a better design for shared variables that would make them safer or more practical. Shared variables are more like the 1950s kids’ science kits that shipped with pieces of actual uranium to play with.
  30. 1 point
    GPM extracts code into your project, BTW. VIPM, NIPM and GPM provide built products of reusable code, among other things. I believe what you are saying is that you consider existing, cloud-based source code control tools such as Git a viable option for reusable code distribution. This shouldn't be limited to packaged stuff.
  31. 1 point
    The first discussion with Fab, Dani and Jerzy Kocerka was about where to keep the code. We quickly came to the conclusion that it would be great if GCentral did not host its own repositories on its own servers, but rather was able to tap into existing services like GitHub, GitLab, Bitbucket and so on. That might also help with acceptance. Personally, I would like to keep our code in our own repos at gitlab.com. We have our READMEs, our documentation platform, etc. But if there was an easy way to plug into the GCentral website of available code, I'd love to register our offerings (whatever that might be worth!) and see them featured there. And also the other way around: I'd like it if I could find not only properly packaged code from the three main repositories (VIPM, NIPM, GPM) on GCentral, but also other offerings in other formats. We like to keep as many dependencies as possible inside our project directories, so we work a lot with packages that are not installed via a package manager but either extracted/copied into the project directory or sometimes linked as git submodules.
  32. 1 point
    Thousands of releases? I kind of doubt it. Leaving aside LabVIEW prior to the multiplatform version (2.2.x and earlier, which were Macintosh only), there have been 2.5, 3.0, 3.1, 4.0, 5.0, 5.1, 6.0, 7.0, 7.1, 8.0, 8.2, 8.5, 8.6, 2009, 2010, 2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, and 2019 releases so far. Of each of them there was usually one and rarely two maintenance releases, and of each maintenance release between 1 and 8 bug-fix releases. This probably amounts to only about 100 releases in total, and maybe another 50 for beta releases of these versions (a beta usually has 2 to 4 intermediate releases, although that tends to be no more than 2 in the last 10 years or so). I'm not aware of any LabVIEW release that had the debug symbols exposed. PDBs were used even in Microsoft Visual C 6.0, the first version that was probably used by NI to create a released LabVIEW version (NI only switched to using Microsoft C for the standard 32-bit builds for Windows NT; the Windows 3.1 versions of LabVIEW were created using Watcom C 10.x, which was the only compiler able to create full 32-bit executables that could run on 16-bit Windows 3.1 through the built-in DOS extender runtime). Microsoft makes this pretty hard to happen by accident anyway, as such DLL/EXE files would normally link to the debug version of the Microsoft C runtime library, and you can't legally install that on a computer without installing the entire Microsoft C compiler contained in the Visual Studio software. There is definitely no downloadable installer for the debug version of the C runtime engine. The only early leak I'm aware of was that the original LabVIEW 2.5 prerelease contained a huge extcode.h file in the cintools directory that exposed much more than the officially documented LabVIEW manager functions.
About half of it was still pretty much unusable, as you needed other functions that were not exposed in there to make use of some of the features, and a lot of those functions were removed from the exported functions around LabVIEW 4.0 and 5.0 as they were considered obsolete or undesirable pollution of the exported symbols list, but it did contain a few interesting functions that are still exported from the LabVIEW kernel yet not declared in the current extcode.h file. They fixed that extcode.h bug before the official release of LabVIEW 3.0, which was the first non-beta version of LabVIEW running on computers other than the Macintosh. (2.5 was basically a beta release, called a prerelease version in order to have something running on Windows 3.1 to show for NI Week 1992, and there was a 2.5.1 and I believe a 2.5.2 bug-fix release of it in 1993.) Also, lvrt.dll is a development that only got introduced around LabVIEW 6.0. As this was released in 2000, it most likely used at least Microsoft Visual Studio C++ 6.0. Before that, the application builder concatenated the generated runtime LLB onto a stub executable that contained the entire LabVIEW runtime engine. That was a pretty neat feature, as it created a single-file executable, but as LabVIEW was extended and more and more functionality was implemented in external files and DLLs that get linked dynamically, this became pretty much unmaintainable in the long run, and the entire runtime engine was externalized.
  33. 1 point
    You found the solution! Actually, the facade VI runs transparent by default. If I unselect 'Window runs transparently' in the Custom Window Appearance dialog, it works as expected. No workarounds using tab controls/background images necessary.
  34. 1 point
    I do not think that XControls draw their pane; that would become quite the pain when placing them on different front panels with different colors. What I would do is place a single-page tab control on the facade (hide the tabs) and put your other controls on it. Control the background color using the FGColor property of the tab control.
  35. 1 point
    As a lefty, don't even get me started on can openers.
  36. 1 point
    I’m pretty sure that’s the easiest to implement, so I’m liking your opinion. (Trying to be non-biased, I do think that’s the better option from usability. If I pick a name in a template, it’s because I expect the provided thing to use that name. It is consistent with the class method behavior, but for cluster fields.)
  37. 1 point
    I'm favouring just freezing the name, as that is simplest for the user to understand, given that it is difficult to diagnose, let alone fix, any problem if the name adaptation goes wrong.
  38. 1 point
  39. 1 point
    The Beta of LabVIEW Community Edition is now available at http://ni.com/beta
  40. 1 point
    It seems like OpenG needs a landing page. Perhaps we could use GitHub Pages for the GitHub repos, as proposed here. This could be the new URL to include in licensing.
  41. 1 point
    Thanks. Here's a more developed subVI that I've used successfully inside a Pre-build action. I use it to set a Build CC symbol to be equal to the Build Name, so different builds can enable different code. Set CC Symbol.vi
  42. 1 point
    Sorry, my examples posted earlier were in 8.5.1. Here they are back saved to 8.0. Bruce Create Shortcut.vi Set Shortcut Properties.vi
  43. 1 point
    I use these VIs. They're from a bunch of Windows utility VIs that I have. Bruce Create Shortcut.vi Set Shortcut Properties.vi
