Everything posted by ShaunR

  1. Not that I'm aware. Sort of. See next. The classic solution is to load the dynamic VIs/classes by a relative path, so you have a directory you just drop them into (there's a rough sketch of this plugin-directory idea after this list). This is slightly different from what you are describing, as they are not bundled or built with the exe. Instead, they are usually supplied and managed as a separate installation; they won't break the exe if absent (that feature/function just won't be available) and make it easy to add or remove functionality. The issue with this approach is that the "modules" have to have their diagrams attached, otherwise they cannot be loaded into different versions of LabVIEW. Stripping the diagrams creates version and maintenance problems, and password protection - which would be the better solution - was compromised in some LabVIEW versions, so it is unsavoury for commercial products. Distribution of the total solution is also problematic from an NI-only standpoint. It is easily achieved via a VIPM package, but NI don't own that: Alice would distribute a VIPM package with a dependency on the modules, which would get pulled in and installed to the right place alongside the exe. Using NI's builders is much less elegant, requiring a source distribution for the modules plus an exe build for the main VI, which must be modified to install in the right locations on another machine or LabVIEW version (because of absolute paths) and requires work from Bob to realise. Yes.
  2. Hear, hear. The palettes are hopelessly restrictive and haven't really changed since 2.x. I would like to see dockable palettes that I can attach to the screen edges (and to each other), and to be able to create palettes I can drag and drop primitives onto as quick favourites for a project. I've had a play with the "Smart palette" and it's good, but not exactly what I would like. I still hate the probe window so much that I'd rather create temporary indicators on FPs than use it (especially for strings, which you cannot format). And while I'm ranting: are they ever going to fix the greying-out during execution highlighting, which seems to forget and get confused about where and what it should grey out?
  3. Agreed. You are much better off testing for an invalid refnum instead. Then it will be created on the first call and, if it ever becomes invalid, recreated (see the self-healing-refnum sketch after this list).
  4. I've just been bitten by this and had to convert community-scoped functions to public.
  5. I do all my development in 2009 still (although I have all the versions).
  6. Well, to all intents and purposes there is no difference between that and using a non-reentrant global - just more complicated. They use the DVR technique in POOP because they don't have an easy way of creating singletons. An LV2 global is inherently a singleton, as all VIs are singletons by default (there's a sketch of the LV2-global-as-singleton idea after this list).
  7. I would turn your acquisition on its head. Stream directly to the TDMS DB and just query it when you want to do analysis. You would then only have the data in memory that you actually need, and no copies (see the stream-and-query sketch after this list).
  8. Why not just use a DVR, so the "global clone" just accesses the DVR?
  9. Right, so that's a "no" then. In an exe, paths are different from development because the executable filename is included in them. Additionally, you placed the DLL in the data directory, but there is no appending of the data directory in your path code. To test, replace the path code in the image with an "Application Directory.vi", append the DLL name, and make sure the DLL is in the same directory as the exe. This ensures that when you build an exe, the DLL is picked up from the application directory regardless of how paths to VIs and libs mutate (the exe-relative path sketch after this list shows the same idea). NB: "Application Directory.vi" gives the path to the directory in which your project file resides when in the development environment.
  10. Is that where the VI that dynamically loads the DLL expects it to be?
  11. It looks like you are dynamically loading the DLL with CL Test.vi (difficult to tell exactly because it's in evaluation, therefore the diagrams are locked and I cannot try a build), so LabVIEW doesn't know about the DLL. You will have to add the OpenCLV_xXX.dll itself, as jack states (not the lvlib, which you may have thought jack meant by "library"), and make sure it goes into the directory passed to CL Test.vi.
  12. I had the same sort of problem last week. I made an lvlib that contained only two classes (no inheritance or anything complicated). I'm terrible at moving stuff around on disk outside of a project to get it "organised"; I then close the project, re-open it, let LabVIEW find (or ask me where) things are, and let it re-link. LabVIEW is usually pretty good at this, and without classes it has never been a problem. In this instance it just wouldn't forget an old class name I had used previously, and a couple of VIs were apparently still linked to it. It only showed up when trying to build a package, where it would show the file-finding dialogue, report an error that xxx.lvclass was not found, and the build would fail. There was no indication in the project that anything was amiss (no broken VIs and nothing in the dependencies). Eventually I searched all the files with a contents search to find what was linking to what (no other way of knowing), and once I found which ones, I disconnected them from the library and re-added/re-saved them. That fixed the problem, but it took some hours to find, and my bald spot now covers the rest of my head with all the scratching. I have vague recollections of a thread on here about similar behaviour being reported, then fixed, then coming back again.
  13. You must be using LV2011 or below. I came to the conclusion (rightly or wrongly) that dylib support was an incremental development on the Mac, with the dialogue one step behind. LV2010 and below cannot use dylibs at all (the library won't load). In LabVIEW 2011 you cannot choose them from the drop-down in the CLFN (it must be Framework), but they do work, so if you create the call in 2012 and back-save, or type the path and function name in, it will be fine (the library will load). 2012 and later are fine and you can select them in the drop-down.
  14. Wouldn't have kept me out
  15. Most of these I think could be alleviated just by turning on the alignment grid and "snap to".
  16. Too many hours on Linux. Double-click will highlight the entire wire and show all kinks, even if behind a primitive. Wouldn't be any worse than using classes. They've already put in optimisations to increase compile times and code base by 500% for them.
  17. Try running the Static Code Analyzer. Any code is crap according to that but it is good for finding wire bends or wires backed up/bent under controls and icons.
  18. No jealousy, no love.
  19. I wouldn't. I'd just take the rate of change (dy/dx) of the Vrms and, if it goes outside a threshold value, count the event (assuming the signal is a bit noisy and not exactly zero when unchanging). There's a sketch covering this and the next item after the list.
  20. Use the AC & DC Estimator VI to measure Vrms and just note when Vrms changes.
  21. Came across a nasty little issue today. It seems that the Mouse Events are not backwards compatible between 2009 and subsequent versions. Following is a link to a zip file that contains two VIs: one saved in LV2009, the other in 2011. If you open the LV2009 version in any later version, you get a broken wire. The 2011 version, however, is OK when opened in 2012/13. If you back-save (Save For Previous Version) from a later version to 2009, you also get a broken registration reference wire when loading it back up in 2009. http://www.labview-tools.com/download_file/119/309/
  22. That's disappointing. I've nearly finished my MDI API
  23. Well, my list doesn't just include stability; it also includes performance (both exe and IDE). There have been only two that have been exemplary (too long ago to remember exactly before 7.0, but I started at around 2.x and have no bad impressions). Only LabVIEW before 8.x was bullet-proof IMHO, and even as far back as 2.x it was rare to get an Insane Object (the equivalent of a GPF in the more modern versions), which I experienced 2 or 3 times a year. So. Outstanding editions: 7.x and 2009. There have been some real dogs too: 8.x (yes, all of them!) and 2010. That leaves the high-project-risk releases that are maybe worth it for the features, or if specific fixes address business-critical problems you have experienced (they didn't for me): 2011 (the "Performance and stability release" was a joke, and I said so at the time) and 2012. So on to 2013. My experience so far is that it is excellent and, if it didn't have the JSON issue, or if it had a few more new events, I would have upgraded from 2009 when SP1 comes out. The issue for me is that the latest DAQ installer states it will bork the 2009 DAQ functions, and I'm not prepared to do that. So whilst I might have upgraded to LV2013, I am now, and will forever be, stuck at the current version of DAQ until I abandon 2009.
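
For item 1, the plugin-directory idea translates to most languages: scan a folder that sits relative to the application, load whatever is there, and carry on if something is missing. A minimal Python sketch as an analogy only (the `plugins` directory name is my invention; in LabVIEW the equivalent is Open VI Reference with a relative path):

```python
import importlib.util
import sys
from pathlib import Path

def load_plugins(plugin_dir: Path) -> dict:
    """Load every module found in plugin_dir by relative path.

    Missing or broken plugins are skipped, so the main application
    still runs - only that feature becomes unavailable.
    """
    plugins = {}
    if not plugin_dir.is_dir():          # no plugin folder -> no plugins, no crash
        return plugins
    for py_file in sorted(plugin_dir.glob("*.py")):
        spec = importlib.util.spec_from_file_location(py_file.stem, py_file)
        module = importlib.util.module_from_spec(spec)
        try:
            spec.loader.exec_module(module)
        except Exception as exc:         # a bad plugin must not break the exe
            print(f"skipping {py_file.name}: {exc}")
            continue
        plugins[py_file.stem] = module
    return plugins

if __name__ == "__main__":
    # Plugins live in a directory relative to the application, so adding or
    # removing a file adds or removes a feature without rebuilding anything.
    app_dir = Path(sys.argv[0]).resolve().parent
    loaded = load_plugins(app_dir / "plugins")
    print("available features:", list(loaded))
```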
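Item 3 (create the refnum on the first call and recreate it whenever it turns invalid) is the usual lazy-initialisation / self-healing-handle pattern. A rough Python sketch, with a hypothetical `Session` class standing in for any refnum-like resource:

```python
class Session:
    """Stand-in for a handle/refnum-like resource (queue, file, TCP connection...)."""
    def __init__(self):
        self.valid = True
    def close(self):
        self.valid = False

_session = None   # the equivalent of an uninitialised refnum

def get_session() -> Session:
    """Return a usable session, creating it on the first call and
    recreating it whenever the cached one has become invalid."""
    global _session
    if _session is None or not _session.valid:
        _session = Session()
    return _session

# First call creates it, later calls reuse it, and if anything invalidates
# the handle the next call silently rebuilds it.
s1 = get_session()
s1.close()
s2 = get_session()
assert s2 is not s1 and s2.valid
```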
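For item 6: an LV2-style (functional) global is a non-reentrant VI whose uninitialised shift register holds the data, so it is a singleton by construction. A loose Python analogy (the "set"/"get" action strings are my invention, and the lock stands in for the VI's non-reentrancy):

```python
import threading

def _make_functional_global():
    """One piece of state shared by every caller, like the uninitialised
    shift register in a non-reentrant LV2-style global."""
    store = {"value": None}
    lock = threading.Lock()   # the non-reentrancy of the VI, made explicit

    def functional_global(action, value=None):
        with lock:
            if action == "set":
                store["value"] = value
            return store["value"]

    return functional_global

# Module-level instance: every caller gets the same one, i.e. a singleton.
lv2_global = _make_functional_global()

lv2_global("set", 42)
print(lv2_global("get"))   # 42, from any caller in the process
```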
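Item 7 (stream to TDMS during acquisition, then query only the slice you need for analysis) looks roughly like this with the third-party npTDMS Python package. This is only an analogy to the LabVIEW TDMS primitives, and the file, group, and channel names are made up:

```python
import numpy as np
from nptdms import TdmsWriter, ChannelObject, TdmsFile

PATH = "acquisition.tdms"

# --- Acquisition side: append each block to disk as it arrives --------------
with TdmsWriter(PATH) as writer:
    for _ in range(100):                      # pretend DAQ loop
        block = np.random.randn(1000)         # stand-in for a 1000-sample read
        writer.write_segment([ChannelObject("acq", "voltage", block)])
        # nothing accumulates in application memory - the file is the store

# --- Analysis side: pull back only the samples you actually need ------------
with TdmsFile.open(PATH) as tdms:
    channel = tdms["acq"]["voltage"]
    window = channel[50_000:51_000]           # reads just this slice from disk
    print(len(window), window.mean())
```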
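Item 9 boils down to "always build the library path relative to where the program actually lives instead of hard-coding a development path". The same idea sketched in Python with ctypes (`mylib.dll` is a placeholder name):

```python
import ctypes
import sys
from pathlib import Path

# Directory containing the running program - the analogue of LabVIEW's
# "Application Directory.vi" (project directory in the IDE, exe directory
# after a build).
app_dir = Path(sys.argv[0]).resolve().parent

dll_path = app_dir / "mylib.dll"     # placeholder DLL shipped next to the exe

if not dll_path.exists():
    raise FileNotFoundError(f"expected the library beside the program: {dll_path}")

lib = ctypes.CDLL(str(dll_path))     # loads the copy deployed with the app
```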
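Items 19 and 20 together: estimate Vrms per block, take its rate of change, and count an event whenever that rate leaves a dead band so noise around a steady level is ignored. A NumPy sketch (the block size and threshold are arbitrary, and `rms_per_block` is only a crude stand-in for what the AC & DC Estimator VI provides):

```python
import numpy as np

def rms_per_block(signal: np.ndarray, block: int) -> np.ndarray:
    """Vrms of consecutive blocks of the signal."""
    n = len(signal) // block
    blocks = signal[: n * block].reshape(n, block)
    return np.sqrt(np.mean(blocks ** 2, axis=1))

def count_level_changes(vrms: np.ndarray, threshold: float) -> int:
    """Count events where dy/dx of the Vrms trace exceeds a dead band."""
    dvdx = np.diff(vrms)                      # rate of change between blocks
    return int(np.count_nonzero(np.abs(dvdx) > threshold))

# Synthetic test: a 1 Vrms tone at 50 Hz that steps to 2 Vrms halfway through.
t = np.arange(200_000) / 50_000.0
amplitude = np.where(t < 2.0, 1.0, 2.0) * np.sqrt(2)
signal = amplitude * np.sin(2 * np.pi * 50 * t) + 0.01 * np.random.randn(t.size)

vrms = rms_per_block(signal, block=5_000)
print(count_level_changes(vrms, threshold=0.2))   # expect 1 step event
```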