
ak_nz

Members
  • Content Count

    77
  • Joined

  • Last visited

  • Days Won

    7

ak_nz last won the day on September 22 2016

ak_nz had the most liked content!

Community Reputation

17

About ak_nz

  • Rank
    Very Active

Profile Information

  • Gender
    Not Telling
  • Location
    Good Ol' New Zealand

LabVIEW Information

  • Version
    LabVIEW 2013
  • Since
    2010

  1. ak_nz

    .NET type conversions in LabVIEW

    Just a note that the C# code shown by the OP isn't magically interpreted differently by the C# compiler, and will throw an exception at run time, just as rolfk says and for the reason he says.
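A rough text-language analogue of that point, sketched in Python (the thread's actual C# can't be reproduced here, and the Instrument/Scope/Dmm names are invented for illustration): a downcast-style conversion is only checked when the code actually executes, so the compiler happily accepts code that fails at run time.

```python
class Instrument:
    pass

class Scope(Instrument):
    pass

class Dmm(Instrument):
    pass

def as_scope(obj):
    # Downcast-style conversion: like a C# cast, the failure only
    # surfaces when this line executes, not when the code is compiled.
    if not isinstance(obj, Scope):
        raise TypeError(f"cannot treat {type(obj).__name__} as a Scope")
    return obj

as_scope(Scope())          # fine
try:
    as_scope(Dmm())        # accepted by the "compiler", fails at run time
except TypeError as e:
    print(e)               # prints: cannot treat Dmm as a Scope
```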
  2. Out of interest, it has to do with the version of the run-time CLR rather than the .NET Framework (which bundles the runtime as well as the base class libraries, etc.):
     - LV2012 and earlier use CLR 2.0 (which exists in .NET Framework 2.0 - 3.5).
     - LV2013 and later use CLR 4.0 (which exists in .NET Framework 4.0 through 4.6.2 inclusive).

     You can easily download the .NET Framework 3.5 offline installer and add it to your installer as an action. You can run the offline installer in "silent mode" so that it installs, or is bypassed if an installation already exists.
  3. My preference would be to have a "Set to Default" method of the class that initialises the object with reasonable defaults, which you call on the startup of your application. Then your settings UI can call methods on the object to tweak the settings as the user desires. As a general rule I dislike the "Default Value" properties of controls, because they can be very hard to control and enforce over the development lifetime of an application.
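A minimal sketch of that pattern in Python (the AppSettings class and its fields are hypothetical): defaults live in an explicit method called once at startup, and the settings UI goes through ordinary methods rather than relying on control "Default Value" properties.

```python
class AppSettings:
    """Hypothetical settings class; the field names are invented."""

    def __init__(self):
        self.timeout_ms = 0
        self.log_path = ""

    def set_to_default(self):
        # Called once at application startup; the defaults live in code,
        # not in front-panel "Default Value" properties.
        self.timeout_ms = 5000
        self.log_path = "app.log"
        return self

    def set_timeout(self, ms):
        # The settings UI tweaks values through methods like this one.
        self.timeout_ms = ms
        return self

settings = AppSettings().set_to_default()   # startup
settings.set_timeout(10000)                 # user tweak via the UI
```

Because the defaults are a method rather than a property of the controls, they can be reviewed, diffed, and enforced like any other code.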
  4. Long and short, your answer is what you suspect - there are several VI Server methods that are not implemented in the run-time engine and are thus not available if you build an executable. The same issue crops up in other areas, like build automation. The way I get around this is to use the IDE but automate the process via scripting. I have a build VI (which happens to be a VI Package) that runs automatically via LabVIEW command line arguments, performs the necessary operations, and then quits LabVIEW. Not ideal, I know, but the only realistic option I have found.
  5. ak_nz

    Using LV with SVN

    Doesn't the toolkit use SharpSVN internally? You could probably manually overwrite this assembly with the latest version online, and chances are it would work.
  6. ak_nz

    Using LV with SVN

    TSVN Toolkit works fine in LV2015. The only issue I have found is that exposing the SVN icon states on the Project Explorer items is a bit of a network hog - as a project grows increasingly large (100s of items in the project), project operations tend to suffer. But other than that I have found it a useful tool with SVN repos, especially when it comes to keeping files on disk and items in projects in sync (e.g. re-naming).
  7. There is normally an idea of an "interface" or "trait" that allows a class to tell callers that it implements certain behaviour. In standard, out-of-the-box LabVIEW these ideas are only possible via composition rather than inheritance. The GDS toolkit will allow you to create interfaces of a sort, but natively LabVIEW only supports single inheritance and has no notion of abstract classes or interfaces that exist only to enforce a contract for behaviour.

     Don't forget that OOP hierarchies are not about "things"; they are about common behaviour in the context of your callers. If you find yourself over-riding methods to make them no-ops, this generally indicates that the child class doesn't really respect the contract and invariants of the base class, and there is a potential issue to resolve. This can be difficult to fully achieve in LabVIEW, so you often have to make compromises and document use cases for the next developer who follows you. Generally the best rule of thumb is to keep your inheritance hierarchies as small as you can, to avoid changes in base classes rippling through your hierarchy. Composition can help reduce dependency coupling but, again, this can be hard to achieve easily in LabVIEW.

     In your example of a power supply with an additional method - this is functionality that only pertains to a certain unit and only makes sense for that unit. In other OOP languages the natural thing would be to move the behaviour into another hierarchy and inject it in, but in LabVIEW this is labor-intensive. My gut feel in this case would be to move the method down to the base HAL and implement it in each child class - with the exception of the class that actually understands the request, all others can throw a run-time error ("Request not appropriate for the LAVA Power Supply type"). It's not ideal, since you can't statically verify your application code, but it is a reasonable compromise. It does also force you to deploy your entire HAL again, but that's another story.
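The compromise described above - a base-class method that throws a run-time error in every child except the one that actually understands the request - can be sketched in Python (the class and method names are invented for illustration):

```python
class PowerSupplyHal:
    """Hypothetical HAL base class; the method lives here so callers can
    invoke it on any supply, at the cost of a possible run-time error."""

    def enable_remote_sense(self):
        raise RuntimeError("Request not appropriate for this power supply type")

class LavaSupply(PowerSupplyHal):
    def enable_remote_sense(self):
        # The one child that actually understands the request.
        print("remote sense enabled")

class BasicSupply(PowerSupplyHal):
    pass  # inherits the run-time error from the base

LavaSupply().enable_remote_sense()      # prints: remote sense enabled
try:
    BasicSupply().enable_remote_sense()
except RuntimeError as e:
    print(e)                            # the error surfaces at run time only
```

As the post says, the trade-off is that nothing verifies statically that a given supply supports the request; the error only appears when the code runs.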
  8. Are the .NET assemblies referenced in your project library in the source of the component prior to building a package? I have several components that use .NET assemblies - I have never copied them to the LabVIEW folder when changing source; they have always been in the source project folder or sub-folder. LabVIEW effectively adds this folder to the Fusion search paths when resolving assembly locations.
  9. In this instance your Cat and Dog hierarchies are actually different functionally - one meows and the other barks. Your Find Animal and the logic after it presume to work on any Animal - but Animal doesn't have a Bark or Meow. This is an LSP violation for the caller that expects an Animal to do something specific. Your use case of the API (your example) is that the animal needs to make a sound - whether it barks or meows is irrelevant. In this instance I think it is cleaner to:
      - Have a Make Sound method in Animal that the Cat and Dog base classes over-ride to meow or bark.
      - If it is important to be able to make any Cat meow specifically, then you are better off adding a "Make Meow" and "Make Bark" to your Cat and Dog base classes that is only available to them and their children.

      This way you clarify to your callers that any animal can make a sound, but only cats meow and only dogs bark. The best approach here is probably to favor composition and move sound-making into its own hierarchy of classes that are composed into your animals, but that's a whole different story.
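The suggestion above can be sketched in Python (the names are illustrative): Make Sound lives on Animal as the contract every caller can rely on, while meow and bark are only exposed on Cat and Dog respectively.

```python
from abc import ABC, abstractmethod

class Animal(ABC):
    @abstractmethod
    def make_sound(self):
        """Every animal can make a sound; which sound is irrelevant to callers."""

class Cat(Animal):
    def make_sound(self):
        return self.meow()

    def meow(self):
        # Only Cat (and its children) expose meowing specifically.
        return "meow"

class Dog(Animal):
    def make_sound(self):
        return self.bark()

    def bark(self):
        # Only Dog (and its children) expose barking specifically.
        return "woof"

for animal in (Cat(), Dog()):
    print(animal.make_sound())   # callers rely only on "make a sound"
```

The caller's loop never mentions meow or bark, so substituting any Animal works - which is exactly what LSP asks for.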
  10. ak_nz

    SCC & Libraries

    We use VI Package Manager for this (vip files) and then deploy a single VI Package Configuration file (vipc) for the actual project that contains the correct version of the package dependencies. This way a developer can open the source and deploy the correct version of the packages.
  11. What you describe is exactly how I deal with objects I want to share. Yes, you have to consider the ramifications of multi-threaded access and manage what occurs in the IPE node (I almost never dynamic dispatch in there, to protect against deadlocks in a future derived class), and you have to be sure you have captured your atomic activities, either through a DVR or an internal static / non-re-entrant member of some kind. But I only go the DVR route when I know that sharing is an objective; it's never the default.
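A text-language analogue of that pattern, sketched in Python (the class is hypothetical): the reference (here a lock plus data) stays private to the class, and each atomic operation is a small critical section, mirroring an IPE structure with no dynamic dispatch inside it.

```python
import threading

class SharedCounter:
    """Sketch of the pattern: the lock and data are private to the class,
    and every atomic activity is captured as a method."""

    def __init__(self):
        self._lock = threading.Lock()
        self._count = 0

    def increment(self):
        # The critical section mirrors an IPE structure: keep it small,
        # and avoid dispatching to overridable code inside it so a future
        # derived class can't deadlock here.
        with self._lock:
            self._count += 1
            return self._count
```

Callers never see the lock itself, just as the DVR never leaves the private data of the class.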
  12. We can run VI Tester tests in an automated fashion using the API (just like the UTF). You'd need the development environment on your CI machine, though. There are plenty of posts around about setting up a CI system with LabVIEW. I'd also like to add that VI Tester will not be deprecated by the UTF. They are tools that target testing in different ways - neither one will supplant the other any time soon.
  13. ak_nz

    Timeout on IPE for DVR?

    Welcome to the wonderful world of references, right? You can see the lack of time-out on the DVR as a mild objection to reference-based designs from R&D. Personally we store the DVR in the private data of the class; the DVR is never exposed to callers. As far as I know there are no in-built VI Analyzer checks, though I can see that it wouldn't be too hard to create your own.
  14. ak_nz

    Timeout on IPE for DVR?

    The best practice we follow is to limit activities in DVRs to setting or retrieving data. We don't use DVRs as a "lock" mechanism to limit access to a resource. If we need such behavior, we implement some other locking functionality that has time-out capability, such as semaphores, SEQs or the ESF. This has allowed us to avoid a host of issues, such as deadlocks in dynamic dispatch methods between DVR'd classes in a hierarchy.
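The "locking primitive with a time-out" idea can be sketched in Python (a stand-in for a LabVIEW semaphore with time-out; the names are illustrative): the caller waits a bounded time for the resource and gets an error on expiry, instead of blocking forever the way an IPE on a DVR does.

```python
import threading

resource_lock = threading.Lock()   # stand-in for a semaphore with time-out

def do_exclusive_work(timeout_s):
    # Unlike an IPE on a DVR, the wait here is bounded: on a deadlock the
    # caller gets a time-out error instead of hanging forever.
    if not resource_lock.acquire(timeout=timeout_s):
        raise TimeoutError("could not acquire resource within time-out")
    try:
        return "did work"
    finally:
        resource_lock.release()
```

The bounded wait is what turns a silent deadlock into a diagnosable error, which is the whole point of preferring a time-out-capable primitive over the DVR's implicit lock.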
