LogMAN

Members
  • Content Count: 507
  • Joined
  • Last visited
  • Days Won: 47

LogMAN last won the day on November 27.
LogMAN had the most liked content!

Community Reputation: 168
3 Followers

About LogMAN
  • Rank: The 500 club
  • Birthday: 04/06/1989

Profile Information
  • Gender: Male
  • Location: Germany

LabVIEW Information
  • Version: LabVIEW 2019
  • Since: 2008

Recent Profile Visitors: 5,628 profile views

Recent Posts
  1. Make sure that you have the rights to distribute those binaries before you put them in your build specification. There is a license agreement that you accepted when you installed them on your machine. Note that you don't have to distribute those assemblies yourself; perhaps there is a runtime installer available which your clients can use. As long as the assemblies are installed on the target machine, LabVIEW will either locate them automatically, or you can specify the location in an application configuration file. Here are some resources on how assemblies are located: …
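For illustration, a sketch of such an application configuration file. The file name (e.g. MyApp.exe.config, placed next to the built executable) and the "libs" subfolder are assumptions for this example; the <probing> element tells the .NET runtime to also search the given private path when it locates assemblies:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <!-- Also probe the "libs" subfolder next to the executable
           when resolving assembly references. -->
      <probing privatePath="libs" />
    </assemblyBinding>
  </runtime>
</configuration>
```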
  2. Here is an interesting note: How LabVIEW Locates .NET Assemblies - National Instruments (ni.com)
  3. Edit: Never mind, I misread your post 😄 In order for LabVIEW to know about all dependencies in a project, it has to traverse all of its members. Because this operation is slow, it probably keeps references open for as long as possible. I'm not sure why it would unload the assembly for a standalone VI, but that is how it is.
  4. There is something strange about how this library is managed. For some reason it seems to work if the VI is part of a project, but not as a standalone VI. [Screenshots: standalone vs. as part of a project] I did not reproduce the entire example, so it might fail. At least you can try adding your VI to a project and see if it works (make sure the assembly is listed under Dependencies).
  5. I was wondering why it wasn't documented, now I know 😄
  6. 1+3 makes the most sense in my opinion. And it should be limited to a single SQL statement (or multiple statements if they use absolute parameter bindings like "?NNN"). Perhaps it also makes sense to apply that limitation to Execute SQL, because it can get very confusing when executing multiple statements in a single step with parameters that include arrays. For example, what is the expected outcome of this? I could be misled into thinking that the user name only applies to the first SQL statement and the array to the second, or perhaps that it uses the same user name multiple times…
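As a hedged aside: Python's sqlite3 module enforces exactly the single-statement limitation proposed here, which makes the semantics easy to demonstrate. This is only an analogy, not the LabVIEW library under discussion; table and column names are made up:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE log (user TEXT, value REAL)")

# One statement, one set of parameters: unambiguous.
con.execute("INSERT INTO log VALUES (?, ?)", ("alice", 1.0))

# Two statements with parameters in one call are rejected outright,
# which removes any question of how the bindings are distributed.
try:
    con.execute(
        "INSERT INTO log VALUES (?, ?); INSERT INTO log VALUES (?, ?)",
        ("alice", 2.0, "alice", 3.0),
    )
except (sqlite3.ProgrammingError, sqlite3.Warning) as e:
    print(e)  # "You can only execute one statement at a time."
```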
  7. I think you already answered your own question. Each statement only affects a single row, which means you have to wrap arrays in loops. Don't forget that Execute SQL can handle multiple statements at once. This is shown in one of the examples: note that there are multiple statements that utilize the same input cluster. You could use the Array To Cluster function to simplify the task. You are on the right path. In this case Execute SQL will report an error, because only arrays of clusters are allowed. Please take the following section with a grain of salt…
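The "wrap arrays in loops" idea translates directly to a textual sketch, again using Python's sqlite3 as a stand-in for Execute SQL (table and data are invented for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE samples (t REAL, v REAL)")

rows = [(0.0, 1.1), (0.1, 1.3), (0.2, 1.2)]  # the "array of clusters"

# Each INSERT affects a single row, so the array is wrapped in a loop...
for row in rows:
    con.execute("INSERT INTO samples VALUES (?, ?)", row)

# ...which is exactly what executemany() does for you.
con.executemany("INSERT INTO samples VALUES (?, ?)", rows)
```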
  8. +1 for flushing the event queue. Here is another solution that involves unregistering and re-registering user events. Whenever the event fires, the event registration is destroyed. At the same time, it sets the timeout for the timeout case, which will re-register the event and disable the timeout case again. This could be useful in situations where the consumer runs much (orders of magnitude) slower than the producer, in which case it makes sense to reconstruct the queue every time the consumer desires a new value. I haven't done any benchmarks, nor used it in any real-world application…
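LabVIEW code is graphical, so here is only a rough Python analogue of the "discard the backlog, keep the newest value" behaviour that this re-registration trick achieves (the helper name is invented for this sketch):

```python
import queue

def latest_or_wait(q: queue.Queue):
    """Block for one element, then discard everything older than the newest."""
    newest = q.get()                 # wait for at least one value
    while True:
        try:
            newest = q.get_nowait()  # keep draining; the last one wins
        except queue.Empty:
            return newest
```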
  9. My consumers also tend to update the whole GUI if it doesn't impact the process negatively (it rarely does). I was looking for a solution that doesn't require each consumer to receive their own copy, in order to save memory. But as @Bhaskar Peesapati already clarified, there are multiple consumers that need to work losslessly, which changes everything. Discarding events will certainly prevent the event queue from filling up all memory. I actually have a project where I decided to use the Actor Framework to separate data processing from the UI, and which filters UI update messages to keep the update rate at about 10 Hz.
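A minimal sketch of such a ~10 Hz UI filter, written as a Python analogue (the class name and rate are illustrative, not taken from the actual project):

```python
import time

class UiThrottle:
    """Forward at most one update per interval; drop the rest (lossy by design)."""
    def __init__(self, hz: float = 10.0):
        self.min_interval = 1.0 / hz
        self.last = 0.0

    def should_update(self) -> bool:
        now = time.monotonic()
        if now - self.last >= self.min_interval:
            self.last = now
            return True
        return False

# Usage: call should_update() for every incoming message and
# redraw the GUI only when it returns True.
```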
  10. I have only one project that uses multiple DVRs to keep large chunks of data in memory, which are accessed by multiple concurrent processes for various reasons. It works, but it is very difficult to follow the data flow without a chart that explains how the application works. In many cases there are good alternatives that don't require DVRs and which are easier to maintain in the long run. The final decision is yours, of course. I'm not saying that they won't work; you should just be aware of the limitations and feel comfortable using and maintaining them. For sure I'll not encourage you…
  11. This will force your consumers to execute sequentially, because only one of them gets to access the DVR at any given time, which is similar to connecting VIs with wires. You could enable Allow Parallel Read-only Access so that all consumers can access it at the same time, but then there could be multiple data copies. Have you considered sequential processing? Each consumer could pass the data to the next consumer when it is done. That way each consumer acts as a producer for the next consumer, until there are no more consumers. It won't change the amount of memory required…
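A rough Python analogue of this sequential hand-off, assuming hypothetical stages connected by queues. Note that putting an item on the next queue passes a reference, not a copy of the payload, which parallels the memory argument above:

```python
import queue
import threading

def stage(inbox: queue.Queue, outbox: queue.Queue | None):
    """Process items, then hand them to the next consumer in the chain."""
    while True:
        item = inbox.get()
        if item is None:          # sentinel: shut down and propagate
            if outbox is not None:
                outbox.put(None)
            break
        # ... do this stage's work on item ...
        if outbox is not None:
            outbox.put(item)      # pass along; no extra copy of the payload

q1, q2 = queue.Queue(), queue.Queue()
threading.Thread(target=stage, args=(q1, q2)).start()   # e.g. filter stage
threading.Thread(target=stage, args=(q2, None)).start() # e.g. logger stage
q1.put({"samples": [1, 2, 3]})
q1.put(None)
```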
  12. Okay, so this is the event-driven producer/consumer design pattern. Perhaps I misunderstood this part: If one consumer runs slower than the producer, the event queue for that particular consumer will eventually fill up all memory. So if the producer had another event for these slow-running consumers, it would need to know about those consumers. At least that was my train of thought 🤷‍♂️😄
  13. Doesn't that require the producer to know about its consumers? You don't. Previewing queues is a lossy operation. If you want lossless data transfer to multiple concurrent consumers, you need multiple queues, or a single user event that is used by multiple consumers. If the data loop dequeues an element before the UI loop could get a copy, the UI loop will simply have to wait for the next element. This shouldn't matter as long as the producer adds new elements fast enough. Note that I'm assuming a sampling rate of >100 Hz with a UI update rate somewhere between 10 and 20 Hz…
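A sketch of the "multiple queues" option in Python terms (the Broadcaster name is invented): each consumer subscribes to its own queue, so the transfer stays lossless no matter how slow an individual consumer is:

```python
import queue

class Broadcaster:
    """Lossless fan-out: every consumer gets its own queue, so a slow
    consumer never steals elements from a fast one."""
    def __init__(self):
        self.outputs: list[queue.Queue] = []

    def subscribe(self) -> queue.Queue:
        q = queue.Queue()
        self.outputs.append(q)
        return q

    def publish(self, item):
        for q in self.outputs:
            q.put(item)   # each consumer receives every element
```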
  14. That is correct. Since the UI loop can run at a different speed, there is no need to send it all data. It can simply look up the current value from the data queue at its own pace, without any impact on any of the other loops. How is a DVR useful in this scenario? Unless there are additional wire branches, there is only one copy of the data in memory at all times (except for the data shown to the user). A DVR might actually result in less optimized code. Events are not the right tool for continuous data streaming. It is much more difficult to have one loop run at a different speed…
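A rough Python analogue of such a current-value lookup that the UI polls at its own rate (the class is invented for this sketch; it is deliberately lossy on the reader side):

```python
import threading

class LatestValue:
    """Single current value the UI can poll at its own pace."""
    def __init__(self):
        self._lock = threading.Lock()
        self._value = None

    def write(self, value):   # producer side: overwrite the current value
        with self._lock:
            self._value = value

    def read(self):           # UI side: poll at e.g. 10-20 Hz
        with self._lock:
            return self._value
```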
  15. Welcome to LAVA! The loops in both of your examples are connected with Boolean wires, which forces them to execute sequentially. This is certainly not what you want. Also, both examples are variations of the Producer/Consumer design pattern. You probably want to store data losslessly, so a queue is fine. Just make sure that the queue does not eat up all of your memory (storing data should be faster than capturing new data). Maybe use TDMS files for efficient data storage. Use events if you have a command-like process with multiple sources (e.g. the UI loop could send a command to the data loop…).
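For newcomers, here is the textual skeleton of that Producer/Consumer pattern as a hedged Python analogue (the sample source and the plain binary file are placeholders; in LabVIEW the storage would be e.g. a TDMS writer):

```python
import queue
import threading

def acquire_sample() -> bytes:
    return b"\x00" * 8                 # stand-in for real data acquisition

data_q: "queue.Queue[bytes | None]" = queue.Queue()

def producer():
    for _ in range(1000):
        data_q.put(acquire_sample())   # lossless: every sample is enqueued
    data_q.put(None)                   # sentinel ends the consumer

def consumer():
    with open("data.bin", "ab") as f:  # stand-in for a TDMS writer
        while (sample := data_q.get()) is not None:
            f.write(sample)            # storage must keep up with acquisition

threading.Thread(target=producer).start()
threading.Thread(target=consumer).start()
```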