Posts posted by Bhaskar Peesapati

  1. 7 hours ago, LogMAN said:

    This will force your consumers to execute sequentially, because only one of them gets to access the DVR at any given time, which is similar to connecting VIs with wires.
    You could enable Allow Parallel Read-only Access, so all consumers can access it at the same time, but then there could be multiple data copies.

    Have you considered sequential processing?

    Each consumer could pass the data to the next consumer when it is done. That way each consumer acts as a producer for the next, until there are no more consumers.
    It won't change the amount of memory required, but at least the data isn't quadrupled and you can get rid of those DVRs (seriously, they will haunt you eventually).
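    The chained-consumer idea above can be sketched as follows. This is a minimal Python analogue (the stage names and block sizes are hypothetical); in LabVIEW each arrow would be a queue, and each loop would dequeue from its inbox and enqueue to the next stage's queue:

    ```python
    import queue
    import threading

    processed = []  # (stage name, block length), in processing order

    def make_stage(name, inbox, outbox):
        """Consumer that handles each block, then forwards it to the next
        stage, acting as a producer for the next consumer."""
        def run():
            while True:
                item = inbox.get()
                if item is None:                 # sentinel: shut down and propagate
                    if outbox is not None:
                        outbox.put(None)
                    break
                processed.append((name, len(item)))
                if outbox is not None:
                    outbox.put(item)             # pass the same block along, no copy
        return threading.Thread(target=run)

    q1, q2 = queue.Queue(), queue.Queue()
    stages = [make_stage("analysis", q1, q2), make_stage("logging", q2, None)]
    for s in stages:
        s.start()

    q1.put([0.0] * 1000)                         # producer hands one block to the chain
    q1.put(None)                                 # then signals shutdown
    for s in stages:
        s.join()
    print(processed)                             # [('analysis', 1000), ('logging', 1000)]
    ```

    Note the trade-off LogMAN mentions: the stages run sequentially per block, so total latency is the sum of all stage times, but only one block is in flight between any two stages.
    
    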

    Looks like you do not like to use DVRs. I read from and write to the DVR as a class property in the following way only. Do you still think I might run into problems? If so, what do you suspect? The write (SET) happens in only one place.

    [attached screenshot: image.png]

  2. 5 hours ago, LogMAN said:

    Doesn't that require the producer to know about its consumers?

    You don't. Previewing queues is a lossy operation. If you want lossless data transfer to multiple concurrent consumers, you need multiple queues or a single user event that is used by multiple consumers.

    If the data loop dequeues an element before the UI loop could get a copy, the UI loop will simply have to wait for the next element. This shouldn't matter as long as the producer adds new elements fast enough. Note that I'm assuming a sampling rate of >100 Hz with a UI update rate somewhere between 10..20 Hz. For anything slower, previewing queue elements is not an optimal solution.
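    The lossless multi-queue option above can be sketched like this (a hedged Python analogue with made-up names; note that in Python the queues hold references to the same list, whereas in LabVIEW each queue would hold its own copy of the data unless you enqueue a DVR):

    ```python
    import queue
    import threading

    NUM_CONSUMERS = 3
    consumer_queues = [queue.Queue() for _ in range(NUM_CONSUMERS)]  # one queue each

    def produce(block):
        # Lossless fan-out: every consumer receives every block.
        # Each queue stores a reference to the same list here; a copy
        # would only happen if a consumer mutated it.
        for q in consumer_queues:
            q.put(block)

    results = []
    def consume(idx):
        block = consumer_queues[idx].get()   # blocks until data arrives: no loss
        results.append((idx, sum(block)))

    threads = [threading.Thread(target=consume, args=(i,))
               for i in range(NUM_CONSUMERS)]
    for t in threads:
        t.start()
    produce([1, 2, 3])
    for t in threads:
        t.join()
    print(sorted(results))                   # [(0, 6), (1, 6), (2, 6)]
    ```

    Unlike previewing a single queue, nothing is skipped: a slow consumer simply lets its own queue grow, which is exactly the memory concern to guard against.
    
    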

    From this discussion I understand the following. My requirement is to process all data without loss, which means I need multiple queues, so the storage requirement becomes 12 MB * number of consumers. With one DVR it is 12 MB of storage plus one byte to signal the consuming events that new data is available. It is true, as you mentioned, that if a consumer does not process data fast enough (as in a logging process) it could use up memory, which I need to guard against. Even though events are not designed for data processing, the data storage requirement does not grow with the number of consumers: all consumers listen only for the 'NewDataAvailable' trigger, and all of the event handlers can read from the same DVR. I appreciate any corrections to my thinking. Thanks
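    The one-copy-plus-trigger scheme described in this post can be sketched roughly as below (a hypothetical Python analogue: a lock-protected shared block stands in for the DVR, and per-consumer event flags stand in for the 'NewDataAvailable' trigger). The caveat it exposes is the one discussed above: if the producer writes again before every consumer has read, the earlier block is silently lost, so this scheme is only lossless if the producer waits for all consumers:

    ```python
    import threading

    class SharedBlock:
        """Rough analogue of a DVR: a single copy of the data behind a lock."""
        def __init__(self):
            self._lock = threading.Lock()
            self._data = None

        def write(self, data):
            with self._lock:
                self._data = data

        def read(self):
            with self._lock:
                return self._data            # all readers share this one object

    block = SharedBlock()
    flags = [threading.Event() for _ in range(3)]   # 'new data', one per consumer

    seen = []
    def consumer(i):
        flags[i].wait()                      # wait for the trigger, not the data
        seen.append((i, len(block.read()))) # then read the single shared copy

    threads = [threading.Thread(target=consumer, args=(i,)) for i in range(3)]
    for t in threads:
        t.start()

    block.write([0.0] * 12_000)              # producer writes the one and only copy
    for f in flags:                          # broadcast the trigger to all consumers
        f.set()
    for t in threads:
        t.join()
    print(sorted(seen))                      # [(0, 12000), (1, 12000), (2, 12000)]
    ```

    Storage is one block plus one flag per consumer, matching the "12 MB plus a trigger" estimate, but the back-pressure that queues give you for free has to be rebuilt by hand.
    
    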

  3. Thank you very much for your reply. The Boolean is wired to the stop terminal only to signal termination. The data is being written to TDMS files. In reality the different loops are independent. Essentially what you are saying is that only one loop dequeues while the other loops only examine the element and perform the necessary operations on it; that way only one queue is necessary. The data block is in a DVR, so there is only one copy of the data block. In the case of events, the producer fires the event to indicate new data is present and the consumers process the data. Is there a drawback to this? Again, thank you very much for your reply.
