Wait for multiple queues



I'm implementing a universal message passing subsystem to pass messages between several asynchronous parallel modules.

For the communication I'm (naturally) using queues. However, I need the message passing subsystem to wait for a new element in any of several queues, and there is no such primitive in LV; only waiting on multiple notifiers is available. I know I could work around this by pairing each message queue with an additional notifier, but is there a more elegant solution?


If there is only one recipient, do you really need several queues?

I would think about one queue that is fed by several senders.

If they need different data types, you could use a standard message type for the queue: a cluster containing an enum ("command") and a variant ("data"). For each command, the data part can then carry a different type of cluster.

If you use this kind of message queue, I am quite confident that you will soon find out many additional ways to make use of this concept. :)
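Since LabVIEW diagrams can't be pasted as text here, here is a rough Python sketch of the same idea, purely as an illustration: queue.Queue stands in for the LabVIEW queue, and an (enum, payload) tuple stands in for the command/variant cluster. All names are invented.

# One queue shared by several senders; every message is (command, data),
# analogous to a cluster of an enum and a variant.
import queue
import threading
from enum import Enum, auto

class Command(Enum):
    DATA = auto()
    QUIT = auto()

msg_queue = queue.Queue()                 # the single shared queue

def sender(name):
    for i in range(3):
        # the "data" part can be a different payload for each command
        msg_queue.put((Command.DATA, {"from": name, "value": i}))

def receiver():
    while True:
        cmd, data = msg_queue.get()       # blocks until any sender posts
        if cmd is Command.QUIT:
            break
        print(cmd.name, data)

senders = [threading.Thread(target=sender, args=(n,)) for n in ("A", "B")]
for t in senders: t.start()
for t in senders: t.join()
msg_queue.put((Command.QUIT, None))
receiver()

The receiver never needs to know how many senders exist; they all feed the same queue.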


Yair: The need for polling is one of my worst programming nightmares ;) . I need to pass each message ASAP without using 100% CPU time, and I don't think that's achievable by polling.

silmaril: Good idea! :lightbulb: I usually use queues with just one sender and one receiver, so I hadn't thought about the possibility you propose. The only technical drawback is that each sender has to send its ID together with its message, but that's definitely possible to do.


Yair: The need for polling is one of my worst programming nightmares ;) . I need to pass each message ASAP without using 100% CPU time, and I don't think that's achievable by polling.

That's a common misconception. If you poll the queues with a timeout of 5 ms (or even 1 ms), your CPU consumption will usually stay low and you will still get the data quickly (ASAP is a relative term). This depends on many factors, of course, but it holds in general.

That said, the other suggestion is better.
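To make the trade-off concrete, here is a hypothetical Python sketch of that kind of polling (not LabVIEW; the 5 ms value is just the timeout mentioned above). Each dequeue blocks for up to the timeout, so the loop spends most of its time sleeping inside the timeouts rather than spinning.

# Poll several queues with a short timeout instead of busy-waiting.
import queue
import threading

queues = [queue.Queue() for _ in range(3)]
stop = threading.Event()

def handle(item):
    print("got", item)                         # placeholder for real processing

def poll_loop():
    while not stop.is_set():
        for q in queues:
            try:
                handle(q.get(timeout=0.005))   # wait up to ~5 ms per empty queue
            except queue.Empty:
                pass                           # nothing new here, check the next one

threading.Thread(target=poll_loop, daemon=True).start()
queues[1].put("hello")                         # picked up within a few milliseconds

Worst-case latency is roughly the number of queues times the timeout, and CPU use stays near zero because the loop is blocked in the timeouts almost all of the time.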


I do something similar to this. I have multiple senders posting messages to a receiver. In my case, I wanted the system to be lossy, since I only care about the most recent message from any one sender.

I have each sender use a notifier to send its message. I then wait on multiple notifiers and process all the notifications at a maximum rate of once per second. The notifications contain a queue ref from the sender. I filter the array of notification data to throw away duplicates (the lossy part) and then process each unique queue to get the data it contains (these are single-element queues).

I find that a mix of queues (used as dynamic memory elements) and notifiers (for signaling) works quite well.


I do something similar to this. I have multiple senders posting messages to a receiver. In my case, I wanted the system to be lossy, since I only care about the most recent message from any one sender.

I have each sender use a notifier to send its message. I then wait on multiple notifiers and process all the notifications at a maximum rate of once per second. The notifications contain a queue ref from the sender. I filter the array of notification data to throw away duplicates (the lossy part) and then process each unique queue to get the data it contains (these are single-element queues).

I find that a mix of queues (used as dynamic memory elements) and notifiers (for signaling) works quite well.

I don't quite get this. The notifier itself can carry data, so why are you passing a reference to a single-element queue?


I don't quite get this. The notifier itself can carry data, so why are you passing a reference to a single-element queue?

That's what I get for not reviewing the code before I post. It has been a while since I built this.

Here is what I am actually doing:

In the receiver VI, I wait on a master queue.

When I get one or more elements, I flush the queue. I then filter the elements for duplicates. Each element has an enum (command) and a variant (data). If the command is something like 'quit', then I process it immediately (and stop the receiver loop). If the command is a new message, then the variant is a notifier refnum unique to the sender. I get the data from the notifier and display the message. I do this for each unique queue element. I then sleep one second and go back to waiting on the queue.

This way, each sender can fill the queue with as many updates as it wants and I only process one of them each second. Since the element contains the ref to the notifier, each queue element from the same sender will be the same and the filter will toss all but one. Then when I process it, the notifier ref resolves to the most recent message only and I get my data.

I suppose that now that we have lossy queues, I could replace the notifier with a single-element lossy queue and get the same effect.
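A block diagram won't paste here, so here is a rough Python approximation of that scheme, with invented names: a master queue carries (command, sender slot) elements, and a one-slot LatestValue class stands in for the notifier (or lossy single-element queue) that holds each sender's newest message.

# Master queue signals "this sender has news"; the per-sender slot is lossy.
import queue
import threading
import time

class LatestValue:                        # stands in for the notifier / lossy single-element queue
    def __init__(self):
        self._lock = threading.Lock()
        self._value = None
    def post(self, value):
        with self._lock:
            self._value = value           # overwrite: only the newest message survives
    def read(self):
        with self._lock:
            return self._value

master = queue.Queue()                    # the master queue the receiver waits on

def sender(slot, name):
    for i in range(10):
        slot.post(f"{name} update {i}")
        master.put(("msg", slot))         # duplicate elements pile up here; that's fine
        time.sleep(0.1)
    master.put(("quit", None))

def receiver():
    while True:
        items = [master.get()]            # wait for at least one element...
        while not master.empty():
            items.append(master.get())    # ...then flush whatever else is queued
        if any(cmd == "quit" for cmd, _ in items):
            break                         # 'quit' is handled immediately
        for slot in {s for cmd, s in items if cmd == "msg"}:   # toss duplicates
            print(slot.read())            # resolves to the most recent message only
        time.sleep(1.0)                   # process at most once per second

slot_a = LatestValue()
worker = threading.Thread(target=sender, args=(slot_a, "A"))
worker.start()
receiver()
worker.join()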

In another part of the code, I use the wait-on-multiple-notifiers primitive to monitor several independent threads. There is one notifier that I use to tell this consumer loop to add a new notifier to the array it waits on. I think of this as a channel. That unique notifier channel is then handed to a thread when it spawns. The thread posts its messages on that notifier channel, and when the thread exits, its last message says to delete the notifier from the array, closing the channel.

This is also a lossy scheme, so the threads can post their status as often as they want and the consumer only updates the GUI once a second with the most recent info.

In the consumer, I get all of the updates from all of the notifiers from the previous second at once and then do a single update to the GUI (with defer updates on) to keep CPU overhead to a minimum.
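Here is one more hypothetical Python sketch of that second scheme, again only an approximation: a lock-protected dict stands in for the array of notifier channels, a control queue adds and removes channels, each thread overwrites only its own latest status, and the consumer emits one batched update per second.

# Dynamic "channels" with a once-per-second batched GUI update.
import queue
import threading
import time

channels = {}                        # channel name -> most recent status (lossy)
chan_lock = threading.Lock()
control = queue.Queue()              # ("add", name) / ("remove", name) requests

def post_status(name, status):       # a thread may post as often as it likes
    with chan_lock:
        if name in channels:
            channels[name] = status  # older statuses are simply overwritten

def consumer_loop(passes):
    for _ in range(passes):
        while not control.empty():   # open or close channels as requested
            cmd, name = control.get()
            with chan_lock:
                if cmd == "add":
                    channels[name] = None
                else:                # "remove": the thread has exited
                    channels.pop(name, None)
        with chan_lock:
            snapshot = dict(channels)
        print("GUI update:", snapshot)   # one batched update per pass
        time.sleep(1.0)                  # once a second, most recent info only

control.put(("add", "worker-1"))
consumer_loop(1)                     # registers the channel (no status yet)
post_status("worker-1", "running")
consumer_loop(1)                     # now shows the latest status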

I hope that makes sense. I'm sure there are other ways to code this but this is working well for me right now.

-John
