Queued Message Handler Design Pattern


Recommended Posts

I use the QMH quite often, but only for small applications that require a state machine or sequence where I need something quick and dirty. For anything medium to large I prefer a typedef'd enum feeding the queue. One of the big benefits of the enum is that when I finish a case and my next case is similar, all I have to do is "duplicate case" and I'll have most of my code for the new case. And I don't have to worry about the spelling. :)


I haven't used this particular example of a queued message handler but have used them extensively in my applications. Like Paul, I prefer to use a typedefed enum to define my states. I do realize that strings are very flexible, but using strings can cause a mess over time because if a state name or message type is changed, all code that used it must be updated. When using a typedefed enum, everything using it will pick up the change. Of course this does mean that you cannot have a single re-usable library for your message handler but rather must use a template that is customized for each application. For me the additional work required when using the template is worth the effort.
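The maintainability argument translates to text languages as well. Here is a rough Python analogue (the `MsgType` members and the `handle` function are invented for illustration, not from the thread): renaming an enum member is picked up everywhere it is used, and a mistyped member raises an error immediately, whereas a mistyped string message would silently fall through to the wrong case.

```python
from enum import Enum, auto

class MsgType(Enum):
    INIT = auto()
    ACQUIRE = auto()
    SHUTDOWN = auto()

def handle(msg: MsgType) -> str:
    # A rename of MsgType.ACQUIRE updates every use site; a typo like
    # MsgType.AQUIRE raises AttributeError instead of failing silently,
    # which is the same safety a typedefed enum gives a LabVIEW case structure.
    if msg is MsgType.ACQUIRE:
        return "acquiring"
    return "idle"

print(handle(MsgType.ACQUIRE))  # prints "acquiring"
```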

Now if only LabVIEW allowed you to truly typecast an unsigned 16-bit integer to an enum, then everything would be great. However, I have not found typecasting of integers to enums to work correctly.


QUOTE (normandinf @ Apr 8 2009, 10:37 AM)

No, I will have to give this a try.

QUOTE (neBulus @ Apr 8 2009, 10:52 AM)

As long as the input is a U16 it works fine for me. What problem did you observe?

Ben

If you typecast an enum to a U16 and wire it to a case structure you don't get the string values of the enum, so the cases are simply 1, 2, 3, etc. Therefore you lose the readability. Also, when I have typecast a U16 to an enum it always seems to return the value of the first enum item rather than the actual value input.


QUOTE (Mark Yedinak @ Apr 8 2009, 12:45 PM)

No, I will have to give this a try.

If you typecast an enum to a U16 and wire it to a case structure you don't get the string values of the enum, so the cases are simply 1, 2, 3, etc. Therefore you lose the readability. Also, when I have typecast a U16 to an enum it always seems to return the value of the first enum item rather than the actual value input.

One thing I have found when casting an integer to an enum is that if the two are not the same representation (i.e. both U16, or whatever) then I get the problem you describe (the result always equals the first item)... So, I would say double-check that the integer and the enum are the exact same data type.

Shaun


QUOTE (Shaun Hayward @ Apr 8 2009, 07:21 PM)

QUOTE (jdunham @ Apr 8 2009, 08:27 PM)

We coerce to and from enums all the time, and it works fine -- IF your enum and the integer have the same numeric size (# bits). Of course enums default to 16-bit and integers default to 32-bit, so you almost always have to fix one.

The fact that this happens is because typecasting is not a conversion tool! It only tells LabVIEW to look at the data at memory address X as type Y instead of type Z.

The fact that it returns the first element of the enum is expected if you use a U32 as the source type for a U16 enum. The first two bytes of the U32 will be used as the enum value, and unless a bit is set in the upper two bytes (value >= 2^16), this will be 0.
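Ton's byte-level explanation can be mimicked in Python using big-endian packing, which matches how LabVIEW's Type Cast flattens data (this is a sketch of the byte behaviour, not LabVIEW code):

```python
import struct

# A U32 holding the value 3, flattened big-endian as LabVIEW's Type Cast does.
flat = struct.pack('>I', 3)              # b'\x00\x00\x00\x03'

# Type Cast to a U16 enum reads only the first two bytes, which are the
# HIGH bytes of the U32 -- zero for any value below 2^16.
as_u16 = struct.unpack('>H', flat[:2])[0]
print(as_u16)  # prints 0 -> the first enum item, regardless of the low bytes

# Matching the sizes first (a U16 source) preserves the value.
flat16 = struct.pack('>H', 3)
print(struct.unpack('>H', flat16)[0])  # prints 3
```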

Ton


QUOTE (jdunham @ Apr 8 2009, 01:27 PM)

Why would you cast the enum to a U16 and then still expect to see the strings? If you need the strings, leave it as an enum, or convert it to a variant.

The reason I would want to convert the data is when I define a generic message construct for passing data between processes. The generic subVIs would use a cluster for the message, which consists of a U16 and a variant. The subVIs used to actually pass the messages have no need to know anything about the message data or its content. The only part of the application that needs to know what the message type means is the actual source and destination of the message. However, it is useful for the message type to be defined as a typedefed enum at the source and destination. So the message type would be treated as an enum at the source and destination but as a simple U16 within the actual message. I suppose I could define the generic message type as a variant. This would result in the cluster for the message data being two variants.
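Mark's generic message cluster can be sketched in Python (the names `PrinterMsg`, `Message`, and the list-as-queue are hypothetical stand-ins, not from the thread): the transport layer carries only a plain integer plus an opaque payload, while the endpoints convert to and from the typedefed enum.

```python
from dataclasses import dataclass
from enum import IntEnum
from typing import Any

class PrinterMsg(IntEnum):
    """Typedefed enum known only to the source and destination."""
    STATUS = 0
    RESET = 1

@dataclass
class Message:
    """The generic 'cluster': a U16-style integer plus a variant-style payload."""
    msg_type: int
    data: Any

def send(queue: list, msg: Message) -> None:
    # The transport never interprets msg_type or data.
    queue.append(msg)

queue: list = []
send(queue, Message(int(PrinterMsg.RESET), None))   # source: enum -> int
received = PrinterMsg(queue[0].msg_type)            # destination: int -> enum
print(received.name)  # prints "RESET"
```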


QUOTE (Mark Yedinak @ Apr 8 2009, 06:33 PM)

I suppose I could define the generic message type as a variant. This would result in the cluster for the message data being two variants.

Something to consider: variant attributes are also variants. So, you could get rid of the cluster, and assign each data element a "type" attribute instead which would contain your enumeration value.
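A minimal sketch of that suggestion (the `Variant` class below is a hypothetical stand-in for a LabVIEW variant with named attributes, not a real API): instead of a (type, data) cluster, the data itself carries a "type" attribute holding the enumeration value.

```python
from enum import IntEnum
from typing import Any

class MsgType(IntEnum):
    STATUS = 0
    RESET = 1

class Variant:
    """Stand-in for a LabVIEW variant: a value plus named attributes."""
    def __init__(self, value: Any):
        self.value = value
        self.attrs = {}  # attribute name -> attribute value

# No cluster needed: tag the payload with its message type directly.
msg = Variant({"tray": 2})
msg.attrs["type"] = MsgType.RESET

print(MsgType(msg.attrs["type"]).name)  # prints "RESET"
```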

