Bobillier Posted April 8, 2009

Hi all, I've found something that should be very interesting about the Queued Message Handler design pattern: http://zone.ni.com/devzone/cda/epd/p/id/6091. There is an ogp (VIPM) package too.
PaulG. Posted April 8, 2009

I use the QMH quite often, but only for small applications that require a state machine or sequence where I need something quick and dirty. For anything medium to large I prefer a typedef'd enum feeding the queue. One of the big benefits of the enum is that when I finish a case and my next case is similar, all I have to do is "duplicate case" and I'll have most of my code for the new case. And I don't have to worry about spelling.
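Since LabVIEW diagrams can't be pasted here, a rough C analogue of that enum-fed handler gives the idea (the message names are invented): the compiler catches a misspelled state, which a string-based handler never would.

```c
#include <stdio.h>

/* Hypothetical analogue of a typedef'd enum feeding a queued
   message handler: each enum value selects one handler case. */
typedef enum { MSG_INIT, MSG_ACQUIRE, MSG_LOG, MSG_SHUTDOWN } Message;

static void handle(Message msg)
{
    switch (msg) {
    case MSG_INIT:     printf("initializing\n");   break;
    case MSG_ACQUIRE:  printf("acquiring data\n"); break;
    case MSG_LOG:      printf("logging\n");        break;
    case MSG_SHUTDOWN: printf("shutting down\n");  break;
    }
}

int main(void)
{
    /* A short "queued" sequence of messages, processed in order. */
    const Message sequence[] = { MSG_INIT, MSG_ACQUIRE, MSG_LOG, MSG_SHUTDOWN };
    for (int i = 0; i < 4; i++)
        handle(sequence[i]);
    return 0;
}
```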
Mark Yedinak Posted April 9, 2009

I haven't used this particular example of a queued message handler, but I have used them extensively in my applications. Like Paul, I prefer to use a typedef'd enum to define my states. I realize that strings are very flexible, but using strings can create a mess over time: if a state name or message type is changed, all code that used it must be updated by hand. When using a typedef'd enum, everything using it will pick up the change. Of course this does mean that you can't have a single re-usable library for your message handler, but rather must use a template that is customized for each application. For me the additional work required when using the template is worth the effort. Now if only LabVIEW allowed you to truly typecast an unsigned 16-bit integer to an enum, everything would be great. However, I have not found typecasting of integers to enums to work correctly.
Francois Normandin Posted April 9, 2009

QUOTE (Mark Yedinak @ Apr 8 2009, 10:46 AM) Now if only LabVIEW allowed you to truly typecast an unsigned 16-bit integer to an enum, everything would be great. However, I have not found typecasting of integers to enums to work correctly.

Mark, have you tried the OpenG "Coerce to Enum"?

[attached image: the OpenG "Coerce to Enum" function]
Grampa_of_Oliva_n_Eden Posted April 9, 2009

QUOTE (Mark Yedinak @ Apr 8 2009, 10:46 AM) ...Now if only LabVIEW allowed you to truly typecast an unsigned 16-bit integer to an enum, everything would be great. However, I have not found typecasting of integers to enums to work correctly.

As long as the input is a U16 it works fine for me. What problem did you observe?

Ben
Mark Yedinak Posted April 9, 2009

QUOTE (normandinf @ Apr 8 2009, 10:37 AM) Mark, have you tried the OpenG "Coerce to Enum"?

No, I will have to give this a try.

QUOTE (neBulus @ Apr 8 2009, 10:52 AM) As long as the input is a U16 it works fine for me. What problem did you observe? Ben

If you typecast an enum to a U16 and wire it to a case structure, you don't get the string values of the enum, so the cases are simply 1, 2, 3, etc. Therefore you lose the readability. Also, when I have typecast a U16 to an enum, it always seems to return the value of the first enum item rather than the actual input value.
Shaun Hayward Posted April 9, 2009

QUOTE (Mark Yedinak @ Apr 8 2009, 12:45 PM) No, I will have to give this a try. If you typecast an enum to a U16 and wire it to a case structure, you don't get the string values of the enum, so the cases are simply 1, 2, 3, etc. Therefore you lose the readability. Also, when I have typecast a U16 to an enum, it always seems to return the value of the first enum item rather than the actual input value.

One thing I have found when casting an integer to an enum is that if the two are not the same representation (i.e. both U16, or whatever), then I get the problem you describe (the result always equals the first item). So I would say double-check that the integer and the enum are the exact same data type.

Shaun
jdunham Posted April 9, 2009

QUOTE (Mark Yedinak @ Apr 8 2009, 08:45 AM) If you typecast an enum to a U16 and wire it to a case structure, you don't get the string values of the enum, so the cases are simply 1, 2, 3, etc. Therefore you lose the readability. Also, when I have typecast a U16 to an enum, it always seems to return the value of the first enum item rather than the actual input value.

We coerce to and from enums all the time, and it works fine -- IF your enum and the integer have the same numeric size (number of bits). Of course enums default to 16-bit and integers default to 32-bit, so you almost always have to fix one. You don't necessarily have to change the data type, but you should always use the numeric conversion functions. I don't mind the occasional coercion dot, except that when using enums it's a sign that something is going wrong. Why would you cast the enum to a U16 and then still expect to see the strings? If you need the strings, leave it as an enum, or convert it to a variant. The OpenG functions are very useful, but also know that these work great:

[attached image: the built-in numeric conversion functions]
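The distinction jdunham draws carries over to text languages too: LabVIEW's conversion bullets behave like an ordinary integer cast in C, which changes the width but keeps the value, whereas Type Cast behaves like reinterpreting raw bytes. A minimal sketch of the value-preserving path (the enum names are invented):

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical enum standing in for a typedef'd LabVIEW enum. */
typedef enum { CMD_IDLE, CMD_RUN, CMD_STOP, CMD_QUIT } Command;

int main(void)
{
    uint32_t raw = 3;                     /* arrives as a default 32-bit integer */
    Command cmd = (Command)(uint16_t)raw; /* narrow the width first, keep the value */
    printf("command = %d\n", cmd);        /* prints 3 (CMD_QUIT) */
    return 0;
}
```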
LAVA 1.0 Content Posted April 9, 2009

QUOTE (Shaun Hayward @ Apr 8 2009, 07:21 PM) One thing I have found when casting an integer to an enum is that if the two are not the same representation (i.e. both U16, or whatever), then I get the problem you describe (the result always equals the first item). So I would say double-check that the integer and the enum are the exact same data type.

QUOTE (jdunham @ Apr 8 2009, 08:27 PM) We coerce to and from enums all the time, and it works fine -- IF your enum and the integer have the same numeric size (number of bits). Of course enums default to 16-bit and integers default to 32-bit, so you almost always have to fix one.

This happens because typecasting is not a conversion tool! It only tells LabVIEW to look at the data at memory address X as type Y instead of type Z. Returning the first element of the enum is expected if you use a U32 as the source for a U16 enum: the first two bytes of the U32 are used as the enum value, and unless the value has a bit set in those upper two bytes (i.e. it is at least 2^16), they will be 0.

Ton
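To make Ton's point concrete, here is a small C sketch that mimics what LabVIEW's Type Cast does: flatten the source to big-endian bytes, then read the target type from the front. With a U32 source and a U16 target, the two bytes that get read are the zero-filled high bytes:

```c
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint32_t source = 5;  /* the value we meant to send into the enum */

    /* Flatten to big-endian bytes, as LabVIEW does: { 0x00, 0x00, 0x00, 0x05 } */
    uint8_t flat[4] = {
        (uint8_t)(source >> 24), (uint8_t)(source >> 16),
        (uint8_t)(source >> 8),  (uint8_t)(source)
    };

    /* A Type Cast to a U16 enum reads only the FIRST two bytes. */
    uint16_t target = (uint16_t)((flat[0] << 8) | flat[1]);

    printf("typecast result = %u\n", target);  /* prints 0: the first enum item */
    return 0;
}
```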
Mark Yedinak Posted April 9, 2009

QUOTE (jdunham @ Apr 8 2009, 01:27 PM) Why would you cast the enum to a U16 and then still expect to see the strings? If you need the strings, leave it as an enum, or convert it to a variant.

The reason I would want to convert the data is when I define a generic message construct for passing data between processes. The generic subVIs would use a cluster for the message, consisting of a U16 and a variant. The subVIs that actually pass the messages have no need to know anything about the message data or its content. The only parts of the application that need to know what the message type means are the actual source and destination of the message. However, it is useful for the message type to be defined as a typedef'd enum at the source and destination. So the message type would be treated as an enum at the source and destination, but as a simple U16 within the actual message. I suppose I could define the generic message type as a variant instead; that would make the message cluster two variants.
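For illustration, the generic message Mark describes might look like this in C (a hypothetical sketch; the variant is reduced to an opaque pointer):

```c
#include <stdint.h>
#include <stddef.h>

/* The transport subVIs only ever see a bare U16 and an opaque payload;
   the sender and receiver interpret "type" through their shared enum. */
typedef struct {
    uint16_t type;  /* converted to/from the application's typedef'd enum at each end */
    void    *data;  /* opaque payload, standing in for the variant */
    size_t   size;  /* payload size in bytes */
} Message;
```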
ned Posted April 10, 2009

QUOTE (Mark Yedinak @ Apr 8 2009, 06:33 PM) I suppose I could define the generic message type as a variant instead; that would make the message cluster two variants.

Something to consider: variant attributes are also variants. So you could get rid of the cluster entirely and instead assign each data element a "type" attribute containing your enumeration value.
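Sketched the same way (names invented), ned's suggestion amounts to the type tag riding along as named metadata on the payload instead of occupying its own slot in a cluster:

```c
#include <stdint.h>
#include <stddef.h>

/* One named attribute attached to a payload, loosely mimicking a
   LabVIEW variant attribute. */
typedef struct {
    const char *name;   /* attribute name, e.g. "type" */
    uint16_t    value;  /* the enum value carried as metadata */
} Attribute;

typedef struct {
    void      *data;      /* the payload itself */
    size_t     size;      /* payload size in bytes */
    Attribute  attrs[8];  /* named metadata attached to the payload */
    int        n_attrs;   /* number of attributes in use */
} TaggedPayload;
```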