
Question about Tab control


maybe


Here is the question, say, I have a tab control which has

0/ page 1

1/ page 2

2/ page 3

3/ page 4

...

9/ page 10

I know that the tab control works like an enum and I can get the current tab page number and name.

If I want to know how many tabs (tab pages) lie between page 10 and page 2, I could just do something like 9 - 1. However, I do not know how to actually do this in LabVIEW. I want to do something like "page 10" - "page 2" so that the program knows it is 9 - 1.

To be specific, my actual question is that I want to calculate the number of pages between a specific page (say, page 2 in this example) and the current tab page (say, page 10 in this example). Any suggestions?

I only know that I can make an enum constant with exactly the same enum data as the tab. Since I know how to get the current tab page number, I can do the calculation that way. However, I have no idea how to do it without creating a new element, using just the tab control itself.

Thanks.


An enum/tab can be typecast to an integer. Think of it as an index with an array of strings attached. The enum displays the string value corresponding to the index selected.

If you subtract two enums, you'll get the distance between tabs (enum values); subtract one more to get the number of pages in between...

See the coercion dot (red dot) on the Subtract icon? That means the enum is typecast to a datatype compatible with the Subtract function. You should normally typecast your data yourself to make sure LV does what you think it will do.

[Attachment: post-10515-1219448718.png]
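For anyone thinking of it in text terms, here is a rough C sketch of the same idea (the page names are just the ones from the question; LabVIEW does all of this graphically, of course):

#include <stdio.h>

/* Rough C analogy: a LabVIEW tab control behaves like an enum, i.e. a set
   of named integer values starting at 0 (page names taken from the question). */
typedef enum { PAGE_1, PAGE_2, PAGE_3, PAGE_4, /* ... */ PAGE_10 = 9 } TabPage;

int main(void)
{
    TabPage current  = PAGE_10;   /* current tab page, value 9 */
    TabPage specific = PAGE_2;    /* reference page, value 1   */

    int distance = (int)current - (int)specific;   /* 9 - 1 = 8 tabs apart     */
    int between  = distance - 1;                   /* 7 pages strictly between */

    printf("distance = %d, pages in between = %d\n", distance, between);
    return 0;
}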


QUOTE (maybe @ Aug 23 2008, 01:33 AM)

I do something similar to what you posted, but where did you get the enum constant "page 2"?

Just right-click on the Tab Control terminal and select Create -> Constant

If you plan on changing the names of the tabs in the future, I'd recommend making the tab control a typedef. That way the constants will update with the new values when you change page names.


QUOTE (hfettig @ Aug 23 2008, 11:56 PM)

Just right-click on the Tab Control terminal and select Create -> Constant

If you plan on changing the names of the tabs in the future, I'd recommend making the tab control a typedef. That way the constants will update with the new values when you change page names.

Thx hfettig, that's exactly what I was looking for!!!

:headbang: I tried to find "Create -> Constant" on the front panel and got nothing, so I was lost. I hadn't tried looking at it from the block diagram.


QUOTE (normandinf @ Aug 22 2008, 11:32 PM)

Of course... :headbang:

Well, convert is really just a wrapper for typecasting...

Hehehe, it's time I took some vacation. Wish me nice weather this week. :rolleyes:

No! Conversion and Typecasting are NOT the same.

Conversion tries to maintain the numeric value. "Tries" because it can't always do that if you convert a number into a representation whose range is smaller than the current number. In that case the result is clamped (coerced) to the maximum/minimum possible value for that range.

Typecast maintains the binary representation in memory. This means the numeric value will in most cases change significantly. In the case of typecasting enums into numerics and vice versa, you also have to watch out that both sides use the same number of integer bits (so a 16-bit unsigned/signed integer, for instance).

LabVIEW's Typecast internally uses a big-endian stream representation. So typecasting an I32 into a U16 enum will normally give you the value corresponding to the first enum entry, since the uppermost 16 bits of the I32 are likely 0.

Rolf Kalbermatter
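A rough C illustration of the difference Rolf describes, just a sketch and not LabVIEW's actual implementation:

#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    double x = 1.0;

    /* Conversion: tries to keep the numeric value (1.0 becomes 1). */
    uint64_t converted = (uint64_t)x;

    /* Typecast: keeps the bits in memory and reinterprets them.
       1.0 as an IEEE-754 double is 0x3FF0000000000000. */
    uint64_t reinterpreted;
    memcpy(&reinterpreted, &x, sizeof reinterpreted);

    printf("converted     = %llu\n", (unsigned long long)converted);
    printf("reinterpreted = 0x%llX\n", (unsigned long long)reinterpreted);

    /* LabVIEW's Typecast reads the flattened (big-endian) byte stream, so an
       I32 value of 5 typecast to a U16 enum sees only the two most significant
       bytes, which are 0 here -- hence the first enum entry. */
    int32_t  i32   = 5;
    uint16_t upper = (uint16_t)((uint32_t)i32 >> 16);
    printf("upper 16 bits of %d = %u\n", i32, (unsigned)upper);

    return 0;
}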


QUOTE (rolfk @ Aug 26 2008, 02:34 AM)

No! Conversion and Typecasting are NOT the same.

Rolf Kalbermatter

In addition, Typecasting is much slower since it implicitly flattens the source data to string, then unflattens it into the target data type. That's a relatively slow operation, which I've found out the hard way. It's not really a big deal until you start doing it thousands of times in a loop, however.


QUOTE (ragglefrock @ Aug 26 2008, 01:07 PM)

In addition, Typecasting is much slower since it implicitly flattens the source data to string, then unflattens it into the target data type. That's a relatively slow operation, which I've found out the hard way. It's not really a big deal until you start doing it thousands of times in a loop, however.

You can typecast from any data into any datatype as long as both are flat. And there shouldn't really be a huge overhead from Typecast, since the memory usually stays the same. It's more a matter of the wire type (color) changing than anything else, and that is an edit-time operation, not a runtime one.

For instance, typecasting a uint16 enum into an int16 integer should not involve any data copying (unless the incoming wire is also used in place somewhere else, but that is simply a dataflow requirement, not something specific to Typecast). If the memory size is not the same, then yes, there will have to be some data copying. But typecasting a 1D array into a string should really not cause a memory copy. The same memory area can be used; only its compiler type is changed, and the array length indicator adapted to indicate the size in bytes instead of in array elements.

Rolf Kalbermatter
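In C terms, the "same memory, different type" idea looks roughly like this (a sketch; LabVIEW manages its own data handles internally):

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* A 1D byte array viewed as a string: no bytes are copied, the same
       memory is simply read through a different type.  (A LabVIEW string
       stores a byte count instead of an element count, but the data block
       itself can stay where it is.) */
    uint8_t data[] = { 'L', 'a', 'b', 'V', 'I', 'E', 'W', '\0' };
    const char *as_string = (const char *)data;   /* reinterpretation, not a copy */

    printf("%s\n", as_string);
    return 0;
}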


QUOTE (rolfk @ Aug 27 2008, 04:53 PM)

You can typecast from any data into any datatype as long as both are flat. And there shouldn't really be a huge overhead from Typecast, since the memory usually stays the same. It's more a matter of the wire type (color) changing than anything else, and that is an edit-time operation, not a runtime one.

For instance, typecasting a uint16 enum into an int16 integer should not involve any data copying (unless the incoming wire is also used in place somewhere else, but that is simply a dataflow requirement, not something specific to Typecast). If the memory size is not the same, then yes, there will have to be some data copying. But typecasting a 1D array into a string should really not cause a memory copy. The same memory area can be used; only its compiler type is changed, and the array length indicator adapted to indicate the size in bytes instead of in array elements.

Rolf Kalbermatter

Not sure if Typecast is smart enough to use data in place. I actually doubt it. I personally saw significant overhead (~15 µs) for typecasting a double into a U64. Same size, but slow performance, especially since I was doing it in a loop. If it were a true C-style cast operation, it would have taken no time at all. Some functions are smart in the sense you described, such as U8 Array to String and vice versa. Those are free functions that only have edit-time behavior. Type Cast, I believe, is always a safe run-time function.

I actually ended up writing my own DLL function in C to do a cast from double to U64 to get around the performance hit. The DLL function wasn't in place and copied the source data over to the destination, and it was still a lot faster than Type Cast.
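A hypothetical sketch of what such a DLL function could look like (not the original poster's code; the exported name DoubleToU64Bits is made up):

#include <stdint.h>
#include <string.h>

#ifdef _WIN32
  #define DLLEXPORT __declspec(dllexport)
#else
  #define DLLEXPORT
#endif

/* Hypothetical DLL export: copies the 8 bytes of a double into a uint64_t
   unchanged, i.e. a plain bit-for-bit cast with no value conversion and no
   byte reordering.  Callable from a Call Library Function Node with both
   parameters passed by pointer. */
DLLEXPORT void DoubleToU64Bits(const double *src, uint64_t *dst)
{
    memcpy(dst, src, sizeof *dst);
}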


QUOTE (ragglefrock @ Aug 28 2008, 05:38 PM)

Not sure if Typecast is smart enough to use data in place. I actually doubt it. I personally saw significant overhead (~15 µs) for typecasting a double into a U64. Same size, but slow performance, especially since I was doing it in a loop. If it were a true C-style cast operation, it would have taken no time at all. Some functions are smart in the sense you described, such as U8 Array to String and vice versa. Those are free functions that only have edit-time behavior. Type Cast, I believe, is always a safe run-time function.

I actually ended up writing my own DLL function in C to do a cast from double to U64 to get around the performance hit. The DLL function wasn't in place and copied the source data over to the destination, and it was still a lot faster than Type Cast.

There are several reasons why Typecast could be slow in that case.

1) Typecast uses big-endian byte ordering internally. You may say: but these are both numbers, not a byte stream; however, for some very strange reason, the byte ordering for floating-point values in LabVIEW does not follow this big-endian scheme. So as far as I remember, it will probably byte-shuffle the data too when doing a typecast between integer and floating point. I know it doesn't do the right byte shuffling between byte stream and...

2) Floating-point values are put into the FPU to operate on them. Maybe Typecast does something unnecessary there, since for a mere typecast, involving the FPU certainly wouldn't be necessary.

3) 64-bit integers are a fairly recent addition to LabVIEW. If this was with LabVIEW 8.0 or maybe 8.2, the Typecast operation could have been anything but optimal when 64-bit integers were involved.

Rolf Kalbermatter
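To illustrate point 1, here is a small sketch (assuming a little-endian x86 host) of the byte reordering that a big-endian flattening implies, work that a plain C cast would never do:

#include <stdint.h>
#include <stdio.h>

/* Illustrative byte swap: going through a big-endian byte stream means a
   little-endian machine has to reorder bytes on the way in and out, which
   is extra work compared with a direct reinterpretation. */
static uint32_t swap32(uint32_t v)
{
    return (v >> 24) | ((v >> 8) & 0x0000FF00u)
         | ((v << 8) & 0x00FF0000u) | (v << 24);
}

int main(void)
{
    uint32_t value     = 0x12345678u;
    uint32_t flattened = swap32(value);   /* big-endian byte order on x86 */
    printf("native 0x%08X -> flattened 0x%08X\n", (unsigned)value, (unsigned)flattened);
    return 0;
}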

