
Is the User Interface threading while FP not called?



QUOTE (BrokenArrow @ Apr 29 2008, 10:27 AM)

Hello all,

If a VI is set so that it does not show the FP during a call, does it still take X time to update the indicators?

Thanks,

Richard

It really depends! :rolleyes:

If the front panel is in memory (because it was shown at some time, or because the VI uses property nodes for its front panel controls), then yes, the controls will be updated (but not drawn). It may be that LabVIEW 8.x added something so that a front panel maintains its data state even when the front panel is not in memory, but I'm too lazy to check that at the moment.

Rolf Kalbermatter


QUOTE (BrokenArrow @ Apr 29 2008, 10:27 AM)

If a VI is set so that it does not show the FP during a call, does it still take time to update the indicators?

Also, regarding this, has there been a significant change from 5.1 to 8.x in how indicators are updated when the panel is not shown?

The behavior hasn't changed in many years.

If the FP is in memory, the ctrls/indicators will update, otherwise they will not update. That much I know is true all the way back to 5.1 and before.

In 8.5, the FP is in memory if:

a) the window is open,
b) there's an open control reference to a control on the FP,
c) there are any implicitly linked property/invoke nodes on the diagram tied to controls on the FP,
d) the FP is configured in such a way that LV assumes you want to show it, such as setting a VI for dialog appearance, or
e) the VI has unsaved changes.

There may be some other edge cases (the FP of the facade VI of an XControl is always loaded, for example), but those are the main ones. I think the big reasons go as far back as we have VI Server.
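The five conditions above can be sketched as a simple boolean predicate. This is a conceptual sketch in Python, not a real LabVIEW or VI Server API; the `VIStub` class and all attribute names are invented for illustration:

```python
class VIStub:
    """Hypothetical stand-in for a VI's state; not a real LabVIEW object."""
    def __init__(self, **flags):
        self.window_open = flags.get("window_open", False)
        self.open_control_reference = flags.get("open_control_reference", False)
        self.implicit_property_nodes = flags.get("implicit_property_nodes", False)
        self.dialog_appearance = flags.get("dialog_appearance", False)
        self.unsaved_changes = flags.get("unsaved_changes", False)

def front_panel_in_memory(vi):
    """The LabVIEW 8.5 rules from the post above, as a single predicate."""
    return (
        vi.window_open                    # a) the window is open
        or vi.open_control_reference      # b) an open control reference to a FP control
        or vi.implicit_property_nodes     # c) implicitly linked property/invoke nodes
        or vi.dialog_appearance           # d) configured so LV assumes it will be shown
        or vi.unsaved_changes             # e) the VI has unsaved changes
    )
```

A VI with property nodes tied to its FP controls keeps the panel in memory even when the window stays closed, which is exactly the case that surprises people.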


QUOTE (Aristos Queue @ Apr 29 2008, 02:22 PM)

In 8.5, the FP is in memory if ... c) there's any implicitly linked property/invoke nodes on the diagram tied to controls on the FP ...

That's a really important one that a lot of people forget. Just because you're not explicitly showing the UI to the user doesn't mean you aren't forcing LabVIEW to keep the UI around...


Thanks for the responses! I need some fodder to explain why a 5.1 to 8.2 conversion runs faster. Getting rid of a lot of globals, using VISA rather than serpdrv, and eliminating the un-called panels (that used to run in the background) are the only things I can think of that programmatically sped things up. It likely also has to do with better folding / compiling since 5.1.

Richard


QUOTE (BrokenArrow @ Apr 29 2008, 05:15 PM)

Thanks for the responses! I need some fodder to explain why a 5.1 to 8.2 conversion runs faster. Getting rid of a lot of globals, using VISA rather than serpdrv, and eliminating the un-called panels (that used to run in the background) are the only things I can think of that programmatically sped things up. It likely also has to do with better folding / compiling since 5.1.

Richard

Except for VISA, maybe, that seems like an interesting set of things that might be responsible for at least part of the speed improvement. Of course the LabVIEW compiler got smarter too, so memory-copy optimizations could have a significant effect as well, but that very much depends on the architecture of your application. Hearing that it used lots of globals, it's not likely that its architecture lent itself very well to LabVIEW's optimization strengths. But getting rid of the globals might have done the trick in two ways: first by saving lots of data copies made just for the sake of the removed global itself, and second because the necessary changes in architecture might have given LabVIEW a chance to actually apply optimizations in other places too.

Of course not having VIs running in the background for no good reason (most likely not implemented as smart state machines either) will certainly give LabVIEW some leeway too in using the CPU for more useful things instead.

I take it that you are not comparing the execution speed of your LabVIEW 5.1 application on a Pentium 133 with your new LabVIEW 8.2 application on a 2 GHz dual core ;)

Rolf Kalbermatter


QUOTE (rolfk @ Apr 29 2008, 06:39 PM)

... getting rid of the globals might have done the trick in two ways: first by saving lots of data copies made just for the sake of the removed global itself, and second because the necessary changes in architecture might have given LabVIEW a chance to actually apply optimizations in other places too.

My thoughts exactly. Many of those globals were being used as locals, and those "variables" were used to avoid shift registers. I think the customer got a lot more than he thinks from the cleanup labor, which he was initially skeptical of.


QUOTE (BrokenArrow @ Apr 29 2008, 08:48 PM)

On a side note, if you maintain any 5.1 code, keep a machine with 8.2 or older on it. 8.5 no likie 5.1.

I do know. I upgraded an application (which I had never seen before) from 4.0.2 to 8.5 about a month ago. And I have every LabVIEW version from 5.1 up to 8.5, and a bit more, installed on my development computer :P

Rolf Kalbermatter


QUOTE (BrokenArrow @ Apr 30 2008, 01:52 PM)

Then you need to update your info below Chewbaka, which reads "LV:7.1 ,8.0.1 ,6.1". :D

Aaaahh, but I can't enter more than three :rolleyes: ! And besides, installed doesn't mean I use them; according to the personal settings page it asks for the 1st, 2nd, and 3rd LabVIEW versions used.

Rolf Kalbermatter

