Taylorh140

Members
  • Content Count

    49
  • Joined

  • Last visited

  • Days Won

    3

Taylorh140 last won the day on March 31

Taylorh140 had the most liked content!

Community Reputation

6

About Taylorh140

  • Rank
    More Active

Profile Information

  • Gender
    Male
  • Location
    Manhattan, KS

LabVIEW Information

  • Version
    LabVIEW 2016
  • Since
    2010

Recent Profile Visitors

1,242 profile views
  1. Nice simple find! Thanks for the tip.
  2. @X___ So... the article is a little misleading in its description of epsilon. For a single float, the first step from 0 to the next value is 1.401E-45, which is much smaller than epsilon (even than a double's epsilon). In reality epsilon just represents a guaranteed minimum step size you can expect in the range 0-1. It's calculated by taking the step size just after 1. I know from experience that it doesn't hold for larger values: if you keep adding one to a double, it will only increment for a while before it can't represent the next gap. But I was curious what the epsilon was too. So hopefully that helps.
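     If it helps, here's a quick sketch of the same thing in standard C (using libm's nextafterf; this is just to illustrate the point, not LabVIEW code):

     #include <float.h>
     #include <math.h>
     #include <stdio.h>

     int main(void){
         /* Epsilon is the gap between 1.0 and the next representable value. */
         float eps  = nextafterf(1.0f, 2.0f) - 1.0f;  /* equals FLT_EPSILON */
         /* The first step above zero is far smaller: the smallest subnormal. */
         float tiny = nextafterf(0.0f, 1.0f);         /* about 1.401e-45 */
         printf("epsilon      = %g (FLT_EPSILON = %g)\n", eps, FLT_EPSILON);
         printf("next above 0 = %g\n", tiny);
         /* Past 2^24 a single float can't even represent +1 steps anymore. */
         float big = 16777216.0f;                     /* 2^24 */
         printf("2^24 + 1     = %.1f\n", big + 1.0f); /* still 16777216.0 */
         return 0;
     }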
  3. So I put something together. It implements NextUp and NextDown. I was thinking it would be nice to have an approximation compare that takes a number of up/down steps as its tolerance (sketched below). Let me know if you think there is any interest. https://github.com/taylorh140/LabVIEW-Float-Utility If you're curious please check it out, and make sure I don't have any hidden bugs. 😁
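     Here is roughly the compare I have in mind, sketched in plain C rather than LabVIEW (floatToOrdered, approxEqualUlps, and maxUlps are just illustrative names, not from the library):

     #include <math.h>
     #include <stdbool.h>
     #include <stdint.h>
     #include <string.h>

     /* Map a float's bits onto a monotonically increasing integer scale.
        Negative floats are stored sign-magnitude, so flip them below zero. */
     static int64_t floatToOrdered(float f){
         uint32_t bits;
         memcpy(&bits, &f, sizeof bits);
         return (bits & 0x80000000u) ? -(int64_t)(bits & 0x7FFFFFFFu)
                                     : (int64_t)bits;
     }

     /* True when a and b are within maxUlps representable steps of each other. */
     bool approxEqualUlps(float a, float b, int64_t maxUlps){
         if (isnan(a) || isnan(b)) return false; /* NaN equals nothing */
         int64_t diff = floatToOrdered(a) - floatToOrdered(b);
         return (diff < 0 ? -diff : diff) <= maxUlps;
     }

     On this scale one step is exactly one NextUp/NextDown, so the tolerance reads as "within N representable values."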
  4. So I have a serious question. Does LabVIEW have a way (built in or library) of calculating the next larger or next smaller floating point value? The C standard library "math.h" has two functions: nextafter and nexttoward. I have put together a C function that seems to do the trick nicely for single floats (well, only for the step up):

     #include <math.h>
     #include <stdint.h>

     uint32_t nextUpFloat(uint32_t v){
         uint32_t sign;
         if (v==0x80000000 || v==0){ // zero and negative zero
             return 1;
         }
         if ((v>=0x7F800000 && v<0x80000000) || (v>0xFF800000)){ // Inf and NaN
             return v; // no higher value in these cases
         }
         sign = v & 0x80000000; // get the sign bit
         v &= 0x7FFFFFFF;       // strip the sign bit
         if (sign==0){
             v++;               // positive: step away from zero
         } else {
             v--;               // negative: step toward zero
         }
         return v | sign;       // re-merge the sign
     }

     I could put this in LabVIEW, but these things are tricky and there are some unexpected cases, so it's always better to use a reference.
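     For reference, here's the minimal harness I'd use to sanity-check it against libm's nextafterf (memcpy plays the role of a LabVIEW Type Cast here; it assumes the nextUpFloat above is compiled alongside):

     #include <math.h>
     #include <stdint.h>
     #include <stdio.h>
     #include <string.h>

     uint32_t nextUpFloat(uint32_t v); /* defined above */

     int main(void){
         float tests[] = { 0.0f, -0.0f, 1.0f, -1.0f, 1e30f, -INFINITY };
         for (size_t i = 0; i < sizeof tests / sizeof tests[0]; i++){
             uint32_t bits, upBits;
             float up;
             memcpy(&bits, &tests[i], sizeof bits); /* reinterpret, don't convert */
             upBits = nextUpFloat(bits);
             memcpy(&up, &upBits, sizeof up);
             printf("%g -> %g (libm says %g)\n",
                    tests[i], up, nextafterf(tests[i], INFINITY));
         }
         return 0;
     }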
  5. @hooovahh I have to agree, some UIs just do fancy things, and I am a sucker for a good-looking UI. Keeping them synchronized with scrolling can definitely be a challenge. (Mouse Move and Wheel Scroll events update like crazy; do you limit the maximum events?) Let me know if you find a perfect way to handle scroll synchronization. I've used the picture control in the past to draw run-time configurable controls. It can be a resource hog if not carefully pruned. (I always hear about how the picture control is so fast and lean, but in my experience that's only true for simple items with fewer than ~100 draw instructions.) So many UI things. It is usually the most time-consuming part. (For the people and the CPU.)
  6. @LogMAN Good basic info. I think perhaps the biggest item on that list against property nodes was: "Negatives: Required to update the front panel item every single time they are called." But this is the same cost as a changing value, and the dereferences would be hardly noticeable as a speed impact. I think there is more going on here than the KB lets on.
  7. So, for me it is not just about response time but CPU usage. We do development on laptops, and keeping things below 10% usually prevents the fan from turning on. For some reason, using property nodes to do things is very CPU intensive. It is curious to me that the property node takes so much effort to get things done. Changing property node use (in my case) brought CPU usage from 16-21% down to around 0.2-1.0%. I believe the LabVIEW UI is based on an older version of Qt (I might be wrong), which is a retained-mode GUI. I theorize the cost of setting UI properties is that they are expected to be immediately present on the user interface. Theorizing again, it could be possible to access a retained property and have a much lower cost for this method of UI update. But, eh, old UIs don't change easily, and I'm sure they had good reasons for picking the things they did. The hard part for me is that it is hard to make educated decisions on how to interact with UI elements. Strange caveats exist, like @mcduff explained above. It would be interesting if there were somewhere more of these caveats were collected.
  8. So I took some time to do some research here. I believe my method will have to be reviewed for fairness, so I'll upload the two VIs I used. It looks to me that if you have fewer than 10 elements updating on average, you'll probably get decent performance. But after looking at this, I would probably recommend always deferring front panel updates if managing property nodes in any way. MethodPerformance.vi TestRunner.vi
  9. So, I wanted to get an opinion here. I'm pulling new values in from a device and have UI elements change color depending on their value. This is of course done through property nodes. My question is which is better: A. Updating each UI element only on value change (checking for value changes), or B. Updating all UI elements at once within a defer-front-panel-updates segment (and not checking for value changes)? That's all.
  10. Could you save for a previous version of LabVIEW? Maybe I'm slow, but I'm still back in 2017.
  11. Started some work on a simple C code generator for LabVIEW. https://github.com/taylorh140/LabVIEWCGen I was working on some ideas to handle some more complex items. Had some thoughts about using https://github.com/rxi/vec to handle strings and other arrays/vectors; I would like to pass these more complex data types by pointer. However, I am looking for some ideas as to when to make a copy (one possible scheme is sketched below). I know LabVIEW has some internal mechanism that manages this, but I haven't found a way of accessing it quite yet. Let me know if you think this is worth some effort, or if you have any ideas about the architecture that you think would be helpful. Also, I imagine that this will still only ever support a small subset of LabVIEW; the languages are quite a bit different, and I prefer that the generated code be as clean as possible.
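     To make the "when to copy" question concrete, here's one scheme I've been considering, sketched in plain C: a reference-counted buffer that only clones when a writer isn't the sole owner. All the names (Arr, arr_make_writable, etc.) are hypothetical; this isn't how LabVIEW or rxi/vec actually do it internally:

     #include <stdlib.h>
     #include <string.h>

     typedef struct {
         int    refs;   /* how many owners currently share this buffer */
         size_t len;
         double data[]; /* flexible array member holding the elements */
     } Arr;

     Arr *arr_new(size_t len){
         Arr *a = malloc(sizeof(Arr) + len * sizeof(double));
         a->refs = 1;
         a->len  = len;
         return a;
     }

     Arr *arr_retain(Arr *a){ a->refs++; return a; }  /* share, no copy */

     void arr_release(Arr *a){ if (--a->refs == 0) free(a); }

     /* Call before any in-place write: copy only if someone else still
        holds a reference (copy-on-write). */
     Arr *arr_make_writable(Arr *a){
         if (a->refs == 1) return a;                  /* sole owner: reuse */
         Arr *copy = arr_new(a->len);
         memcpy(copy->data, a->data, a->len * sizeof(double));
         a->refs--;
         return copy;
     }

     The generated code would call arr_make_writable at every node that modifies an array, which pushes to run time a decision I believe LabVIEW's inplaceness analysis mostly makes at compile time.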
  12. I found some information on how to set the group. Since the input types could change, it would be good to auto-configure the groups based on the inputs to the mixed signal graph. I have a basic configuration VI that should do the trick in most cases. It iterates through the cluster items on the input type and, based on each type, sets the group by setting the Plot.YScaleIdx property of the active plot (it also sets the active plot). It does assume that input arrays will be XY time data in the format: 1D array of (cluster of two elements: (1D array of timestamp), (1D array of double)) //because why wouldn't it? AutoGroupInputs.vi
  13. I am trying to use a mixed signal graph to plot a set of digital data (12 lines) and 4 arrays of analog data in XY format. For the analog data I want each array to be assigned to a different group in the graph, but at runtime they seem to all default to Group 1 (the Pressure graph shown below). I have been able to change them using the legend or the graph properties for each plot, but cannot find a way to do it programmatically. Is there a property node that can be used to set this?
  14. I just installed DCAF through VIPM. Interestingly enough, it uses the NI Current Value Table as a dependency. So maybe there are no worries after all regarding your initial post.
  15. Is it just me, or is NI really bad at actually describing what their products do? I had no idea that DCAF even fell in the same realm of function as CVT. I thought it was some kind of generic configuration-distribution tool. I have mixed guesses about the variant support: on one hand it's nice to know the specific type (of course you could query this with variants), but they use variant attributes for the back end, so I don't see any reason they couldn't have included them.