
Taylorh140

Members
  • Content Count: 59
  • Joined
  • Last visited
  • Days Won: 4

Everything posted by Taylorh140

  1. So the way the Map works is that we can iterate over all of its elements by wiring it into a For or While Loop. However, this only works for a Map that isn't changing. If you are adding elements during the loop, is there a good way of iterating over all not-yet-iterated items? I'd like to have a version without a duplicate structure. I have these: This might be the best that can be done, but I thought I'd ask.
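     Roughly the pattern I mean, sketched in C (hypothetical: a growable worklist array stands in for "Map keys not yet visited", since plain C has no built-in map):

        #include <stdio.h>
        #include <stdlib.h>

        /* Worklist sketch: items appended during iteration still get
           visited, because the loop re-reads the count every pass. */
        typedef struct { int *items; size_t count, cap; } Worklist;

        static void wl_push(Worklist *wl, int v) {
            if (wl->count == wl->cap) {
                wl->cap = wl->cap ? wl->cap * 2 : 8;
                wl->items = realloc(wl->items, wl->cap * sizeof *wl->items);
            }
            wl->items[wl->count++] = v;
        }

        int main(void) {
            Worklist wl = {0};
            wl_push(&wl, 1); wl_push(&wl, 2); wl_push(&wl, 3);

            /* 'i' only moves forward, so every element is visited exactly
               once, including the ones added mid-loop. */
            for (size_t i = 0; i < wl.count; i++) {
                printf("visiting %d\n", wl.items[i]);
                if (wl.items[i] == 2)      /* visiting 2 creates new work */
                    wl_push(&wl, 20);
            }
            free(wl.items);
            return 0;
        }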
  2. @jacobson I agree that is a way to use it, but it seems like a waste of potential to leave it at just the wall of text, especially since the C++ guys get fancy profiling tools like this one: https://github.com/wolfpld/tracy (so pretty). And I think I'm not the only one who would write their own tool to present the data if we could get at all the necessary information. It is just a little disappointing. I became a squeaky wheel: https://forums.ni.com/t5/LabVIEW-Idea-Exchange/Desktop-Execution-Trace-Toolkit-should-be-able-to-present-data/idi-p/4046489 Y'all are welcome to join me.
  3. Ahh, this is too bad. I wasn't aware of the Real-Time Execution Trace Toolkit. I guess the title should be more explicit. Is there any way to make similar reports using the DETT (theoretically)? My eyes always glaze over looking at the wall of text generated by the DETT. VI calls and returns are matched by the tool (highlighting), but I can't find a way to get this information exported.
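     For what it's worth, the matching itself is easy once the events can be exported; a hedged C sketch, assuming a made-up event format (CALL/RETURN plus VI name plus timestamp), since the DETT doesn't actually export this today:

        #include <stdio.h>
        #include <string.h>

        /* Pair CALL/RETURN events with a stack and print per-VI durations.
           The trace format here is invented for the sketch. */
        typedef struct { const char *vi; double t; } Frame;

        int main(void) {
            struct { const char *kind, *vi; double t; } trace[] = {
                {"CALL",   "Main.vi", 0.000},
                {"CALL",   "Sub.vi",  0.010},
                {"RETURN", "Sub.vi",  0.025},
                {"RETURN", "Main.vi", 0.040},
            };
            Frame stack[64];
            int top = 0;

            for (size_t i = 0; i < sizeof trace / sizeof trace[0]; i++) {
                if (strcmp(trace[i].kind, "CALL") == 0) {
                    stack[top++] = (Frame){trace[i].vi, trace[i].t};
                } else {
                    Frame f = stack[--top];   /* matching CALL is on top */
                    printf("%*s%s took %.3f s\n",
                           top * 2, "", f.vi, trace[i].t - f.t);
                }
            }
            return 0;
        }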
  4. I was looking at a user guide for the Desktop Execution Trace Toolkit and found this: A pretty graph! ref: http://www.ni.com/pdf/manuals/323738a.pdf I would love to see this kind of detail on my applications, but I can't figure out whether it was a dropped feature or what the deal is. It sounds like it should just work by default. Maybe it's just for real-time targets (lame)? I can't even export enough information from the DETT to redraw a graph like this. Many tools have this kind of execution-time view; if it is part of the tool, how do I access it, and if not, how could I get something similar?
  5. Is this the trick where you have controls off-screen waiting to be moved into place, and graphs limited by the number of plots made ready ahead of time? I've also seen the setup where windows are embedded into the UI and reentrant VIs are used to plot (the setup is pretty non-trivial, though). I haven't really seen a solution that is nice, modular, and easy to drop into whatever UI is being worked on. Is there another?
  6. For a final case: sadly, there aren't any non-deprecated items to replace that VI, which makes this the workaround for Clusterzilla. ArrayToCluster.vim
  7. So, sometimes when I'm troubleshooting, or perhaps the customer doesn't know what they want, it's nice to make things a bit more flexible. However, I find this difficult with plots: they take a good deal of setup, and it's pretty hard to add and remove things flexibly at runtime. I'm curious about any other examples, demonstrations, or recommendations for doing similar things. I made an example here of something I'm working on for flexible runtime plotting: Plotter.vi Front Panel on PaintGraph.lvproj_My Computer _ 2020-04-22 11-04-10.mp4 It still leaves some functionality to be desired. I am using the picture control so I can add and change plots quickly, but that means more feature implementation on my end. I am actually forcing the start of the x-axis to be aligned (visually, not numerically). I still need to clear items and stack plots using colors and such, but it's all doable at runtime. I thought I'd ask because it's been something I've been looking into for a while, and I haven't found a solution that I like yet. Thanks in advance.
  8. I think I knew about that one. It's quite effective; I just have a natural avoidance of using such things in anything production-related due to the not-supported stigma. They also use quite a few files; I wish LabVIEW could use fewer files. But mostly I wanted to take a renewed crack at this problem. I used @drjdpowell's solution so I wasn't doing anything unsupported (it should be available to anyone using 2019, because of the type specialization). Cluster To Array: pass the type directly if possible; otherwise make the elements an array of variants. Array To Cluster: get the cluster type and convert to the proper cluster type. Pretty much this for 256 cases, plus two special cases. I suppose I want Array To Cluster and Cluster To Array to behave more like these by default (it's essentially a behavioral extension of the standard functions).
  9. Here is another shot. Late to the game, I guess. It's a 2019 .vim that supports cluster sizes from 1 to 256 and contains forward and reverse operations. ArrayToCluster.vim ClusterToArray.vim
  10. So there are so many ways to do evaluators. I was trying to minimize mess in the implementation: no recursion or any such nonsense; single-file capable (if Functions.vi is removed); contains tools for changing rules (in the form of a spreadsheet). It uses variant attributes for the variable namespace. Here it is: I made a nice and simple evaluator that should be compatible with older versions of LabVIEW as well, and even gave it some test cases. Also check out the GitHub page: https://github.com/taylorh140/LabVIEW_Eval I've attached Eval.vi and Functions.vi; they should be LV 8.0 compatible. Functions.vi Eval.vi
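     For the curious: one common way to evaluate infix without recursion is a shunting-yard style loop with two stacks. A stripped-down C sketch of the general technique (not the actual structure of Eval.vi), assuming single-digit operands and + - * / only:

        #include <stdio.h>
        #include <ctype.h>

        static int prec(char op) { return (op == '*' || op == '/') ? 2 : 1; }

        static double apply(double a, double b, char op) {
            switch (op) {
                case '+': return a + b;
                case '-': return a - b;
                case '*': return a * b;
                default:  return a / b;
            }
        }

        double eval(const char *s) {
            double vals[64]; char ops[64];
            int nv = 0, no = 0;
            for (; *s; s++) {
                if (isdigit((unsigned char)*s)) {
                    vals[nv++] = *s - '0';
                } else {
                    /* Reduce while the stacked operator binds at least as tight. */
                    while (no > 0 && prec(ops[no - 1]) >= prec(*s)) {
                        double b = vals[--nv], a = vals[--nv];
                        vals[nv++] = apply(a, b, ops[--no]);
                    }
                    ops[no++] = *s;
                }
            }
            while (no > 0) {
                double b = vals[--nv], a = vals[--nv];
                vals[nv++] = apply(a, b, ops[--no]);
            }
            return vals[0];
        }

        int main(void) {
            printf("1+2*3-4 = %g\n", eval("1+2*3-4"));   /* prints 3 */
            return 0;
        }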
  11. Nice simple find! Thanks for the tip.
  12. @X___ So... the article is a little misleading in its description of epsilon. For a single float, the first step from 0 to the next value is 1.401E-45, which is much smaller than epsilon (even than double epsilon). In reality, epsilon is just the step size from 1.0 to the next representable value, which bounds the step size you'll see anywhere in the range 0-1. It's calculated by getting the step size just after one: I know from experience that it doesn't hold for larger values; if you keep adding one to a double, it will only increment for a while before it can't represent the next gap. But I was curious what the epsilon was too. So hopefully that helps.
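     A quick C demonstration of the difference (FLT_TRUE_MIN is that first step up from zero; FLT_EPSILON is the gap just above 1.0):

        #include <stdio.h>
        #include <float.h>
        #include <math.h>   /* nextafterf; link with -lm on some platforms */

        int main(void) {
            /* Smallest positive (denormal) single float: first step from 0. */
            printf("FLT_TRUE_MIN = %g\n", (double)FLT_TRUE_MIN);  /* ~1.401e-45 */

            /* Epsilon: gap between 1.0 and the next representable float. */
            printf("FLT_EPSILON  = %g\n", (double)FLT_EPSILON);   /* ~1.192e-07 */
            printf("computed     = %g\n",
                   (double)(nextafterf(1.0f, 2.0f) - 1.0f));      /* same value */

            /* The gap grows with magnitude: at 2^24, adding 1.0f does nothing. */
            float big = 16777216.0f;   /* 2^24 */
            printf("big + 1 == big? %s\n", (big + 1.0f == big) ? "yes" : "no");
            return 0;
        }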
  13. So I put something together. It implements NextUp and NextDown. I was thinking it would be nice to have an approximate compare that took a number of up/down steps as the tolerance. Let me know if you think there is any interest. https://github.com/taylorh140/LabVIEW-Float-Utility If you're curious, please check it out, and make sure I don't have any hidden bugs. 😁
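     The step-tolerance compare I mean, sketched in C with the standard nextafterf (a sketch of the idea, not the library code itself):

        #include <stdio.h>
        #include <math.h>

        /* True if b is within 'steps' representable floats of a:
           walk a toward b one ULP at a time. */
        int nearly_equal(float a, float b, int steps) {
            for (int i = 0; i <= steps; i++) {
                if (a == b) return 1;
                a = nextafterf(a, b);
            }
            return 0;
        }

        int main(void) {
            float x = 0.0f;
            for (int i = 0; i < 10; i++) x += 0.1f;        /* not exactly 1.0f */
            printf("== : %d\n", x == 1.0f);                /* 0: off by a ULP */
            printf("ulp: %d\n", nearly_equal(x, 1.0f, 4)); /* 1: close enough */
            return 0;
        }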
  14. So I have a serious question. Does LabVIEW have a way (built in or via a library) of calculating the next larger or next smaller floating-point value? The C standard library math.h has two functions for this: nextafter and nexttoward. I have put together a C function that seems to do the trick nicely for single floats (well, only for stepping up):

        #include <stdint.h>

        uint32_t nextUpFloat(uint32_t v) {
            uint32_t sign;
            if (v == 0x80000000 || v == 0) {   // for zero and neg zero
                return 1;                      // smallest positive denormal
            }
            if ((v >= 0x7F800000 && v < 0x80000000) || (v > 0xFF800000)) {
                return v;                      // Inf and NaN: no higher value
            }
            sign = v & 0x80000000;             // get sign bit
            v &= 0x7FFFFFFF;                   // strip sign bit
            if (sign == 0) {
                v++;                           // positive: step magnitude up
            } else {
                v--;                           // negative: step magnitude down
            }
            return v | sign;                   // re-merge sign
        }

     I could put this in LabVIEW, but these things are tricky and there are some unexpected cases, so it's always better to work from a reference.
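     For reference, calling it from C means round-tripping the raw bits (compiled together with the function above):

        #include <stdio.h>
        #include <string.h>
        #include <stdint.h>

        uint32_t nextUpFloat(uint32_t v);     /* defined above */

        int main(void) {
            float f = 1.0f, g;
            uint32_t bits;
            memcpy(&bits, &f, sizeof bits);   /* float -> raw bits */
            bits = nextUpFloat(bits);
            memcpy(&g, &bits, sizeof g);      /* raw bits -> float */
            printf("%.9g -> %.9g\n", f, g);   /* 1 -> 1.00000012 */
            return 0;
        }

     In LabVIEW the equivalent trick is a Type Cast to U32 and back.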
  15. @hooovahh I have to agree: some UIs just do fancy things, and I am a sucker for a good-looking UI. Keeping them synchronized while scrolling can definitely be a challenge. (Mouse Move and Mouse Wheel events update like crazy; do you limit the maximum event rate?) Let me know if you find a perfect way to handle scroll synchronization. I've used the picture control in the past to draw runtime-configurable controls. It can be a resource hog if not carefully pruned. (I always hear about how the picture control is so fast and lean, but in my experience that's only true for simple pictures with fewer than ~100 draw instructions.) So many UI things. It is usually the most time-consuming part (for the people and the CPU).
  16. @LogMAN Good basic info. I think perhaps the biggest item on that list against property nodes was: "Negatives: Required to update the front panel item every single time they are called." But this is the same cost as a changing value, and dereferences would be hardly noticeable as a speed impact. I think there is more going on here than the KB lets on.
  17. So, for me it is not just about response time but CPU usage. We do development on laptops, and keeping things below 10% usually prevents the fan from turning on. For some reason, using property nodes to do things is very CPU intensive; it is curious to me that the property node takes so much effort. Changing how I used property nodes (in my case) brought CPU usage from 16-21% down to around 0.2-1.0%. I believe the LabVIEW UI is based on an older version of Qt (I might be wrong), which is a retained-mode GUI. I theorize that the cost of setting UI properties is that the changes are expected to be immediately visible on the user interface; theorizing again, it could be possible to access a retained property and have a much lower cost for this method of UI update. But, eh, old UIs don't change easily, and I'm sure they had good reasons for picking the things they did. The hard part for me is that it is difficult to make educated decisions on how to interact with UI elements. Strange caveats exist, like the one @mcduff explained above. It would be interesting if there were somewhere more of these caveats were collected.
  18. So I took some time to do some research here. I believe my method will have to be reviewed for fairness, so I'll upload the two VIs I used. It looks to me like if you have fewer than 10 elements updating on average, you'll probably get decent performance. But after looking at this, I would probably recommend always deferring front-panel updates if managing property nodes in any way. MethodPerformance.vi TestRunner.vi
  19. So, I wanted to get an opinion here. I'm pulling new values in from a device and have UI elements change color depending on their value. This is of course done through property nodes. My question is: which is better? A. Updating a UI element only on value change (checking for value changes). B. Updating all UI elements at once within a Defer Front Panel Updates segment (and not checking for value changes). That's all.
  20. Could you save for a previous version of LabVIEW? Maybe I'm slow, but I'm still back in 2017.
  21. Started some work on a simple C code generator for LabVIEW: https://github.com/taylorh140/LabVIEWCGen I was working on some ideas to handle more complex items. Had some thoughts about using https://github.com/rxi/vec to handle strings and other arrays/vectors; I would like to pass these more complex data types by pointer. However, I am looking for ideas as to when to make a copy. I know LabVIEW has some internal mechanism that manages this, but I haven't found a way of accessing it quite yet. Let me know if you think this is worth some effort, or if you have any ideas about the architecture that you think would be helpful. Also, I imagine this will still only ever support a small subset of LabVIEW; the languages are quite a bit different, and I prefer that the generated code be as clean as possible.
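     On the when-to-copy question, one common scheme is reference counting with copy-on-write: share on assignment, clone only when writing through a shared handle. A minimal C sketch with made-up names (I can't confirm this is what LabVIEW does internally):

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        /* Refcounted array: cheap to "copy", cloned lazily on first write. */
        typedef struct { int refs; size_t len; double data[]; } Arr;

        Arr *arr_new(size_t len) {
            Arr *a = calloc(1, sizeof(Arr) + len * sizeof(double));
            a->refs = 1; a->len = len;
            return a;
        }
        Arr *arr_share(Arr *a)   { a->refs++; return a; }   /* no data copy */
        void arr_release(Arr *a) { if (--a->refs == 0) free(a); }

        /* Call before any write: clones only if someone else holds a ref. */
        Arr *arr_make_unique(Arr *a) {
            if (a->refs == 1) return a;
            Arr *c = arr_new(a->len);
            memcpy(c->data, a->data, a->len * sizeof(double));
            a->refs--;
            return c;
        }

        int main(void) {
            Arr *a = arr_new(3);
            Arr *b = arr_share(a);     /* "copy": no allocation yet */
            b = arr_make_unique(b);    /* a write is coming: now it clones */
            b->data[0] = 42.0;
            printf("a[0]=%g b[0]=%g\n", a->data[0], b->data[0]);  /* 0 42 */
            arr_release(a); arr_release(b);
            return 0;
        }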
  22. I found some information on how to set the group. Since the input types could change, it would be good to auto-configure the groups based on the inputs to the mixed-signal graph. I have a basic configuration VI that should do the trick in most cases. It iterates through the cluster items of the input type and, based on the types, sets the group by setting the Plot.YScaleIdx of the active plot (it also sets the active plot). It does assume that input arrays will be XY time data in the format: 1D array of (cluster of two elements: (1D array of timestamp), (1D array of double)) (because why wouldn't it?). AutoGroupInputs.vi
  23. I am trying to use a mixed-signal graph to plot a set of digital data (12 lines) and 4 arrays of analog data in XY format. For the analog data, I want each array assigned to a different group in the graph, but at runtime they all seem to default to Group 1 (the Pressure graph shown below). I have been able to change them using the legend or the graph properties for each plot, but I cannot find a way to do it programmatically. Is there a property node that can be used to set this?
  24. I just installed DCAF through VIPM. Interestingly enough, it uses the NI Current Value Table as a dependency. So maybe there are no worries after all, with respect to your initial post.
  25. Is it just me, or is NI really bad at actually describing what their products do? I had no idea that DCAF even fell in the same realm of function as the CVT. I thought it was some kind of generic configuration-distribution tool. I have mixed guesses about the variant support: on one hand it's nice to know the specific type (of course, you could query this with variants); and since they use variant attributes for the back end, I don't see any reason they couldn't have included them.