
Taylorh140

Members
  • Content Count

    40
  • Joined

  • Last visited

  • Days Won

    2

Taylorh140 last won the day on February 18 2018

Taylorh140 had the most liked content!

Community Reputation

5

About Taylorh140

  • Rank
    More Active

Profile Information

  • Gender
    Male
  • Location
    Manhattan, KS

LabVIEW Information

  • Version
    LabVIEW 2016
  • Since
    2010

Recent Profile Visitors

1,021 profile views
  1. Could you save it for a previous version of LabVIEW? Maybe I'm slow, but I'm still back in 2017.
  2. Started some work on a simple C code generator for LabVIEW: https://github.com/taylorh140/LabVIEWCGen I have been working on ideas for handling some of the more complex items. I had some thoughts about using https://github.com/rxi/vec to handle strings and other arrays/vectors; I would like to pass these more complex data types by pointer. However, I am looking for ideas on when to make a copy. I know LabVIEW has an internal mechanism that manages this, but I haven't found a way to access it yet. Let me know if you think this is worth the effort, or if you have any ideas about the architecture that you think would be helpful. Also, I imagine this will only ever support a small subset of LabVIEW; the languages are quite different, and I prefer that the generated code be as clean as possible.
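     (A sketch, not code from the LabVIEWCGen repo: one way the generated C could decide when to copy is to reference-count each buffer and duplicate it only when an in-place modification is requested while the buffer is still shared, which is roughly what LabVIEW's in-placeness bookkeeping does internally. All of the names below are made up for illustration.)

     /* Hypothetical reference-counted array handle with a copy-on-write check. */
     #include <stdlib.h>
     #include <string.h>

     typedef struct {
         double *data;   /* element storage                     */
         size_t  len;    /* number of elements                  */
         int     refs;   /* how many wires/variables share this */
     } DblArray;

     DblArray *dbl_array_new(size_t len) {
         DblArray *a = malloc(sizeof *a);
         a->data = calloc(len, sizeof *a->data);
         a->len  = len;
         a->refs = 1;
         return a;
     }

     /* A branched wire just bumps the count instead of copying. */
     DblArray *dbl_array_share(DblArray *a) { a->refs++; return a; }

     /* Before modifying in place, copy only if someone else still holds
        a reference ("copy on write"); the sole owner can modify directly. */
     DblArray *dbl_array_make_unique(DblArray *a) {
         if (a->refs == 1) return a;
         DblArray *copy = dbl_array_new(a->len);
         memcpy(copy->data, a->data, a->len * sizeof *a->data);
         a->refs--;
         return copy;
     }

     void dbl_array_release(DblArray *a) {
         if (--a->refs == 0) { free(a->data); free(a); }
     }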
  3. I found some information on how to set the group. Since the input types could change, it would be good to auto-configure the types based on the inputs to the mixed signal graph. I have a basic configuration VI that should do the trick in most cases. It iterates through the cluster items of the input type and, based on the types, sets the group by setting the Plot.YScaleIdx property of the active plot (it also sets the active plot). It does assume that the input arrays will be XY time data in the format: 1D array of (cluster of two elements: (1D array of timestamp), (1D array of double)), because why wouldn't they be? AutoGroupInputs.vi
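     (Not part of AutoGroupInputs.vi, just a hypothetical C mirror of the assumed input layout so the nesting above is easier to picture; time_t stands in for the LabVIEW timestamp.)

     #include <stddef.h>
     #include <time.h>

     typedef struct {     /* one XY plot: cluster of (timestamps, values) */
         time_t *t;       /* X: 1D array of timestamp                     */
         double *y;       /* Y: 1D array of double                        */
         size_t  n;       /* points in this plot                          */
     } XYPlot;

     typedef struct {     /* the graph input: 1D array of XY plots        */
         XYPlot *plots;
         size_t  count;
     } GraphInput;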
  4. I am trying to use a mixed signal graph to plot a set of digital data (12 lines) and 4 arrays of analog data in XY format. For the analog data I want each array to be assigned to a different group in the graph, but at runtime they all seem to default to Group 1 (the Pressure graph shown below). I have been able to change them using the legend or the graph properties for each plot, but I cannot find a way to do it programmatically. Is there a property node that can be used to set this?
  5. I just installed DCAF through VIPM. Interestingly enough, it uses the NI Current Value Table as a dependency, so maybe there are no worries after all about your initial post.
  6. Is it just me, or is NI really bad at actually describing what their products do? I had no idea that DCAF even fell in the same realm of function as the CVT; I thought it was some kind of generic configuration-distribution tool. I have mixed feelings about the variant support: on one hand it's nice to know the specific type (of course you could query this with variants), and since they use variant attributes for the back end, I don't see any reason they couldn't have included them.
  7. Actually, the fact that the data is a string of commands makes a lot of sense. When I did not use smooth display, I saw the individual elements being drawn even when I had front panel updates deferred. Since I used bunches of static text, I decided to draw it only the first time, by using the Erase First property: setting the value to 2 (erase every time) and then to 0 (never erase). I tried to use the 1 setting (erase first time), but it always erased the stuff I wanted to draw the first time. I noticed that if I used a sub-VI to do operations like erase-first, they didn't propagate up a level, and the same operations needed to be done on the top-level picture control. After those changes I went from 30 fps to 120 fps, which I find more acceptable. The control only used the picture palette drawing functions and looked somewhat like the attached image, except with real information:
  8. I have never gotten the performance that I desire out of the 2D picture control. I always think that it should be cheaper than using controls, since it doesn't have to handle user inputs, click events, etc., but it always seems to be slower. I was wondering if any of the wizards out there had any 2D picture control performance tips that could help me out. Some questions that come to mind: Is the conversion between a pixmap and a picture costly? Why does the picture control behave poorly when in a shift register? What do the Erase First settings cost performance-wise? Anything you can think of that is a bad idea with picture controls? Anything you can think of that is generally a good idea with picture controls?
  9. I typically view the conditional disable as a different kind of item, not really that different from a case structure with a global wired to the conditional terminal: run this code or that code depending on the configuration of the system, but the code still runs in the production environment. This idea is a bit different because the code runs only in the development environment. The idea is, in some ways, a very convenient pre-build step. It also increases consistency: why do a pre-build step on the PC and VI memory initialization on FPGA when you could do both in a uniform fashion? Why not use LabVIEW as its own pre-processor? The value would be for things like reading configuration files from the development PC that I don't want editable in production, easily labeling the front panel with the SVN revision and build time, and perhaps even reducing startup time for LabVIEW apps by preprocessing startup code. To me this feels like it would be a great additional structure. I do appreciate your thoughts on the topic, thanks.
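     (For what it's worth, C already has a rough version of this idea: constants resolved on the build machine and frozen into the binary. This is only an analogy for the proposed structure; SVN_REV is a made-up macro a build script could pass, e.g. cc -DSVN_REV='"r1234"' about.c, while __DATE__ and __TIME__ are standard compiler macros.)

     #include <stdio.h>

     #ifndef SVN_REV
     #define SVN_REV "unversioned"   /* fallback when the build script passes nothing */
     #endif

     int main(void) {
         /* __DATE__ and __TIME__ expand at compile time, so the production
            binary carries the values without them being editable later. */
         printf("Build: %s (%s %s)\n", SVN_REV, __DATE__, __TIME__);
         return 0;
     }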
  10. I posted this on the Idea Exchange a few days ago. I have been wondering why no one sees the need for constant generation at compile time. I think this would be especially useful in FPGA, but also for constants that need data from the build environment. Perhaps there is just a better way of doing these types of operations? Let me know what you think.
  11. Yeah, I was actually doing the opposite: using an array of strings and a value to generate an enum at runtime. I was afraid that what I was trying to do might be confusing. I might be getting encoded data from a device, and to decode it, instead of using a cluster (which contains a great deal of useful information apart from the values), I use variant attributes. Since the specification is sometimes in a document, it is easier to pull from that source as opposed to decoding everything using LV wires and clusters. However, when looking at variant attributes, especially for enumerations, it's hard to see what the value actually means. This allows me to have a probe view of the data that looks like the one on the right as opposed to the left.
  12. So it wasn't so bad to put this together. I hope it ends up being useful! Variant_Scoped_Enum.vi
  13. So sometimes when you do protocol decoding, it is convenient to use a variant to store the decoded data. For example, you might have a 2-bit enumeration that is decoded like this: 00 -> Good, 01 -> Bad, 10 -> Ugly, 11 -> Catastrophic. If you cast the value to an enumeration that contains these values beforehand, you can see them on the variant control; if you use a ring, you will only see a value. I know that the LV flatten-to-string contains the enumeration strings, but the encoding is a bit of a mystery, although it looks like the OpenG palette has figured it out to some degree. To me it doesn't look like there is any reason I couldn't generate an enum to use inside the scope of a variant. Has anyone done this, or does anyone know how to generate the string for this purpose?
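     (Just to make the decode concrete, here is the same idea in plain C; the bit offset and field layout are invented for illustration. The point of embedding the enum in the variant is exactly this: the 2-bit field turns into a named value instead of a bare number.)

     #include <stdio.h>
     #include <stdint.h>

     typedef enum { GOOD = 0, BAD = 1, UGLY = 2, CATASTROPHIC = 3 } Status;

     static const char *status_name[] = { "Good", "Bad", "Ugly", "Catastrophic" };

     /* Pull a 2-bit field out of a raw byte at the given bit offset. */
     static Status decode_status(uint8_t raw, unsigned bit_offset) {
         return (Status)((raw >> bit_offset) & 0x3u);
     }

     int main(void) {
         uint8_t raw = 0x80;                 /* example encoded byte      */
         Status s = decode_status(raw, 6);   /* bits 7..6 -> 0b10 -> Ugly */
         printf("status = %s (%d)\n", status_name[s], (int)s);
         return 0;
     }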
  14. I found this gem! Strange that it isn't the recommended setting in the tutorial. http://digital.ni.com/public.nsf/allkb/4A8B626B55B96C248625796000569FA9
  15. I am attempting to use a LabVIEW executable as a background service. I got two good pieces of information from these sources: http://digital.ni.com/public.nsf/allkb/4A8B626B55B96C248625796000569FA9 http://digital.ni.com/public.nsf/allkb/EFEAE56A94A007D586256EF3006E258B Now the window does not flicker or really appear at all, and it doesn't show up on the task bar. However, IT STEALS FOCUS from other applications when it starts, which interrupts data entry. I am using some .NET code to start up my processes (I was hoping that this would do most of the work for me). I was curious if anyone had any other suggestions?
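     (One thing that might be worth trying, sketched below as plain Win32 rather than the .NET launcher I'm actually using: ask CreateProcess to start the child's window without activation so it cannot grab focus. Whether the target honors wShowWindow depends on how it shows its first window, so treat this as an experiment, not a guaranteed fix.)

     #include <windows.h>

     /* Launch exe_path so that its main window starts without stealing focus. */
     BOOL launch_background(const wchar_t *exe_path) {
         STARTUPINFOW si = { 0 };
         PROCESS_INFORMATION pi;

         si.cb          = sizeof si;
         si.dwFlags     = STARTF_USESHOWWINDOW;
         si.wShowWindow = SW_SHOWMINNOACTIVE;   /* or SW_HIDE for no window at all */

         if (!CreateProcessW(exe_path, NULL, NULL, NULL, FALSE,
                             0, NULL, NULL, &si, &pi))
             return FALSE;

         CloseHandle(pi.hThread);               /* we only needed to start it */
         CloseHandle(pi.hProcess);
         return TRUE;
     }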