Everything posted by MarkCG
-
Hi all, I've been putting remote panels to use recently as user interfaces for control applications running on real-time targets. Some here have said that remote panels should not really be used for anything other than diagnostic-type programs, and that a large panel with many controls and indicators is asking for trouble. Rather, the better way is to run a GUI executable on your laptop/desktop and control the remote application via shared variables / TCP messaging schemes / network streams.

You need the LV run-time engine to use remote panels. Does the host/remote unit send a copy of the front panel to the client/web browser upon connecting, then use the run-time engine to detect button clicks and send the new control values to the host when they change? And, likewise, update the client front panel image every time an indicator changes? It seems like that's not too inefficient a scheme. So I'm curious exactly in what way remote panels can go wrong.

What gains would there be from writing a GUI VI to live on your desktop that sends messages to the host via a simple TCP/IP scheme or shared variables whenever a button is pressed, and listens for host indicator changes? I'm not sure exactly how much you gain from all that work, or how it's any different from a remote panel. Unless you have lots of data to stream back and need to use network streams, or want to share data across the whole network and need to use shared variables, it seems to me a remote panel is exactly what you need for a point-to-point, low-update-rate remote user interface. So am I fooling myself here?
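For concreteness, here's the sort of "simple tcp/ip scheme" I mean, sketched in Python since I can't paste a block diagram. The host name, port, and message format are all made up for illustration; the point is just that the GUI sends one short string per control change instead of mirroring the whole panel:

```python
# Minimal sketch (not LabVIEW, just the concept in Python) of a desktop GUI
# sending a "control changed" message to the RT host and reading the reply.
# Host name, port, and message format are invented for illustration.
import socket

def send_command(name, value, host="rt-target.local", port=5000):
    """Send one control-change message and return the host's reply string."""
    msg = f"SET {name}={value}\n".encode()
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(msg)
        return sock.recv(1024).decode().strip()  # e.g. "OK" or an error text

print(send_command("PumpEnable", True))
```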
-
I have wondered about this too. I use Compact FieldPoints (mostly 2120s) and am rather uneasy about using LVOOP on them. I've encountered a number of problems working with LV2011 and cFPs, e.g. extremely slow deploy times (confirmed by NI), LV crashing when importing cFP .iak files through the project explorer, etc. This makes me think that the cFPs are not getting the love they need from NI. I have a particular component that I've implemented using traditional techniques, but that would be well suited for refactoring to OO style. I'll use it to test out how LVOOP works with Compact FieldPoints.
-
*cue "Thus spake Zarathustra"* .... Floating in the void of cyberspace, I am reborn is the OOPChild !!
-
I swear I've done more templating and cutting and pasting in the past 3 days than I have in the past 6 months. How many times did I have to create a VI identical to 10 others, except for which data in the clustosaurus it unbundles... How many times did I need to change something "for the last time" in those ten VIs, and then change every one of them subsequently... How many times did I think, ooooh, what if I could use inheritance and just change the parent class instead... When I was in high school I learned OO concepts in C++ in computer science class, and wondered "what's the point of all this 'inheritance' crap? Wouldn't it be easier to just ...." and promptly forgot all of it. Oh, the callowness of youth!!!
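The thing I kept rediscovering, in toy (non-LabVIEW) form, looks roughly like this. The class and field names are invented; the point is that ten near-identical VIs collapse into one parent plus tiny children:

```python
# Toy illustration: the shared behavior lives once in the parent, and each
# "VI that differs only in what it unbundles" becomes a tiny subclass.
class ChannelReader:
    def read(self, record):
        value = self.extract(record)   # the only part that varies
        return self.scale(value)       # shared behavior, defined once

    def scale(self, value):
        return value * 2.0             # change "for the last time" in ONE place

class PressureReader(ChannelReader):
    def extract(self, record):
        return record["pressure"]

class TempReader(ChannelReader):
    def extract(self, record):
        return record["temperature"]
```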
-
Sort clusters by one element, stable sort? Non-OOP?
MarkCG replied to MarkCG's topic in LabVIEW General
Ah yes, I think the method guentermueller linked to will work just fine; you can sort by any element you want (first, second, Nth) while ignoring the other elements. Basically it's the method drdjpowell suggested, and that altenback used in that post. It will also keep elements of the same key value in order of insertion. Perfect... thank you everybody for the suggestions!
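For anyone finding this later, the same idea expressed in Python (just as a reference for the behavior I wanted, the field name "priority" is an example): the sort is stable, and the key function picks out only the one cluster element, so ties keep their insertion order.

```python
# Stable sort on a single key: A and C tie on priority, so they keep
# their original relative order.
items = [
    {"name": "A", "priority": 2},
    {"name": "B", "priority": 1},
    {"name": "C", "priority": 2},
]
ordered = sorted(items, key=lambda item: item["priority"])
print([i["name"] for i in ordered])   # ['B', 'A', 'C']
```
-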
NI software engineering, advanced arch courses worthwhile?
MarkCG replied to MarkCG's topic in Certification and Training
I don't know why I assumed that... no offense intended! -
Hi all, in implementing code for machine control from a set of requirements, I found a place where the right data structure could make what I thought would be a somewhat complicated task very simple: a data structure that keeps elements (clusters) sorted by one key element (which will be the priority for me), while at the same time keeping elements with equal key values in the same order they were inserted (that is to say, the sort algorithm is stable), and that allows efficient element insertion/deletion. The structure could allow only unique elements or not; in the latter case I could manage it so that only unique elements are inserted.

You can use the "Sort 1D Array" function on an array of clusters, but it will sort taking EVERY element of the cluster into account. Thus elements get moved that I don't want to get moved. Now, Tomi Maila put out this community nugget back in 2007, where he used variants to implement ordered sets: give it an array of clusters, and it sorts them according to value starting with cluster element 0 and keeps only unique elements. http://ni.lithium.co...ant/td-p/489967 This would be perfect! Fast, unique elements, ordered. But again, it sorts based on ALL elements in the cluster, moving elements that I do not want moved relative to each other. I need it to sort on only one key value, the priority. I thought priority queue, but how can I easily delete elements by name? I imagine it would involve a lot of queue previewing, dequeueing, and requeueing...

Now, before anyone starts telling me off about needing to start using LVOOP (which I want to), I have a good reason!! The code will have to be able to run on LVRT 8.6, which does not support LVOOP. The program will have to run in test facilities on the other side of the globe, and I have no control over whether or not they will or can upgrade. So it pays to be conservative. Anyone have any ideas about what I ought to do? I can always write my own, I suppose.
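To show the shape of what I'm after, here it is in Python (just the standard heapq priority-queue recipe with a counter tiebreaker and lazy deletion, nothing LabVIEW-specific; the names are invented):

```python
# Priority queue that (a) sorts on one key only, (b) keeps equal-priority
# entries in insertion order via a monotonic counter, and (c) supports
# delete-by-name through lazy removal.
import heapq
import itertools

class StablePriorityQueue:
    def __init__(self):
        self._heap = []                    # entries: [priority, count, name]
        self._alive = {}                   # name -> entry, for O(1) delete
        self._counter = itertools.count()  # tiebreaker = insertion order

    def insert(self, name, priority):
        entry = [priority, next(self._counter), name]
        self._alive[name] = entry
        heapq.heappush(self._heap, entry)

    def delete(self, name):
        self._alive.pop(name)[2] = None    # mark dead; skipped on pop

    def pop(self):
        while self._heap:
            priority, _, name = heapq.heappop(self._heap)
            if name is not None:           # skip lazily-deleted entries
                del self._alive[name]
                return name, priority
        raise KeyError("pop from empty queue")
```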
-
NI software engineering, advanced arch courses worthwhile?
MarkCG replied to MarkCG's topic in Certification and Training
Thanks for all the feedback. It sounds like they are worthwhile, especially if there are hands-on exercises and good instructors. I think I will definitely take the software engineering and OO courses in person. The advanced architectures course seems to be mostly stuff I've been doing already... state machines and QSMs? Yawn. I'll get the course manuals. What I've REALLY wanted is to use and understand LVOOP and more advanced architectures like AQ's Actor Framework. I'd like to experiment more at work to learn new concepts, but the nature of my job is not such that I can do that very easily, so I'm not progressing at the speed I want. I agree that classroom work is no substitute for experience, as several of you have expressed. I think this sort of training can be valuable so you end up with "X*Y years of experience" rather than "Y years of experience, repeated X times".

One thing I can say is that obtaining the CLD was one of the best investments I've made. It's gotten me more and better potential gigs. I like to work contract, and recruiters are non-technical and much happier seeing a qualification than listening to you talk about what you did, since they don't have a clue what any of it means anyways. Once you're past the gatekeepers, you can actually talk turkey with the engineering manager. I want to work on my own as a consultant like Val Brown and ShaunR as soon as possible. I think the CLA will go a long way towards that goal. I'm still relatively inexperienced (graduated in 2007) and don't have the professional network and body of work that they do, but the CLA will help me get the gigs where I can work on big projects and develop that experience. -
Hi all, I got my CLD and now I want to get the CLA. But I work as a contractor and can't sweet-talk my employer into dropping 2-3 thousand dollars on training. So I'm thinking of heading down to Austin from Dallas and taking the next "Software Engineering" and "Advanced Architectures" courses, and possibly the LVOOP course. However, they are $1200 each ($900 for the web version... meh), and I want to think carefully about spending my hard-earned cash. So I wanted to ask here if anyone has taken these courses and, if so, what they thought of them. Did you learn a lot? Were the instructors good? Who paid? I think classroom instruction can be very valuable because you can ask questions far more easily and directly and understand the material faster... if you have the right instructors. What I don't want is some overpriced, glorified PowerPoint presentation. What are your opinions?
-
Hi all! Yes, ned, thank you for clearing that up. I was pretty sure RT did not support events associated with "user interface objects, such as VI panels or controls," as it says in the 8.6 help or manual. When AQ said that they are supported in more recent versions, I was a bit confused, as I hadn't seen anything about that. But I figured AQ knew something I didn't. I'm definitely on board with writing code with loose coupling and tight cohesion... I modularize as much as possible, but still have much to learn. I remember looking at some NI example architectures for remote machine control. That sounds like what Paul is talking about: essentially you use TCP/IP to send little string messages from the desktop UI program/host to the message handler loop running on the RT target. Good to know that the remote panel is not meant for a full application, but it IS pretty nice not to have to write all the "glue" code linking the front panel code to the RT code. I know what to do now... Thanks everybody!
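For my own notes, the receiving side of that scheme looks roughly like this (Python standing in for the RT message-handler loop; port and command vocabulary are invented):

```python
# Sketch of a message-handler loop on the target: read one string command
# per line, parse it, act on it, reply. Not LabVIEW, just the concept.
import socketserver

class CommandHandler(socketserver.StreamRequestHandler):
    def handle(self):
        for raw in self.rfile:                      # one command per line
            cmd, _, arg = raw.decode().strip().partition(" ")
            if cmd == "SET":                        # e.g. "SET PumpEnable=True"
                name, _, value = arg.partition("=")
                print(f"set {name} -> {value}")     # real code updates the control loop
                self.wfile.write(b"OK\n")
            else:
                self.wfile.write(b"ERR unknown command\n")

with socketserver.TCPServer(("", 5000), CommandHandler) as srv:
    srv.serve_forever()
```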
-
Ahhhh, if only I could upgrade... that would make things easier. So the remote front panel is the thing where you create an .html file from the main/GUI VI, deploy that to the target in addition to the compiled EXE, and then access the GUI/remote front panel through a web browser. I guess you mean it's generally not done? I can see the advantages of running a separate viewer application that uses messages to command the embedded/deployed application. Besides the fine-grained control you have over when and how data is transmitted, you can change the viewer UI without redeploying. What other advantages are there?
-
data transfer rate measurement
MarkCG replied to moralyan's topic in Remote Control, Monitoring and the Internet
What protocol are you using to transfer files? I would also be interested in this, but more to see how much data is being transferred back and forth to view a remote panel. That's not something you can easily access, AFAIK.
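For a protocol you control yourself, the measurement part is easy: just wrap the receive loop with a byte counter. A rough Python sketch (host, port, and chunk size are arbitrary):

```python
# Count bytes received over a socket for a fixed window and report
# throughput. Only works for traffic you can instrument, not for the
# remote panel protocol itself.
import socket, time

def measure_throughput(host, port, duration=5.0):
    total, start = 0, time.monotonic()
    with socket.create_connection((host, port)) as sock:
        while time.monotonic() - start < duration:
            chunk = sock.recv(65536)
            if not chunk:
                break
            total += len(chunk)
    return total / (time.monotonic() - start)   # bytes per second

print(f"{measure_throughput('example-host', 6000) / 1e6:.2f} MB/s")
```
-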
Hi all, as you probably know, event structures aren't supported in LV Real-Time. So if you want to take control of the VI in interactive mode or via a remote panel, you need to go back to the old "read the button" method... maybe you've had to use it if you've worked with the "base" version of LV. The program I inherited actually has somewhere in the range of 50-80 front panel controls, grouped into clusters in a tab control (users want to be able to control stuff!!), and it polls about 20 controls every second. I'm beginning to think that this might be a bit much for the pokey old Compact FieldPoint 2120 it's running on. Occasionally front panel activity will "freeze" the cFP if it's under heavy load, which of course is pretty much terrible. So I'm thinking of using "wait for front panel activity" to trigger the boolean reads. NI says this might be a good idea: http://zone.ni.com/reference/en-XX/help/371361H-01/lvconcepts/viewing_fp_remote/ I was kind of distraught when I learned I couldn't use event structures in LVRT; this definitely makes me feel better. Has anyone used this technique with remote panels? Any caveats?
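The gist of the change, in pseudo-Python (the event object and function names are invented; this is just the concept, not LabVIEW): block until the panel is actually touched instead of scanning every control on a timer, so an idle panel costs almost nothing.

```python
# Poll-on-activity instead of poll-on-timer: the loop sleeps until the
# activity flag is set by whatever detects a click or keypress.
import threading

activity = threading.Event()   # set by the "front panel activity" detector

def ui_scan_loop(read_all_controls):
    while True:
        activity.wait()        # sleeps until the panel is actually touched
        activity.clear()
        read_all_controls()    # only now pay for reading the ~20 controls
```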
-
Hi all, I am facing a bit of a dilemma. I've written quite a few programs using one or more QSMs; I usually use the JKI string-based state machine. They are perfectly fine for the applications I've developed so far, mainly UI and data processing for desktop instruments of one kind or another, or control systems that don't have more than one or two modes of operation. Now, however, I need to implement some control algorithms for large machines that will be quite complex. I saw right away that a typical flat QSM won't cut it... the number of different modes the machine can be in would make this pretty difficult to handle. I really don't think a flat QSM is good for anything much more complex than something like the National Instruments "Soda Machine" example it likes to use so much.

My first thought was to use the Statechart module, which would do exactly what I need and make it easy to implement all the various substates, concurrent processes, etc. However, this is a tough sell because it means money has to be spent, and that's just something the powers that be may or may not choose to do, regardless of whether it would save development time and money in the long run. My second thought was to create my own hierarchical state machine. I would do so using a string-based state machine, and use tokens to denote substates; I could then have a state with as many substates as I like. It wouldn't be nearly as pretty as the Statechart diagrams and would take longer to implement, but it would at least be more easily debuggable than the current setup. Has anyone tried making their own string-based HSM, and if so, do you have any advice for me?
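By "tokens" I mean something like this, sketched in Python since I can't draw the diagram here (state names are invented): states are colon-delimited paths like "Running:Homing", and handlers can match on any prefix, which is what gives you substates and shared parent behavior.

```python
# String-token HSM sketch: split the state string into tokens and match
# on prefixes, so all "Running:*" substates share the parent's behavior.
def handle(state):
    tokens = state.split(":")              # "Running:Homing" -> ["Running", "Homing"]
    if tokens[0] == "Running":
        run_common_checks()                # applies to ALL Running substates
        if tokens[1:] == ["Homing"]:
            do_homing_step()
        elif tokens[1:] == ["Jogging"]:
            do_jog_step()
    elif tokens[0] == "Fault":
        shut_everything_down()

# Stubs so the sketch runs; real versions would do the machine control.
def run_common_checks(): pass
def do_homing_step(): pass
def do_jog_step(): pass
def shut_everything_down(): pass

handle("Running:Homing")
```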
-
Hi y'all... so I may have to do this in order to get a 0-5 V analog output signal up to the 0-10 V input signal required by a variable frequency drive. I was thinking of doing so by means of an op-amp configured as a non-inverting amplifier to increase the voltage 2x, then putting this through a unity-gain "power buffer" just to make sure the signal gets to where it needs to go (over long wires) without a whole lot of noise and without potentially messing up the op-amp. Should/shouldn't I do this? Any advice? (No, I can't just buy a DAQ card with 0-10 V output. C'mon, that would be too easy...) Regards, Mark Garnett
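For reference, the textbook non-inverting gain relation (standard stuff; the 10 kΩ value is just an example I'd pick): G = 1 + Rf/Rg, so for a gain of 2 you make Rf = Rg, e.g. a 10 kΩ resistor from the output to the inverting input and another 10 kΩ from the inverting input to ground.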
-
Yes, you're right, "memory leak" isn't the right term for it. If it had had problems, I wouldn't have known it. But sometimes it has to acquire a lot of data, sometimes less, so the amount of memory that's been allocated is governed by the largest data set acquired. Not really a big deal, it just doesn't sit well with me.
-
So I'm guessing using queues for applications that run 24 hours a day is probably a bad idea... I've written programs for controlling processes that would buffer up a bunch of data in a queue, analyze it, adjust control parameters on-line, and then flush the queue. The whole time I'd assumed this memory was freed up, but instead I created a huge memory leak. Awesome. Definitely going to go about it some other way next time...
-
Alrighty, well, after getting approximately nowhere with NI support on this topic, I've decided that I'm going to try running it on XP Embedded anyway. It sounds like people have done it before, and NI HAS released a version of the run-time engine for WinXPe, so there probably is a way. I really don't want to buy the DSC module though... I'd be paying for so much functionality I just don't need. If that doesn't work I may try Linux, though that may end up being an even bigger headache... Thanks for the advice, y'all.
-
Hi everyone, I'm new to this forum so let me briefly introduce myself: I'm 26 years old and live in Austin, TX, where I work for a very small start-up. My degree was in Aerospace Engineering. I have been doing LabVIEW for about two years, mainly building control, monitoring, and data logging systems for test stands, running on regular Windows PCs. The time has come for me to implement a version of one of these systems in a highly reliable, field-deployable form. I'm not up to the task of rewriting everything in C or C++ (I have some experience doing this for what was a small LabVIEW project; it's tough, and it takes a long time), so I took a look at what my options were:

Compact RIO or Compact FieldPoint
Pros: high reliability, in other words it won't crash because of a programming error you didn't make (I hope). Tough hardware.
Cons: prohibitively expensive for us. Looks like $3000+ for hardware alone (need controller, analog in, digital in and out), and performance overkill for the application, as the loop time is 1 Hz (very slow!). The LabVIEW RT module adds to the bill...

Single-Board RIO
Pros: hardware not as expensive, $2000 or so; reliable. May make sense in the future for larger systems due to high channel count (32 analog, 100+ DIO).
Cons: need the LabVIEW RT AND FPGA modules! More money... Still performance overkill. NI wants you to sign an OEM agreement?!

Rack-mount server running Windows Pro + DAQ card
Pros: path of least resistance; the server can be had for $500.
Cons: Windows crashes. Server hardware is not tough (fans break, HDs crash) or heat resistant.

These seem to me the "officially supported" options given to me by NI, and I don't really like any one of them. So I thought about using this instead: a fanless "industrial"-style PC like this http://www.industria...ms/nse-2010.htm with solid-state memory and Windows XP Embedded, plus an M-series PCI DAQ card.
Pros: Windows Embedded should crash much less often. Performance matches requirements. Cheaper: $1000 or so for the PC + $400 for the DAQ + $200 for the OS (according to the company). Should require no additional software from NI except the app builder.
Cons: Windows XP Embedded does not seem to be "officially" supported, although there are a few articles on running LabVIEW executables on it.

This last solution sounds best to me, but I haven't been able to find a whole lot of discussion of it on the web. Maybe I didn't look hard enough... What do you think? Have you tried it? Heard of anyone trying it?