Tim_S
Posts posted by Tim_S
-
The benefit is that the way an error is handled is defined at runtime when the error is handled - so yes, you could use a subVI, but that would only be helpful if all of your errors are of the same type (or are handled in the same way).
This is part devil's advocate and part for my comprehension... I can see where I have errors that would be displayed differently (say, file not found versus cable not plugged in). Wouldn't I have to know what the error state is when coding in order to select the correct object (assuming OO)? If so, I don't see the benefit of defining it at runtime.
Thinking of some of your code samples, I'm handling the error message upstream of where you would. Handling the error upstream would generate more case statements at the "top" level, so later VIs would not get executed.
Tim
-
I presented at NI-Week 2009 on a couple of paradigms for extending the LabVIEW Error Handling Core (as inspired by all your posting here). Here are the resources from that presentation:
Thanks for posting this here; I wasn't able to make it to NI Week, so I'm glad for this post. I'm still processing this and where it can be of value, but here are my thoughts...
I don't use the native error handling (specifically I use the cluster, but not the VIs) as the information provided in the dialog is not useful to the end user, or I need the code to continue running as the system must respond to the error (e.g., stop the test based on a fault occurring).
The first case only has one error, so I create a message and use the 1, 2 or 3 button dialog to display what I need to the operator. I like the idea of having the option to include pictures, links to online manuals, etc., available in the error message dialog. This makes the OO in NEC an interesting concept, though I could accomplish the same thing with a subVI. I can't say I've ever had multiple errors with this type.
I handle the second case through the use of an error daemon, allowing the parallel loop to display the error so the generating loop can handle the fallout. The cluster I send to the daemon is not an error cluster (especially since I'm often dealing with multiple-language selection, a large, nasty beast in and of itself). This would be where I handle multiple errors, as there may not be an operator to click on a dialog box.
Tim
-
Unless you're some uber-architect-God,
Which, of course, I am.
I understand what you meant now. Thanks for the explanation.
By the way: TestStand really isn't *that* difficult to learn.
I worked at the very tail end of a project where the main developer implemented a system in version 1 of TestStand. I recall there being, shall we say, speaking in tongues and gnashing of teeth before the blood oaths were sworn to never speak of this ill again. As such I'm a little leery of working with it, though I have been trying to get some time to give it a serious look-see.
Tim
-
Sure, you can do a lot of stuff from scratch with LabVIEW, but history suggests it will probably all fall apart sometime. Maybe not today, maybe not tomorrow, but soon. Use LabVIEW for the encapsulated tasks that you can't do with TestStand/IVI/VISA/DLLs/SQL, and TestStand for everything else. I know it's the same message that you've heard many times before, but maybe that's because it's right.
Crelf -
Could you explain this, especially the "fall apart sometime"?
Tim
-
I'm trying to change the machine access list for a VI server at runtime (programmatically). I can add clients at runtime without any problems, but I can't remove clients without restarting the TCP listener, which would cause all connected clients to get a timeout. Does anyone know a way around this?
I've not tried this, but it seems you may want to handle access control like this programmatically yourself. You should be able to use the remote address and an internal lookup table to do what you want.
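Since the access check can't easily be shown as LabVIEW text, here's a rough sketch of the lookup-table idea in Python (the addresses and names are made up, purely illustrative):

```python
# Hypothetical access-control check: compare an incoming remote address
# against a runtime-editable allow list instead of restarting the listener.
allowed_clients = {"10.0.0.5", "10.0.0.17"}  # editable while running

def is_allowed(remote_address):
    """Return True if the connecting client may proceed."""
    return remote_address in allowed_clients

# Clients can be added or removed without touching the listener itself.
allowed_clients.discard("10.0.0.17")   # revoke access
allowed_clients.add("10.0.0.42")       # grant access
```

The point is that the listener keeps running; only the table changes.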
Tim
-
QUOTE (Ramom @ Jun 9 2009, 12:26 PM)
I'm working with a VI for a Sartorius scale, which generates the weight on a graph (XY Graph). I need to get the derivative of this graph, the instantaneous slope, but I don't know how to do it. Could someone help me?
You can do the calculation (slope) yourself, or you can go to the Mathematics -> Integration & Differentiation function subpalette. I'm assuming you have evenly spaced x-axis points.
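Doing the calculation yourself amounts to finite differences. Here's the idea sketched in Python rather than LabVIEW (assuming, as above, evenly spaced x values):

```python
# Numerical derivative (instantaneous slope) of evenly spaced XY data:
# central differences in the interior, one-sided differences at the ends.
def derivative(y, dx):
    n = len(y)
    dydx = [0.0] * n
    dydx[0] = (y[1] - y[0]) / dx                    # forward difference
    dydx[-1] = (y[-1] - y[-2]) / dx                 # backward difference
    for i in range(1, n - 1):
        dydx[i] = (y[i + 1] - y[i - 1]) / (2 * dx)  # central difference
    return dydx

weights = [0.0, 1.0, 4.0, 9.0, 16.0]  # y = x^2 sampled at dx = 1
print(derivative(weights, 1.0))       # slopes approximate 2x
```

The built-in differentiation VIs do essentially this for you, with better end-point handling.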
Tim
-
QUOTE (Rock4 @ May 28 2009, 10:58 AM)
I have now uploaded the VI with the original overall while loop removed. I am still at a loss as to how to get this to run in the sequence that I am looking for, so any pointers in the right direction would be a massive help and very much appreciated. Also thank you for the advice on books; I have just ordered a copy of "LabVIEW for Everyone" from Amazon, so I hope to be solving little problems like this for myself in the coming weeks.
It sounds as if you're having more fundamental, general programming questions than problems with LabVIEW. I think your best bet is to look at basic programming tutorials and speak with someone one-on-one. I'd recommend doing that and reading through "LabVIEW for Everyone".
Tim
-
QUOTE (Rock4 @ May 27 2009, 11:54 AM)
Thank you for mentioning you're a student and trying something first.
There's a lot of books out there. I haven't read any, but there is a list at http://sthmac.magnet.fsu.edu/labview/basic_labviewbooks.html. I expect there'll be a few opinions on good books.
Tim
-
QUOTE (jed @ May 27 2009, 12:25 AM)
Is there any way for me to get the 6220 working with 8 RSEs and 1 DIFF?
My inclination is to fix a hardware problem in hardware. The first thing that comes to mind is that a 5B module takes in a differential signal and outputs an NRSE signal. You'd have to reference it so it's exactly like your other channels, but that shouldn't be an issue.
Tim
-
QUOTE (Michael Malak @ May 26 2009, 06:26 PM)
I thought maybe I could utilize a splitter since the splitter seems to have nice resizing capabilities, but I then wanted to make the splitter invisible (to invoke it only at run-time), and that seemed to be impossible.
You can effectively make a splitter invisible by using the splitter from the Classic palette and then making it the same color as the background. The Modern palette splitter has a 3D effect; the Classic one is flat.
Tim
-
QUOTE (Warren @ May 20 2009, 03:18 PM)
My question is how can I take a constant voltage and turn it into something meaningful for LabVIEW?
Typically I would have a digital input in the form of a PCI card, remote I/O, etc., and I would wire the switch to that. Then it would be a matter of communicating with the card/bus and getting the state of the input. This would be the preferred method. You can get a USB-based solution for about 100 USD.
You can also use a parallel port as a digital input device; a little web search should provide the information you need. There are risks to doing this, such as blowing out your parallel port or (if you're really unlucky) your motherboard. This also assumes your PC has a parallel port.
Tim
-
QUOTE (Gabi1 @ May 16 2009, 07:37 AM)
At one client I have LV8, at another LV 8.2 (computer 1) and LV8.5 (computer 2), and the beta version and LV8.5 on my computer. That is becoming too messy with the different applications.
Is there a reason you need to install the latest version of the driver? It may benefit you to standardize on one version of the driver, LabVIEW, etc.
You may want to look at using virtual machines and performing your development in those, especially if you have different versions of the drivers.
Tim
-
QUOTE (lovemachinez @ May 14 2009, 10:54 AM)
Can LabVIEW create variables like the C language? I want to store a number in a variable to use in my next process. Is that possible?
LabVIEW does not operate the way you are thinking. The data is contained "on the wire". I would recommend looking at shift registers for next-loop operations.
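For comparison, here's what a shift register corresponds to in a textual language (a Python sketch, not LabVIEW syntax):

```python
# A LabVIEW shift register carries a value from one loop iteration to the
# next; in a textual language that's simply a variable updated in the loop.
total = 0                      # initialized shift register (left terminal)
for reading in [3, 5, 7]:
    total = total + reading    # new value wired to the right terminal
print(total)                   # 15
```

In LabVIEW the "variable" is the wire between the shift register terminals; no named storage is needed.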
Tim
-
QUOTE (xShadowDanx @ May 7 2009, 10:56 PM)
How can I actually take the measurement according to the period adjustment? For example, I create one more control which is the period (seconds), and then I want to save the measurement according to how many seconds I want, and then it will automatically save. Can that work?
You can wire a control that contains the milliseconds to delay to the Wait primitive. You could also have a control in seconds (which more operators understand) and then multiply by 1000 to get milliseconds.
Tim
-
QUOTE (Black Pearl @ May 5 2009, 06:36 AM)
We are the software guys in a scientific environment, so it is all a bit chaotic with ever-changing requirements. And we never ever get any specs on how to design our software. This leads to feature creep/scope creep, both from internal featureitis and customers' new ideas (they are scientists!). The later we are on schedule, the longer the list of features to implement.
We've got a similar albatross with our customers... it's called simultaneous engineering (SE).
For anyone not familiar with SE, it's akin to trying to build a cart that will sit perfectly level forevermore while the horse is still growing.
There are pieces we know we'll use... calibration, communication to a PLC or remote I/O, error management, etc. These become part of the library. Anything we create gets looked at and we try to see if it's a one-shot item or something we'll re-use. Sometimes what we reuse is surprising.
Getting an initial specification is tough. I like to mock up screens, create a little PowerPoint and write-up of how it'll work, and get sign-off on that design. The signature is the critical part, as anything after that is a change and is chargeable. Paper trails of what changes were desired and approved provide documentation as to why things are more expensive, taking longer, etc. I've heard this referred to as "managing the customer".
Tim
-
QUOTE (esqueci @ May 4 2009, 11:15 AM)
My problem is that now we decided to create DLLs out of the drivers, and when using the Call Library Function node, there's no way to call a looping DLL and continue the execution of my main program as I did before when calling the VIs. The CLF waits until the DLL finishes executing, and only after that can the main program continue.
I've used two ways to solve this issue...
The first is to call a function in the DLL that programmatically executes another VI inside the DLL. This can be fairly self-contained, and you can use function calls to interact with the parallel loop (including terminating it). A static reference to the looping VI works pretty well.
The second is to use VI server to obtain a reference to a VI inside a DLL. This can be messier and I would recommend avoiding it as you start dealing with different memory spaces.
Tim
-
QUOTE (JFM @ Apr 21 2009, 02:46 PM)
Have you tried to build the rtexe and compile the bit files on your local machine?Do you get any errors during the builds?
/J
I haven't tried compiling yet. I called NI tech support to try to "get my ducks in a row" before trying this. Support put me on hold to discuss my questions, and I was informed I couldn't compile without the physical unit. Have I been misled?
Tim
-
I've got a customer that is very remote (how many miles are between Detroit and Brazil?) who needs an update of the code on the cRIO system we provided them as part of the test stand. I can make the updates to the FPGA and RT portions, but tech support tells me I can't compile them without the physical unit. The only cRIO we bought shipped with the test stand. I'd like to be able to compile the update to the cRIO without having an identical system on my desk.
Tim
-
QUOTE (Scooter_X @ Apr 8 2009, 10:49 PM)
I'm working on a project for my LabVIEW class in which we're supposed to design software that senses and outputs a whole bunch of information about a Pinewood Derby race.
Thanks for pointing out this is for a class.
A nudge in the "right direction" is that LabVIEW can sort an array of clusters.
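To make the nudge concrete without giving the LabVIEW solution away, here's the same idea in Python: a cluster maps to a record, and an array of clusters can be sorted by one field (the field names here are invented for illustration):

```python
# An array of clusters is analogous to a list of records; sorting by one
# field gives you race results ordered by finish time.
cars = [
    {"lane": 1, "time_s": 3.12},
    {"lane": 2, "time_s": 2.98},
    {"lane": 3, "time_s": 3.05},
]
results = sorted(cars, key=lambda car: car["time_s"])
print([car["lane"] for car in results])  # fastest lane first: [2, 3, 1]
```

In LabVIEW, Sort 1D Array on an array of clusters sorts by the first cluster element, so element order in the cluster matters.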
Tim
-
I really appreciate the responses and advice. Thanks to everyone.
This has definitely been a challenge in thinking about how to structure the application for best performance where the resources are very limited. At this point I've created three VIs. The first contains the other two. The second performs the clocking, starts the DAQ, stops the DAQ, and passes data from a global variable (32-element array of U32) over a FIFO to the RT portion. The third VI writes to the global array of U32 from an FPGA I/O node and does speed calculations off encoders going to digital inputs. Populating the global with the data and pulling it out at the desired sampling rate seemed the best idea. That was until something started nagging me from, uhm... a decade or so back... about signals theory and sampling. I'm pretty sure I've held onto that textbook... time to dust it off!
-
I've been getting ready to design some code for a cRIO system and have a general architecture question. I can see where my application will be broken into separate loops within the FPGA. Some of these loops will interact with each other. I'm anticipating that not all of the loops will reside within the same VI. What have people found to be the "best" way to interact between loops within a FPGA? Functional globals seem undesirable when dealing with single-cycle loops. Queues and notifiers are not available. FPGA Memory appears to be a good choice, though it has special considerations for single-cycle loops and cannot be used with multiple clock domains. FIFOs appear to be best suited for communication between the RT and FPGA. Do global variables become a preferred choice?
An example of what I'm thinking is this: One loop performs an RPM calculation from a digital tachometer coming into a digital input. A second loop measures 7 analog inputs and sends the inputs as well as the speed to the cRIO's RT host. There is, of course, more complication that that, but that is the basics of the FPGA application.
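The RPM arithmetic itself is simple, whichever communication mechanism carries the result. A sketch in Python (the numbers are illustrative, not from the actual hardware):

```python
# RPM from a digital tachometer: count rising edges in a known time
# window, then scale by the tachometer's pulses per revolution.
def rpm_from_pulses(pulse_count, window_s, pulses_per_rev):
    revolutions = pulse_count / pulses_per_rev
    return revolutions / window_s * 60.0   # revolutions per minute

# e.g. 100 pulses in 0.5 s from a 4-pulse-per-rev tachometer wheel:
print(rpm_from_pulses(100, 0.5, 4))        # 3000.0 RPM
```

The interesting FPGA question is only where the pulse count lives so the second loop can read it alongside the analog inputs.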
Thanks.
Tim
-
QUOTE
Another method you can try is placing a breakpoint inside a case structure in your assertion VI. Then, whenever you want to trigger the assert, simply call the case with the BP in it. One problem is that if the BP is inside the assertion VI, you won't see where the actual error came from, but there are at least two options for this:
The breakpoint could be outside of the assertion VI. The assertion VI could output an error cluster which chooses between cases. The error case could contain the breakpoint. Probing at the breakpoint would give you the current error cluster's value and possibly information as to why the assertion is throwing up a red flag.
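The error-cluster-selects-the-case idea can be sketched outside LabVIEW too. A hypothetical textual version, where the caller branches on the returned status and the failure branch is where a breakpoint would live:

```python
# A minimal analog of a LabVIEW error cluster: status, code, source.
# The caller selects a case on 'status'; the error case is where the
# breakpoint (or debug hook) would go.
def assert_in_range(value, low, high, source):
    if low <= value <= high:
        return {"status": False, "code": 0, "source": ""}
    return {"status": True, "code": 5001,
            "source": f"{source}: {value} outside [{low}, {high}]"}

err = assert_in_range(12.7, 0.0, 10.0, "pressure check")
if err["status"]:
    # Breakpoint here: the cluster's 'source' field tells you why the
    # assertion is throwing up a red flag.
    print(err["source"])
```

The code 5001 and field names are made up; the point is only that the cluster carries enough context to probe at the breakpoint.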
Tim
Updating the LabVIEW Error Handling Core
in LabVIEW General
Posted
Pardon me as I take off my technician hat and put on my programmer hat...
What I was saying is that I expect a subset of errors to occur. I (currently) have to decipher these error codes to produce a meaningful message for the operator. My statement was from a technician's perspective, where each error points to a different thing to look for (think state machine).
ErrorHandling Example.vi
Here's a quick example, no OO as I'm just throwing this together. The errors coming out of the file open, write, and close will always require the same type of display (in this case, a "this is what went wrong" message). Perhaps I'm picking an example that doesn't lend itself to more advanced error handling.
Wouldn't that be: ?