Everything posted by MarkCG
-
I am really just writing this to put thoughts to paper, so it is a meandering post with no real point. In past projects I used a layer of indirection between the scan engine and FPGA I/O by means of a current value table (CVT), with all I/O scanned and written to the table in one place in the code. This made it easy to substitute values in the CVT with simulated values entered from a test panel. I abandoned that approach in subsequent projects because it seemed not quite right: you have shared variables, so why was I bypassing that entire system, which has a lot of functionality and add-ons, for some parallel system? For projects where I just use the cRIO scan engine, I want to cook up a system where my test panel programmatically takes control of all the cRIO's I/O via this "force variable" functionality. I'd like to run the real-time program on a cRIO chassis devoid of modules so I can see how the program performs on the actual hardware while testing out logic and functionality through manual test panels as well as unit tests. Anyone else do it this way or try it before?
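The indirection pattern itself is simple. Here is a rough Python sketch of the idea (the class and method names are purely illustrative, not any NI API):

from threading import Lock

class CurrentValueTable:
    # Tag table sitting between the I/O scan and the application logic.
    # A forced tag returns its simulated value instead of the scanned one.
    def __init__(self):
        self._lock = Lock()
        self._scanned = {}   # last values read from the real I/O
        self._forced = {}    # tag -> simulated value from the test panel

    def scan_write(self, tag, value):
        # Called from the single I/O scan loop.
        with self._lock:
            self._scanned[tag] = value

    def read(self, tag):
        # Application logic reads here; a force wins over real I/O.
        with self._lock:
            return self._forced.get(tag, self._scanned.get(tag))

    def force(self, tag, value):
        # Test panel takes control of a channel.
        with self._lock:
            self._forced[tag] = value

    def unforce(self, tag):
        # Return the channel to the real scanned value.
        with self._lock:
            self._forced.pop(tag, None)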
-
I, like many others, have wanted to be able to target any Xilinx FPGA with LabVIEW. There are at least two projects I could have done if this were possible. What these guys have done is impressive. I hope this product forces NI and Xilinx to finally, officially allow targeting any and all Xilinx FPGAs with NI tools. They could try taking legal action, but I think they have more to lose than to gain by that route.
-
Open Source SW and malicious intent
MarkCG replied to Matthew Rowley's topic in OpenG General Discussions
Who would certify it in any meaningful legal sense? If instead they mean someone, an employee or consultant, looks it over and says on paper "I don't see any obvious malicious code in here," that might be feasible: you could open each of the VIs in the packages you will use and inspect each one. I think the license for LabVIEW itself has some kind of verbiage in it that it's all "at your own risk" anyway. That sounds like a pretty onerous requirement.
-
I would bet that part of it is that they see not having to hire consultants to set up and write software for the system as a plus as well.
-
I remember seeing Elijah's presentation at some event or another and I thought it was pretty slick. I spent a good amount of time following the Actor Framework discussions on the NI community group but decided I really didn't need the functionality on display there for almost anything I did. I do use HALs, just not based on the Actor Framework.
-
Yes, that kind of made me laugh. I've learned not to expect a whole lot from the AEs who answer the forums or the phones, especially with performance questions. They are kids fresh out of college who have way less experience with LabVIEW than any of us, and I don't think they are allowed to bother the engineers except for real emergencies. To their credit, they do sometimes do a really good job diagnosing weird bugs and things like that, so I am glad they are there for that.
-
These are the conclusions I came to. MathScript RT seems to me like one of those typical NI products where they had a good thought but never REALLY followed through with it enough to make it REALLY useful. Possibly I'm just ignorant of people who rely on it and use it in their systems. There is a solution for running Simulink and MathScript in real time out there, it's just not part of the NI toolchain: https://www.speedgoat.ch/ is what I recommended the customer in question look at.
-
Detect straight lines & their angle on a waveform
MarkCG replied to Ano Ano's topic in LabVIEW General
A very good way to take an N-th-order derivative of a noisy signal is the Savitzky-Golay filter, which is included in LabVIEW. Read up on it and you will see it is much better than the naive method of taking the derivative (the simple forward, backward, and central approximations).
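For comparison, here is a rough sketch of the idea in Python using SciPy's implementation (the window length and polynomial order are just example tuning values):

import numpy as np
from scipy.signal import savgol_filter

dt = 0.001                                  # sample period in seconds
t = np.arange(0.0, 1.0, dt)
x = np.sin(2 * np.pi * 5 * t) + 0.05 * np.random.randn(t.size)

# Naive central difference amplifies the noise badly.
naive = np.gradient(x, dt)

# Savitzky-Golay fits a local polynomial and differentiates the fit.
smoothed = savgol_filter(x, window_length=31, polyorder=3, deriv=1, delta=dt)
-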
A technique called cepstral analysis is often used to find echoes. I don't think LabVIEW has a cepstrum VI that comes with the standard analysis library, but MATLAB does: http://www.mathworks.com/help/signal/ref/cceps.html#bub3q8y There is cepstral analysis in the Advanced Signal Processing Toolkit: http://www.ni.com/example/30654/en/
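If you want to roll your own, the real cepstrum is just the inverse FFT of the log magnitude spectrum. A minimal Python sketch (the echo delay and test signal are made up for illustration):

import numpy as np

def real_cepstrum(x):
    # Inverse FFT of the log magnitude spectrum; the small epsilon
    # avoids log(0) at spectral nulls.
    return np.fft.ifft(np.log(np.abs(np.fft.fft(x)) + 1e-12)).real

fs = 8000
signal = np.random.randn(fs // 2)
delay = 200                        # echo delay in samples
echoed = signal.copy()
echoed[delay:] += 0.5 * signal[:-delay]

ceps = real_cepstrum(echoed)
# The echo shows up as a peak near quefrency = delay samples.
print(np.argmax(ceps[50:1000]) + 50)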
-
Yeah, really hoping he won't drive the company into the ground. You always hear horror stories about CEOs strip-mining the company to boost the stock price in the short term and get a nice bonus.
-
That's very good information, thank you! I kind of suspected this was a dead end. There seems to be a lot of potential for higher loop rates on the cRIO and this is an interesting thread related to this subject. http://forums.ni.com/t5/LabVIEW-Embedded/cRIO-Poor-Performance-Where-have-my-MIPS-gone/td-p/3026371
-
I have a client who wants to be able to use some of the control system code they developed in MATLAB/MathScript to do actual control with hardware. The loop rate will be in the 10-20 kHz range. Normally when I do RT development I would implement this sort of control loop on the FPGA. I was thinking that perhaps the MathScript RT module could be used to run MathScript on a cRIO. The I/O would go through the FPGA interface node, not the scan engine, as that is limited to about 1000 Hz from what I understand. I guess this boils down to two questions. 1. Will the overhead associated with the MathScript node, even for the simplest script, be high enough to use up the entire CPU even at a rate as low as 1 kHz? 2. Can I/O be updated at 10-20 kHz through the FPGA I/O read/write node? Regards, MarkCG
-
LabVIEW is now recompiling my RT Main VIs every time I open one of these projects. I do not have compiled code separated from source either. I wonder if it has to do with VIs shared by both RT and Windows.
-
Yes, you can write a program in LabVIEW to communicate with the instrument. Writing instrument drivers is tedious, but you have all the tools at your disposal to do so in LabVIEW. The USB port is, with 99% probability, a serial port emulator. When you plug the instrument into your computer it should show up as a "COM port". You then write and read strings to this port with the VIs under the VISA palette, under Instrument Control. You have to make sure the baud rate, parity, and other serial settings are correct and match the specification of the instrument. The company will have some sort of document that lists the instrument's commands, so get your hands on that, look at some of the serial comm examples, and start with an example VI to write commands to the instrument.
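To illustrate the flow (shown here in Python with pyserial rather than LabVIEW/VISA, but the idea is the same -- the port name, settings, and the *IDN? query are placeholders; use the values from your instrument's manual):

import serial

ser = serial.Serial(
    port="COM3",              # whatever the emulated port enumerates as
    baudrate=9600,
    parity=serial.PARITY_NONE,
    stopbits=serial.STOPBITS_ONE,
    bytesize=serial.EIGHTBITS,
    timeout=1.0,              # don't block forever on a read
)

ser.write(b"*IDN?\r\n")           # send a command string
response = ser.readline()         # read the reply up to the newline
print(response.decode(errors="replace"))
ser.close()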
-
That's great! The couple of times I went to NI Week, I felt like the NI-presented sessions were technically extremely weak. I really felt like the conference fee and lost work time were not worth it for this reason. It may convince me to come next year.
-
You really hit the nail on the head here smarlow!! I wish NI would GET THIS.
-
I don't think I will need to implement any custom logic in FPGA. Having worked with FPGA, I only really want to use it if there is something I HAVE to do in it that I can't implement on the real-time side with a scan-engine-synchronized loop: multi-kilohertz control loop rates, signal generation, or digital logic. Motion control will be handled by purpose-built third-party EtherCAT servo controllers, far better than anything I could implement. In my experience, implementing custom FPGA logic on the EtherCAT slave is not really worth it unless you truly need it. That is, it's worth it if you have a physically distributed network and you need FPGA logic AT that physical EtherCAT node. Otherwise, with master and slave colocated in the same cabinet, I prefer to put any FPGA code on the master, which can have a big FPGA, and just use the slaves as expansion chassis for scanned I/O.

Thank you for pointing out the CPU issue. I believe I will get a high-performance cRIO then. It's good to know it can be done with multiple third-party slaves. I am going to have to analyze what happens during resets, thank you for bringing that KB to my attention. Behavior during reset can be odd for certain devices -- the NI 9401 DIO module outputs go HIGH during reset, for example. Oh yes, I have the compatibility chart KB bookmarked: http://www.ni.com/product-documentation/8136/en/

It's funny you mention UDVs, because that is exactly what generated a nearly show-stopping bug for me a year ago -- a CAR was issued: http://forums.ni.com/t5/Real-Time-Measurement-and/cRIO-application-works-in-interactive-mode-but-broken-VIs/td-p/3095135 I wonder where the UDV limitation comes from? Perhaps NI will do a refresh on the cRIO EtherCAT chassis soon and put a bigger FPGA on it.
-
Very good! Thank you for pointing me to this article, I was not aware of it. The only thing I am wary about is running into a bug that would be a complete show-stopper, which is a risk with hardware configurations that are theoretically possible but that no one really uses, especially when using third-party devices.
-
Hi all, I am designing the control system for what will be a fairly large machine, and I am considering the ways the overall system can be architected. I have created systems with a cRIO master and a single cRIO 9144 EtherCAT slave. However, I have not tried using multiple EtherCAT cRIOs and/or third-party EtherCAT devices like servo controllers. Has anyone used more than one EtherCAT slave with a cRIO? How many slaves can you realistically daisy-chain off one of the newer "value" cRIOs like the 9068? I know the answer will depend on the number of scanned variables and the scan engine period. At a 10 ms scan engine period, what can you expect?
-
What do you use?
-
LabVIEW is not dead yet, but it definitely feels like it's getting ready to move to a retirement community. I am skeptical that the redesign that comes out in the future will do anything to reverse the trend. I also think that the closed and proprietary nature of LabVIEW will grow into a heavier and heavier millstone around its neck. You can argue all you want about how, in certain situations, LabVIEW can save money by reducing development time and increasing productivity. The thing is, in my experience people don't think in those terms. However irrational it is, people would rather spend man-hours than dollars, since the man-hours are already paid for and more or less "free" from the manager's point of view. LabVIEW will hang on in certain niches and legacy systems, but the dream of "LabVIEW everywhere" seems dead to me.
-
It sounds like you don't need custom FPGA functionality. Programming with the scan engine interface is easy, and I personally think it's even easier than NI-DAQmx, where you need to screw around with configuring tasks and channels. With the scan engine interface, you just drag the I/O node onto your block diagram from the project and read or write from it.
-
If you want to run a cDAQ chassis, why not just get one of these: http://sine.ni.com/nips/cds/view/p/lang/en/nid/212698 It combines the industrial PC and cDAQ into one chassis.
-
Agreed, drjdpowell. I went to the trouble of figuring out how to have N instances of an executable each open a pair of network streams to remote targets, handle disconnects gracefully on both ends, and deal with all the error codes associated with them. In hindsight I should have used the simple TCP messaging library. Supposedly network streams are good for high data throughput of one data type, but how often does anyone really need that?
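For what it's worth, the reconnect-after-disconnect handling is simple enough over plain TCP. A rough Python sketch of the pattern (host, port, and retry interval are made-up values):

import socket
import time

HOST, PORT, RETRY_S = "192.168.1.10", 6341, 2.0   # placeholder endpoint

class ReconnectingSender:
    # Keeps a TCP connection alive, reopening it after any disconnect.
    def __init__(self):
        self._sock = None

    def send(self, msg: bytes):
        while True:
            try:
                if self._sock is None:
                    self._sock = socket.create_connection((HOST, PORT), timeout=5)
                self._sock.sendall(msg + b"\n")
                return
            except OSError:
                # Broken pipe, connection refused, or timeout:
                # drop the socket, wait, and try again.
                if self._sock is not None:
                    self._sock.close()
                    self._sock = None
                time.sleep(RETRY_S)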