Tim_S
Posts posted by Tim_S
-
I'm using JSON for application configuration, and serialization for version updating.
-
57 minutes ago, Cat said:
According to the calendar, I'm eligible to retire in 1 year, 1 month, and 1 day. I'm doing the code cleanup for my (so far hypothetical) replacement. Or I just retire, and come back the next day as a contractor for twice what I'm making now. :-)
Isn't it standard operating procedure to do that four or five times?
-
2 minutes ago, Cat said:
I guess I gave up on LabVIEW libraries back when one of my llbs went over 1.44MB and I couldn't fit it on one floppy anymore. :-)
You're using those new-fangled 3.5" things, aren't you.
The new file layout was much better for distributing applications after the Report Generation Toolkit went to LVOOP. Even though I don't use objects, so many shipping things I use are LVOOP that I don't check that box.
-
Traditionally I've used SQL Server for the database with LabVIEW's built-in database functions. I've had too much "fun" purchasing a copy of SQL Server for projects leaving the country (woo for the ever-changing international trade regulations). I've started exploring MariaDB as an alternative; it can also be set up over ODBC, so it's just slight differences in SQL statements that I need to worry about.
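As a rough illustration of the kind of dialect difference I mean (the table and column names here are made up; this is just a sketch, not code from my project), row limiting is one of the classic ones:

```python
# Illustrative only: the "results" table and "run_time" column are hypothetical.
# SQL Server and MariaDB differ in small ways; row limiting is a common one.

def latest_results_query(dialect: str, n: int = 10) -> str:
    """Build a 'latest N rows' query for the given SQL dialect."""
    if dialect == "sqlserver":
        # SQL Server puts TOP in the SELECT clause
        return f"SELECT TOP {n} * FROM results ORDER BY run_time DESC"
    elif dialect == "mariadb":
        # MariaDB (like MySQL) uses LIMIT at the end
        return f"SELECT * FROM results ORDER BY run_time DESC LIMIT {n}"
    raise ValueError(f"unknown dialect: {dialect}")

print(latest_results_query("sqlserver", 5))
print(latest_results_query("mariadb", 5))
```

Keeping the dialect-specific text in one place like this is what makes swapping databases behind ODBC mostly painless.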
-
What you ran into is that, by default, LabVIEW 2012 uses .NET 2. There is a significant change in .NET 3 and later that requires different behavior down in the bowels of LabVIEW. NI has an article on how to specify using later .NET versions. LabVIEW 2015 uses .NET 4 by default (I'm not sure in which version the switch occurred). One option would be to have LabVIEW use .NET 4, as that should be present on Windows 10.
The only other options I can see are
- Have an exe that would run before the installer to verify all the required components are installed.
- Don't use the LabVIEW installer, as other installer applications can perform this type of checking.
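For reference, the mechanism the NI article describes is a standard .NET configuration file placed next to the executable (LabVIEW.exe.config for the IDE, or YourApp.exe.config for a built application). Something along these lines, as I recall it; verify the exact contents against the NI article:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0.30319"/>
  </startup>
</configuration>
```

The `supportedRuntime` element tells the loader to host the .NET 4 CLR instead of .NET 2.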
-
4 hours ago, ShaunR said:
You don't need to threshold. An inflection point is a change in sign of d²y/dt² from positive to negative or negative to positive.
With perfect data I would agree. With real-world data, not so much. I've had too many cases where the SNR is 3:1 for what I'm trying to measure, and all that little electrical noise creates inflection points even after filtering. (I had to explain that cheap sensors are not the place to save money, and bubblegum is not a good medium for splicing signal cables.)
5 hours ago, ShaunR said:
Let's see who can come up with a "cool" way of detecting sign changes and extracting the points to fit.
Got some data in mind?
5 minutes ago, MarkCG said:
A very good way to take an N-order derivative of a noisy signal is the Savitzky-Golay filter, which is in LabVIEW. Read up on it; you will see it is much better than the naive method of taking the derivative (the simple forward, backward, and central approximations).
I'll have to take a look at that. I've used a moving average filter with good success, which (from quickly reading Wikipedia) looks to be a special case.
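To make the smooth-then-detect idea concrete, here is a toy Python sketch (not LabVIEW, and not anyone's production code): boxcar-smooth the signal, take the second difference, and flag sign changes, with a threshold parameter for the noise floor that real data forces on you:

```python
import math

def moving_average(xs, width):
    """Simple boxcar smooth; width should be odd so the output stays centered."""
    half = width // 2
    out = []
    for i in range(len(xs)):
        lo, hi = max(0, i - half), min(len(xs), i + half + 1)
        out.append(sum(xs[lo:hi]) / (hi - lo))
    return out

def inflection_indices(ys, smooth_width=5, threshold=0.0):
    """Indices where the (smoothed) second difference changes sign.

    threshold > 0 ignores sign changes smaller than the noise floor,
    which is the part you can't skip with real-world data.
    """
    sm = moving_average(ys, smooth_width)
    d2 = [sm[i - 1] - 2 * sm[i] + sm[i + 1] for i in range(1, len(sm) - 1)]
    hits = []
    for i in range(1, len(d2)):
        if d2[i - 1] * d2[i] < 0 and abs(d2[i] - d2[i - 1]) > threshold:
            hits.append(i + 1)  # shift back to the original sample index
    return hits

# A sine wave has inflection points at its zero crossings (near 50, 100, 150 here).
ys = [math.sin(2 * math.pi * i / 100 + 0.03) for i in range(200)]
print(inflection_indices(ys, smooth_width=5))
```

On noisy data you would raise `threshold` (and widen the smoothing) until the spurious crossings from electrical noise stop showing up.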
-
Have you tried http://www.ni.com/training/classroom/ ?
- 1
-
Somewhere you will have to create a threshold, and you will have to decide what "straight" means for what you are doing. We can draw a line on the ground and have everyone agree that it is a straight line. However, the Earth is a sphere, so the surface is curved; therefore any line drawn on the ground cannot be straight. The line is therefore straight and not straight depending on your frame of reference.
Using the built-in derivative function may give you too much noise. You may have better luck taking the slope between points some number of samples apart. This works better for the sensor data I get.
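A minimal Python sketch of that wide-baseline slope idea (the function name and the `gap` default are mine, purely for illustration):

```python
def wide_slope(ys, dt, gap=10):
    """Slope using points `gap` samples apart instead of adjacent samples.

    Differencing adjacent samples amplifies noise; a wider baseline trades
    a little time resolution for a much less noisy slope estimate.
    """
    return [(ys[i + gap] - ys[i]) / (gap * dt) for i in range(len(ys) - gap)]

# On clean linear data the estimate recovers the true slope exactly.
ys = [0.5 * i for i in range(100)]
print(wide_slope(ys, dt=1.0, gap=10)[0])  # 0.5
```

The payoff shows up on noisy data: noise of amplitude e gives an adjacent-sample slope error of about 2e/dt, but only 2e/(gap*dt) with the wider baseline.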
-
22 minutes ago, ShaunR said:
I don't have opinions. I have mental scars from fixing other people's code.
Suppose that's better than having mental scars from fixing your own code.
- 1
-
39 minutes ago, ShaunR said:
Or the Muddled Verbose Confuser pattern, as I call it.
That must be the worst pattern ever invented. No-one can agree what should be in what bits. No-one ever puts the right bits in the right place after they have argued about it, and everyone ends up with MC, MV or one of the other letter combinations (take your pick). Then every single one of the gazillion files only has one of 3 names (controller, view and model). In theory it's great. In practice it's foot shooting with a bazooka.
Any chance you have a strong opinion on it?
- 1
-
Try searching for model-view-controller (MVC). You are trying to get two loops to communicate, and there are lots of examples of that (and even more ways to implement it).
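The core idea is the same as LabVIEW's queued producer/consumer pattern: one loop pushes messages, the other blocks on the queue and reacts. A rough Python analogue (all the names here are made up for illustration; in LabVIEW this would be a queue or user event wired between the two loops):

```python
import queue
import threading

def ui_loop(msgs: "queue.Queue", n: int):
    # "View/controller" loop: push commands to the worker
    for i in range(n):
        msgs.put(("set_point", i))
    msgs.put(("quit", None))

def worker_loop(msgs: "queue.Queue", log: list):
    # "Model" loop: block on the queue, react to each message
    while True:
        cmd, payload = msgs.get()
        if cmd == "quit":
            break
        log.append(payload)

msgs: "queue.Queue" = queue.Queue()
log: list = []
t = threading.Thread(target=worker_loop, args=(msgs, log))
t.start()
ui_loop(msgs, 5)
t.join()
print(log)  # [0, 1, 2, 3, 4]
```

The queue is what decouples the two loops: neither needs to know the other's timing, which is exactly why the pattern keeps showing up.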
-
I use plugins in packed libraries, so not quite what you describe. What I don't see in your post is any debugging put into the module program to find out if an error is occurring. My expectation is the packed library requires something it cannot find, which causes an error loading the test panel and makes for a very short application run.
-
Action Engines are still a tool that gets dusted off and used. I have several communication plugins (packed libraries) where I needed to put in debug screens, which wound up being simple AEs. I could have used a different method of showing the debug information, but it made sense to use an AE.
- 1
-
I used tags with a ControlLogix processor, which is the difference between a Yugo and a Ferrari.
Without tags you would be left with setting up assemblies in LabVIEW, which the PLC communicates with as remote I/O. With the ControlLogix processor, the PLC looked at the PC as if it were a valve stack, point I/O, a drive, or a similar device.
-
In LabVIEW the Help->NI-Industrial Communication for Ethernet/IP has all of the information. There were three VIs I used: open, get/set and close. The important part was using the same assembly ID as configured in the PLC. Otherwise, I just used the shipping examples to test out communication. If you're not worried about speed, and all your data types are simple, then I recommend using tags.
-
Nothing like the fun of a Frankenstein project.
I've gotten a ControlLogix processor to talk Ethernet/IP with LabVIEW using NI's Ethernet/IP driver (http://sine.ni.com/nips/cds/view/p/lang/en/nid/209676) by setting up a remote I/O device in the PLC and a matching assembly in LabVIEW. I was able to get tags to work, but they were too slow (7 msec per tag, and I had 100+ tags), and writing collections of tags (a tag composed of tags) required expert knowledge that is not well published.
The other way I would recommend is to use something like KepServerEX which handles the communication and provides an OPC interface to LabVIEW.
If all else fails (and if cost is a concern), the PLC has an RS232 port. If the PC doesn't have an RS232 port, a USB-to-RS232 adapter is cheap.
-
If you look at the drive specs, there are different communication options.
-
1 hour ago, Neil Pate said:
The best thing about UDP jokes is that I don't care if you get them or not.
Boo. Hiss. Cheer...
-
6 minutes ago, Phillip Brooks said:
Are the client and server both running LabVIEW?
Do you have tftp client or server available on either of the systems? tftp is UDP based, but takes care of the handshaking.
https://en.wikipedia.org/wiki/Trivial_File_Transfer_Protocol
To add to this, there is an implementation in LabVIEW at:
https://decibel.ni.com/content/docs/DOC-8301
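To see why TFTP bothers with per-block ACKs at all: UDP itself guarantees nothing about delivery, so the application has to acknowledge each datagram and resend on timeout. A minimal self-contained Python sketch over localhost (the 2-byte block-number convention mimics TFTP, but this is an illustration, not the protocol):

```python
import socket

def run_once(payload: bytes) -> bytes:
    """Send one datagram and return the application-level ACK for it."""
    server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server.bind(("127.0.0.1", 0))           # OS picks a free port
    addr = server.getsockname()

    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    client.settimeout(2.0)                  # real code would resend on timeout
    client.sendto(payload, addr)

    data, peer = server.recvfrom(1024)
    server.sendto(b"ACK" + data[:2], peer)  # TFTP-style: ACK carries a block number

    ack, _ = client.recvfrom(1024)
    client.close()
    server.close()
    return ack

print(run_once(b"01hello"))  # b'ACK01'
```

The timeout-and-resend loop (omitted here) is the handshaking that tftp takes care of for you.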
-
My work that uses JSON is in 2012. I just started shifting my core software to 2015 (though there are a couple things in 2016 that have caught my attention). I'm not running afoul of any bugs in the current revision, so I don't see an issue for me.
-
5 hours ago, joptimus said:
That gives a total of 11 loops that would all be running in parallel, some idle, some doing something.
Only 11 loops?
Your computer is already doing a lot of things at once just running the operating system. The real question is how many resources those 11 loops need. If each loop uses a couple percent of the CPU every 100 msec, then you can have a lot of loops running in parallel. If you have two loops that each need 100% CPU for 30 seconds and they have to run at the same time, then you'd have a problem. Loops running in parallel have to "play nice with each other".
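A toy Python illustration of "playing nice" (threads standing in for LabVIEW loops; the numbers are arbitrary): eleven periodic loops coexist without trouble because each one sleeps for most of its period instead of spinning:

```python
import threading
import time

def periodic_loop(counter: list, iterations: int, period_s: float):
    """One 'loop': a little work each cycle, then yield the CPU."""
    for _ in range(iterations):
        counter[0] += 1        # stand-in for a small amount of real work
        time.sleep(period_s)   # sleeping is what makes the loop "play nice"

counters = [[0] for _ in range(11)]
threads = [threading.Thread(target=periodic_loop, args=(c, 10, 0.01))
           for c in counters]
for t in threads:
    t.start()
for t in threads:
    t.join()
print([c[0] for c in counters])  # every loop finished all 10 iterations
```

Replace the sleep with a busy-wait in two of those loops and they start starving each other; that is the 100%-CPU case above.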
-
I expect New Temperature Hysteresis contains a numeric with a label of "New Temperature Limit", at least in the User Events control.
-
XNET can do periodic frames well; I've done this with the PCI card. The database of frames contains the intervals at which to send them. As I recall, loading the database was all that was needed; setting the frame values changed what was being periodically sent.
- 1
-
A google search brought up a lot of hits including:
http://www.ni.com/example/31157/en/
http://digital.ni.com/public.nsf/allkb/7A82E998C9225188862575BE0011C369
The first link includes links to related topics.
Posted in Turn Key DAS (Hardware)
The LMS SCADAS page looks to be hardware only. The connectors on the hardware are nice, as not all of the C Series cards have that (usually terminals or D-sub on the ones I use). There is mention of synchronizing with CAN and video; I've not tried that with cDAQ, so I'm not sure how easy it is.
Up in the LMS Testing Solutions section are some software screenshots, presumably of software that works with the LMS SCADAS hardware. These look like pretty basic screens one would use for those applications. There is mention of a networked central data repository, but no indication of how to access the data. There's very little information quickly available.
I'm seeing a system that comes with a bunch of canned packages. If you live within the canned packages, then this will likely work well. It has a lot of features that make it attractive (immediately checking data via Bluetooth, a central data repository...). I expect the downsides are the difficulty of stepping outside its confines, and the cost. NI prices aren't cheap, so system cost may be comparable. At the same time, I've never seen Siemens give anything away that doesn't directly lead to buying more Siemens stuff.