
Phillip Brooks
Posts posted by Phillip Brooks
-
-
Has anyone used the splitter bars in LV 8? I'm starting a new project, haven't settled on using LV 8 yet, and was reading about splitter bars. They might add enough jazz to my UI to switch.
http://zone.ni.com/reference/en-XX/help/37...pane_container/
Using Splitter Bars to Create Toolbars and Status Bars
You can use a splitter bar to create a toolbar on a front panel that you can configure with any LabVIEW control. To create a toolbar at the top of a VI, place a horizontal splitter on the front panel near the top of the window, and place a set of controls in the upper pane. Right-click the splitter and select Locked and Splitter Sizing
-
This method works 100% for TCP, and also works for UDP when I put a 1ms Wait Timer inside the send loop.
1 ms is the smallest delay you can place in a loop. Your real application won't be run locally the way your test is; I suggest that you connect two computers and pass some traffic over a real network segment. There will be delays in your source data and network connection that can't be simulated the way you are trying to here.
The best you could try is to use a Quotient & Remainder function in your send loop: divide the index counter by some multiple and check whether the remainder is zero. Include a conditional case that waits 1 ms after every, say, 20 messages to throttle the send loop the way a real network and data source would.
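Here's the same throttling idea sketched in Python rather than G, just to show the logic (the address, port, 20-message interval and 1 ms wait are all example values, not from your VI):

import socket
import time

UDP_TARGET = ("192.168.1.50", 61557)  # example destination, not from the original post
THROTTLE_EVERY = 20                   # wait after every 20th message (example value)

def send_messages(messages):
    # Send UDP datagrams, pausing 1 ms after every THROTTLE_EVERY sends
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for i, msg in enumerate(messages):
        sock.sendto(msg, UDP_TARGET)
        # Quotient & Remainder: if the index divides evenly, insert a short wait
        if i > 0 and i % THROTTLE_EVERY == 0:
            time.sleep(0.001)         # 1 ms throttle, like the LabVIEW Wait (ms) function
    sock.close()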
Otherwise, TCP may be the way to go for you!
-
It is possible my code has some logical error. Maybe you could take a look at it below:
The only thing that catches my eye is that you are comparing the Expected Data Arrival Size (Bytes), which I assume to be static, to a value that increases with every UDP receive (Shift Register + Length of String) :question: . I think you want to connect the equality check to the output of the String Length function, not to the sum of the lengths read. Alternatively, you could multiply the iteration value by the Expected Data Arrival Size (Bytes) and compare that to your shift register's current value.
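In text form (Python here, with variable names I've made up), the two comparisons I'm describing look something like this:

def check_sizes(data, total_received, iteration, expected_size):
    # Illustrative only; the names are mine, not from your VI.
    # Option 1: compare the expected size to the length of THIS read only
    this_read_ok = (len(data) == expected_size)
    # Option 2: compare the running total (shift register) to iteration count * expected size
    running_total_ok = (total_received == (iteration + 1) * expected_size)
    return this_read_ok, running_total_ok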
-
I think you addressed my new problem (I am losing UDP packets bigger than 8 bytes, or so); I think the loop is going too fast. Is that what you meant by "Place the UDP Read function in a tight while loop and pass the output to a queue."? I don't really understand what you mean by that paragraph.
Remove any Wait or Wait Until Next ms Multiple calls from your while loop. The UDP demo has a fixed 10 ms wait between reads. Do not specify a max size for your messages; leave this connector unwired. The OS determines whether a UDP datagram is valid and delivers it to LabVIEW regardless of its length. If you're sourcing 8-byte UDP messages from another VI, then the receiving VI will receive 8-byte messages.
For the timeout, use whatever is reasonable for you; start with 100 ms. If the timeout occurs, the function will exit and the error cluster will return a code of 56. This simply means that no data was received. Check the code, and ignore it if error = 56. If a UDP datagram arriving on the listening port (regardless of length) passes its checksum, the OS will pass it to LabVIEW, and LabVIEW will immediately exit the Read function. I can't understand why you would be running "too fast" or "dropping messages", considering that the OS buffers UDP messages.
Make sure that you are not closing and re-opening the UDP session handle after each UDP read. That would likely result in your CPU utilization reaching 100% and datagrams being missed.
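For what it's worth, here is the same receive logic sketched in Python instead of G (error 56 becomes a socket timeout exception; the port number and 100 ms timeout are just example values):

import socket

RECV_PORT = 61557            # example port, not from the original post

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", RECV_PORT))
sock.settimeout(0.1)         # 100 ms, like the UDP Read timeout input

try:
    while True:
        try:
            data, addr = sock.recvfrom(65535)   # ask for a full datagram, no max-size games
        except socket.timeout:
            continue         # equivalent of LabVIEW error 56: nothing arrived, just loop again
        print(len(data), addr)                  # placeholder for whatever you do with the data
finally:
    sock.close()             # open once, close once -- never inside the read loop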
I've attached an example that shows the technique I've used successfully. (LV 6.1)
-
My concern is that the results show that TCP is about 5 times faster than UDP transfers, which is only about half the speed of DLL transfers (i.e. DLL = 0.1 ms per transfer, TCP = 0.2 ms per transfer, UDP = 1.0 ms per transfer)!
Is this reasonable? Any ideas why these results contradict general expectations? Any ideas would be appreciated.
- Philip
The performance difference may be related to how you are sending the packets. If you look at the UDP examples, the UDP Sender has a Broadcast/Remote Host Only boolean. Setting the Address to 0xFFFFFFFF (Boolean true; Broadcast) will force the packets to traverse the hardware onto the wire. Specifying a hostname of localhost will resolve to 127.0.0.1 and the OS will loop the data back before the physical interface.
TCP always includes a source and destination address, so the TCP packets are likely looped back before the physical interface (in the OS). You should really try to do these tests with two computers and distinct IP addresses.
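If you want to see the difference, here is a hedged Python sketch of the two send cases (the port number and payload are arbitrary examples):

import socket

PORT = 61557                 # arbitrary example port
payload = b"test"

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

# Case 1: broadcast (0xFFFFFFFF) -- the datagram has to go out the physical interface
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
sock.sendto(payload, ("255.255.255.255", PORT))

# Case 2: localhost -- resolves to 127.0.0.1 and is looped back inside the OS,
# so it never touches the Ethernet hardware
sock.sendto(payload, ("127.0.0.1", PORT))

sock.close()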
Look at the TX/RX LEDs on your Ethernet controller, and listen for people complaining that the network is slow when you're performing UDP tests. I managed to knock some people out of their database while I was testing my UDP implementation.
If you're setting the UDP packet max size to 64, this could also be a problem. Leave this input unwired. From the online help: "max size is the maximum number of bytes to read. The default is 548. (Windows) If you wire a value other than 548 to this input, Windows might return an error because the function cannot read fewer bytes than are in a packet."
UDP datagrams have a header that indicates the data size. If the data received by the OS does not match this, the packet is invalid and will never be passed to LabVIEW.
Place the UDP Read function in a tight while loop and pass the output to a queue. As soon as the data is stuffed into the queue, the while loop will try to retrieve the next message. The OS can buffer received UDP messages. As an example, try setting the UDP Sender example VI to 1000 messages and change the diagram's wait to 1 ms. Change the UDP Receiver example VI to 10 ms on the block diagram. Run the receiver, then the sender.
Note! Don't open and close the UDP socket between reads or you will thrash the OS! Open the socket, place the handle ID in a shift register, and then close the handle outside the while loop. To avoid memory hogging, set an upper limit for the number of elements in the queue based on your expected receive rate and the interval at which you intend to process the data.
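The producer/consumer structure is easier to show as text than to describe; here is a rough Python equivalent of the same pattern (the port and queue limit are example values):

import queue
import socket
import threading

RECV_PORT = 61557            # example port
MAX_QUEUED = 1000            # cap the queue so memory can't grow without bound

msg_queue = queue.Queue(maxsize=MAX_QUEUED)

def producer():
    # Tight receive loop: read a datagram, stuff it in the queue, read again
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", RECV_PORT))
    while True:
        data, _ = sock.recvfrom(65535)
        msg_queue.put(data)  # hand off immediately; no processing in this loop

def consumer():
    # Process messages at its own pace; the queue absorbs bursts
    while True:
        data = msg_queue.get()
        print(len(data))     # placeholder for real processing

threading.Thread(target=producer, daemon=True).start()
consumer()                   # run the consumer in the main thread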
I've successfully read UDP messages twice your size at a 400 µs rate. The data included a U32 counter to monitor for dropped messages. On a dedicated segment, I never experienced a missed UDP message.
-
From INFO-LabVIEW
Could you insert Excel sheets using OLE into the Word document? It might be easier to use named cells to retrieve the values, and if the limits are based on calculations, then the updating of the Word document could be "automatic".
Using Excel could separate the data from the presentation and still meet your requirements for Word.
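A rough sketch of the named-cell lookup from the ActiveX/COM side, written in Python with pywin32 rather than LabVIEW just to show the calls involved (the workbook path and the defined name "UpperLimit" are invented for the example):

import win32com.client   # pywin32; drives the same Excel ActiveX/COM objects LabVIEW would

excel = win32com.client.Dispatch("Excel.Application")
excel.Visible = False
wb = excel.Workbooks.Open(r"C:\specs\limits.xls")    # invented path for the example

# Read a named cell; "UpperLimit" is a hypothetical defined name in the workbook
upper_limit = wb.Names("UpperLimit").RefersToRange.Value
print(upper_limit)

wb.Close(SaveChanges=False)
excel.Quit()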
For simple example Excel and Word files, see http://forums.lavausergroup.org/index.php?showtopic=2533
NOTE: Someone posted an Info-LabVIEW reference to an ActiveX example of Read from Excel. I've edited this VI and combined it with the example Excel and Word documents. The attachment has been updated...
-----Original Message-----
From: Info-LabVIEW@labview.nhmfl.gov [mailto:Info-LabVIEW@labview.nhmfl.gov] On Behalf Of Seifert, George
Sent: Wednesday, January 18, 2006 8:47 AM
To: Info-LabVIEW@labview.nhmfl.gov
Subject: RE: Linking to specs in a Word doc
The reason we're using a Word doc is because that is how all of our top level docs are stored. The docs have to be approved and archived. The procedure is pretty well cast in stone. So I thought rather than copy by hand or copy and paste from the Word doc I would create a routine to either pull the specs from the doc at runtime or pull the specs and stuff them in an ini file. Most likely I will do the latter whenever updates are made to the Word doc. We have 20 or so test stations so I need to make sure they all have the same spec limits to work with. Updating them by hand isn't a good option.
What I think I'll do is require the first row in the table be an easily identifiable marker. From there it's easy to grab the contents of each table and if the marker matches what I'm looking for then I can use the contents of the table.
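That marker-matching idea can be sketched in a few lines of text code; for example, in Python using the python-docx package rather than the Word ActiveX interface (the marker string and file path are invented for the example):

from docx import Document   # python-docx package

MARKER = "SPEC_LIMITS_V1"   # hypothetical marker placed in the first row of the spec table

doc = Document(r"C:\specs\test_spec.docx")     # invented path for the example
for table in doc.tables:
    first_cell = table.rows[0].cells[0].text.strip()
    if first_cell == MARKER:
        # Found the table we care about; pull the limits out of the remaining rows
        for row in table.rows[1:]:
            print([cell.text.strip() for cell in row.cells])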
-
We were about to integrate I/O over the parallel port, but it seems that the latest PCs don't have onboard parallel ports anymore.
Since we want it to be as easy as possible for the end user to install (also for laptops), we were thinking of a 'USB parallel printer adapter' and bought one from Belkin.
It works fine for connecting a printer, but I wasn't able to detect it from within LV.
Does anyone have a VI that is capable of reading and setting I/O lines on a USB printer adapter?
Greetz,
Bart
Why not consider one of the lower-cost USB I/O devices on the market? I know the USB-to-parallel-port adapters are only ~$30, but there are some nice USB DAQ devices with 8 or more lines of digital I/O for $100 or less that include A/D, D/A and LabVIEW drivers.
I'd look at the LabJack, the EMANT300 or the Hytek iUSBDAQ products.
-
Puzzled though why the others in the project cannot run the 7.1.1 patch? Wouldn't that be more stable/robust than using 7.1?
Oskar
Mostly an administrative issue. I was the one operating 'outside of the box' and I guess I'll have to revert.
Good thing I had a backup of user.lib and instr.lib before the mass compile! :thumbup:
-
I recently installed the LabVIEW Version 7.1.1 for Windows 2000/NT/XP -- Patch and have been experimenting with dotNET 2.0.
I first installed the 7.1.1 update, mass-compiled, then installed the 'f2' patch. These installs replaced the original files in my 7.1 installation directory. Everything works fine.
Now....
I'm about to start work on a project with others using LV 7.1 from the original CDs installed.
Does the 7.1.1 version change the VI versions? Would my VIs created in 7.1.1 be compatible with 7.1, or will I need to revert? It's not a problem to revert; I'd just like to save myself the trouble...
-
The online CLAD example test is a very good approximation of the actual test.
I've started looking into the CLD test, and noticed a restriction related to your question in the document "Certified LabVIEW Developer Requirements and Conditions".
Confidentiality of Examination Materials: All certification examination materials are
National Instruments Confidential. You agree to not disclose either the content or intent
of examination items regardless of your certification status.
NI does offer a CLD test prep course online; I'm trying to decide when to take it. I was told at the NI Symposium last fall that anyone who completed the CLAD test before the end of the year would qualify to take this prep course free of charge.
Good luck with the test! :thumbup:
-
Types of people
Those who overuse control references would be control nuts.
Those who use excessive global and local references would be schizophrenic.
Those who place clusters in clusters in clusters would be organizational freaks.
Those who place wires with no more than two vertical and two horizontal legs would be anal retentive.
Those who use LLBs would be just plain crazy.
Did I leave anything out :question:
-
Intelligent Globals = Action Engine?
When I started working at my current job, all the LV'ers kept talking about Action Engines. I Googled, found little or nothing.
The first result while Googling LabVIEW "Action Engine" today ( 10 Jan 06 ) points to my own question to the LAVA Forum about Action Engines! :laugh: The only other place I've seen the term mentioned is in some NI Forum postings by "Ben". I agree that a functional global by definition should simply store and return data. Maybe in the spirit of the discussion and considering LV 8, they should be called UNshared Variables? :laugh:
Does anyone else like the term "Action Engine"?
-
WHAT ARE SINGLE AND DUAL LINKS ?
Just reading this over lunch;
The video card needs 850 watts of power to run!!!!! Guess that leaves laptop computing out of the question...
-
Microsoft .NET is the Microsoft strategy for connecting systems, information, and devices through Web services so people can collaborate and communicate more effectively. .NET technology is integrated throughout Microsoft products, providing the capability to quickly build, deploy, manage, and use connected, security-enhanced solutions through the use of Web services.
LabVIEW already has various methods for connecting systems and sharing information; VI Server, Shared Variables, OPC, etc...
I've been looking at .NET in my spare time. There are numerous collections related to the OS that can be called in a much simpler way than in the past. An example would be calculating an MD5 hash for a particular file. MD5 is often used to validate a file for execution or download into a piece of hardware. I recently read about .NET support for MD5, had a project that could benefit from this sort of check, and had also just downloaded LabVIEW 8 from the NI web site. I put together a few LV .NET nodes and could calculate the MD5 for the 454 MB LV8 file in under a minute.
Could it be done natively from within LabVIEW? Of course! Selecting how and where to perform a calculation is a design choice. If I used LabVIEW for Linux, I would likely use an 'md5sum' call to the OS rather than writing it myself in LabVIEW. If I implemented the MD5 algorithm myself and did it incorrectly, I might have a problem regardless of which platform I used.
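For comparison, the same chunked file hash is only a few lines in text code; here's a Python sketch (the file name in the comment is hypothetical):

import hashlib

def file_md5(path, chunk_size=1024 * 1024):
    # Return the hex MD5 digest of a file, reading it in 1 MB chunks
    # so a 450+ MB installer never has to fit in memory at once
    md5 = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            md5.update(chunk)
    return md5.hexdigest()

# Example (hypothetical file name):
# print(file_md5(r"C:\downloads\labview8_installer.exe"))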
In general, .NET is not of great value from the traditional Acquire, Analyze, Present standpoint of LabVIEW. .NET offers easier access to the OS, rather than having to Google "win32.dll" and "wininet.dll" and create Call Library Function Nodes that will fail and crash your LabVIEW session.
My .NET MD5 programming example
I've also started reading this blog; Lycangeek - And now a word from Brian Tyler. Brian heads up the .NET development for LabVIEW at NI.
-
With a desktop this large, I could create larger block diagrams and obfuscate my code to the point where only I could edit it.
Requires a dual-link DVI-D graphics card that supports WQXGA (2560x1600) resolution.
Add the needed graphics card and you're over the price of a LabVIEW Full Dev. license! I guess I'll have to wait until next year, maybe Santa can negotiate a volume discount...
-
Using GPIB instruments?
If you're working with GPIB-based equipment, you can use the event registers if your instrument implements them correctly.
-
Has anyone ever used the LEGO products with LabVIEW? The new NXT controller makes this look like something I'd want to try out.
New LEGO MINDSTORMS Software to Be Powered by National Instruments LabVIEW
Lego Mindstorm NXT robots are smarter and stronger than ever!
-
1 GPIB device with 2 PCs
I've used the ICS Electronics 4842 to share an 8 1/2-digit DMM between two computers. I bought mine on eBay for ~$100; I think they cost quite a bit more than that retail, though...
http://www.icselect.com/gpib_extdr_ds.html#sw
I saw one today on eBay.
-
I agree with your observations, and wasn't aware of the change to allow NI-VISA distribution with a compiled application. I always use some sort of NI hardware (GPIB, etc.), so licensing has never been an issue for me. I remembered complaints by LV developers who use serial interfaces, and after reading about the SerialPort class in .NET 2.0, put 2 and 2 together, got 5, and thought to ask the question.
I did revisit the M$ site and checked the OS support for .NET 2.0.
Supported Operating Systems:
Windows 2000 Service Pack 3
Windows 98
Windows 98 Second Edition
Windows ME
Windows Server 2003
Windows XP Service Pack 2
That doesn't mean it will PERFORM, but it SHOULD at least work. The later versions of NI-VISA only support NT, 2000 and XP. The .NET download is ~22 MB, and required available disk space is listed as 280 MB. The NI-MAX download is ~15 MB, the MAX folder on my PC is ~201 MB.
I noticed that NI makes reference to an open source project called Mono that intends to offer .NET support for Linux and Mac; but I don't understand how it would be called from LabVIEW.
-
I've been playing with .NET in my spare time. The list of new features in the .NET Framework 2.0 includes a SerialPort class.
I've read about some people being disappointed with having to use VISA in order to access serial devices; could this new .NET class be used to create a replacement for the old Serial Read/Write functions?
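For reference, the class itself is only a handful of calls; here's a rough sketch exercising it from Python via the pythonnet package, just to show the members involved (the COM port, baud rate, *IDN? query and pythonnet itself are assumptions, not something from the original post):

import clr                      # pythonnet package; loads .NET assemblies
clr.AddReference("System")      # System.IO.Ports lives in System.dll in .NET 2.0
from System.IO.Ports import SerialPort

sp = SerialPort("COM1", 9600)   # example port name and baud rate
sp.Open()
sp.WriteLine("*IDN?")           # example instrument query
print(sp.ReadLine())
sp.Close()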
-
Your problem seems to be more of a configuration management one. If your LabVIEW stations are "instruments", then you want to treat them as such. I've used many traditional instruments where firmware and even OS updates (Agilent scopes) were regularly released, but I didn't apply them because of dependencies and configuration management.
Don't treat the instrument station as simply another PC on the network. Don't take crap from the IS department. They will likely say "It's a PC, it's on the network, and the company policy is yada, yada, yada... We own it, and it MUST have Netscape, TimeKeeper+ 2000 and the AutoChron Super Backup service running..."
Don't install MS-Office, Norton AV, Firefox, or AOL IM. Don't use the instrument station for non-instrument functions like timeclocking or a print server. Disable Windows sharing, Windows Update, telnet, ftp and all the other unneeded services. Store your data directly on a server and access the results from there. If your IS department can't help you do this, point it out to the IT manager!
To reduce the possibility of viruses, your instrument stations could be placed on a private LAN. The account(s) used by operators would not have admin privileges. The stations could be further physically secured by removing floppy drives, disabling USB (which disables removable media) and disconnecting the optical drive from the computer's bus.
You might say, "Hey, now it's hard for me to update or work on the instrument station. How do I load, debug, etc...?" You would have the same problems with Linux, LVRT, etc. The difference is that you won't have to support new OSes, manage LV licenses for different platforms or learn how to relink a kernel!
-
Copied from Info-LabVIEW posting, 29 Sept 2005:
The deluge of thank-you notes and kind expressions that we have received since announcing the LabVIEW Technical Resource's last issue is greatly appreciated. The success of LTR during our 12 years of publication has been due to the great support from the LabVIEW community -- technical reviewers, authors, and customers. We want to express our appreciation to the LabVIEW community.
The difficult decision to end production of LTR was not for profitability reasons, as has been speculated, but rather for rate of return reasons. Historically, applying these same resources and assets to other business/product lines that we own has resulted in higher rates of return.
There has also been some speculation as to whether this publication should convert from print to electronic. LTR successfully converted some of its international product lines to fully electronic versions several years ago. However, the true cost of production of LTR product lines is not in the printing costs, but rather in the editorial, testing, and quality control effort to provide quality, applicable material that supports all platforms and LabVIEW versions. Those who have submitted articles know of the editing, review, and software testing cycles that go on to provide technically accurate and applicable articles.
An idea was proposed on the Info-labview site that LTR should be included in the SSP package. We have discussed this type of value-added addition to the SSP program and other ideas with National Instruments over the years. However, other than successful pilot programs with a few international SSP programs, the idea was never formally adopted.
As far as the future of the LTR Publishing business, two larger publication houses have expressed interest in buying our assets. We chose to stop publication efforts as we work toward this transition.
There is no guarantee at this time that the content will stay LabVIEW based. It is possible our order fulfillment and production processes will be purchased for use with other content.
It has been a great opportunity to work with the LabVIEW community - a passionate group generous with their time. We're proud to have supported the LabVIEW community and run a successful business for 12 years. Thank you again for the outstanding community support.
Karen Pape
Managing Editor
LabVIEW Technical Resource
860 Avenue F Suite 100
Plano, TX 75074
Phone: 214-706-0587 x104
Fax: (214) 706-0506
-
Under the LabVIEW Help menu, select "Search the LabVIEW Bookshelf"
Select LabVIEW Development Guidelines, or view it online at: http://www.ni.com/pdf/manuals/321393d.pdf
Other style info:
From Bloomy Controls: Five Techniques for Better LabVIEW Code
-
You're welcome...
There was a question about GUIDs on Info-LabVIEW. See attached...
I tried to create a quick .NET version;
Place a .NET Refnum on your front panel.
Associate the Refnum with the class System.Guid. You will find this by browsing and selecting the file mscorlib.dll. On my XP machine, the path was C:\WINDOWS\Microsoft.NET\Framework\v1.1.4322\mscorlib.dll.
Invoke the method NewGuid, then pass the NewGuid Refnum and invoke ToString.
As always, clean up your references by closing them.
From what I've read, this implementation may use a hardware specific value (network MAC address?) as a seed for the GUID.
http://lists.ximian.com/pipermail/mono-lis...ary/003109.html
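For comparison, Python's uuid module exposes the same two flavors (this is Python, not the .NET System.Guid class, just to illustrate the difference):

import uuid

# Version 1 GUID: built from the timestamp plus the machine's MAC address,
# the "hardware specific value" behavior mentioned above
print(uuid.uuid1())

# Version 4 GUID: purely (pseudo)random, no hardware information involved
print(uuid.uuid4())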
-
Don't know about air pressure, but I worked in an accredited calibration lab and we used Veriteq. These can be calibrated with FDA-compliant traceability.
For various types of loggers, start at MicroDAQ to see what's available on the market. I've never purchased anything from them. Their list of offerings should help you find what you need.
The other option is to "roll your own" using FieldPoint and various transducers. This would likely be more expensive, but you could select the features that you need, set up a process control loop and manage heating/ventilation/humidity devices.