Jeff Plotzke

Members · 144 posts

Everything posted by Jeff Plotzke

  1. QUOTE(alnaimi @ Feb 26 2007, 05:17 AM) What exactly isn't working? Are you not getting data into your VI? Is your data not what you're expecting? Is your graph not showing your data?
  2. QUOTE(Jim Kring @ Feb 25 2007, 10:20 PM) I normally use the NI Example Finder when I want to use a DAQ example for something a little more powerful than a MAX Test Panel when testing out some hardware... But I like the idea of OpenG examples appearing there!
  3. QUOTE(alnaimi @ Feb 25 2007, 03:38 PM) Attached is the VI saved for LV 8.0.
  4. QUOTE(alnaimi @ Feb 25 2007, 02:55 PM) My main concern is that you're generating x samples of a sine wave (I'm assuming x > 1). The 'Generate Signal' VI returns much more than one sample -- It's actually an array of samples -- As you'll see on the graph you have wired up to the output of that VI. So, let's say you're generating a complete sine wave during each iteration. You then output that using your Analog Output. Your analog output will output the entire sine wave as it was generated. However, you then read your Analog Input at some point in time (and remember data flow -- You have no wires from your analog output to your analog input, so it's very possible that you're reading from your analog input before you output any signal!) Since you don't know what your analog output was outputting when you took your analog input sample, you really don't know what voltage (Vout) to associate that point with. I've attached a quick VI that shows you a way to generate an entire sine wave, and then output, point by point, each value of that sine wave, while reading your Vd each time. This way, you can accurately synchronize your input and output data and create your graph. Let me know if you're still having problems.
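The attached VI is LabVIEW, so here is only a textual Python sketch of the same point-by-point pattern; the `read_vd` callback is a stand-in for the real analog input read, and the voltage-divider lambda is purely a test fixture:

```python
import math

def measure_iv_curve(read_vd, n_points=100, amplitude=1.0):
    """Generate one full sine cycle, then output it point by point,
    reading the diode voltage Vd right after each output so every
    (Vout, Vd) pair is synchronized by construction."""
    vout = [amplitude * math.sin(2 * math.pi * i / n_points)
            for i in range(n_points)]
    pairs = []
    for v in vout:
        # In the real VI this step is: AO Write one sample, then AI Read one sample.
        vd = read_vd(v)
        pairs.append((v, vd))
    return pairs

# Stand-in "device" (a simple voltage divider) just to exercise the loop.
pairs = measure_iv_curve(read_vd=lambda v: 0.5 * v)
```

Because each Vd read happens immediately after its Vout write, the resulting pairs can be graphed directly with no guesswork about which output sample each input belongs to.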
  5. QUOTE(alnaimi @ Feb 25 2007, 09:39 AM) I see three things that you'll want to look at: 1. How many samples is the 'Simulate Signal' Express VI set to generate? If you're generating multiple samples, you're going to run into a problem, since you're only reading one sample from your AI and you'll have no way to synchronize your data together. I'd recommend outputting a single value for Vout, then reading the AI for Vin, on each iteration of your loop. This way you can synchronize your data points. 2. Is your 'Build XY Graph' Express VI set to reset every time it's called? (I believe that's the default.) I would wire a false constant to the 'Reset' terminal to ensure that it doesn't clear your graph every time. 3. Your math looks wrong in the VI. I agree that Id = (Vout - Vd) / R -- However, in your VI it looks like you calculate Id = (Vd - Vout) / R, which would just end up inverting your results. You look like you're on the right track -- Hope this helps.
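The sign issue in point 3 is easy to check numerically. With assumed values for R, Vout and Vd (not taken from the original VI), swapping the operands only negates Id:

```python
R = 1000.0           # ohms -- assumed value, purely for illustration
vout, vd = 5.0, 0.7  # assumed operating point

id_correct = (vout - vd) / R   # Id = (Vout - Vd) / R, as agreed above
id_swapped = (vd - vout) / R   # what the posted VI appears to compute

# The swap only flips the sign; the magnitude of the curve is unchanged.
```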
  6. QUOTE(alnaimi @ Feb 25 2007, 09:39 AM) I think you forgot to attach your VI.
  7. QUOTE(Guillaume Lessard @ Feb 20 2007, 03:34 PM) Exactly... and in a perfect world, you wouldn't have any UIs down on your RT system, to maximize efficiency and reduce jitter. What eventually solved this problem was replacing the Boolean button with a physical digital input -- This now allows them to continuously hold down a physical button while I poll the input and increment accordingly.
  8. QUOTE(tcplomp @ Feb 20 2007, 11:41 AM) I expected that, until seeing that calling 'DevNames' before 'GlobalChans' gives me all correct data, without adding any delay. Perhaps it's just coincidence?
  9. Just sending a quick update: I've sent my example VI to NI R&D and they're reviewing it. It's interesting to see that others have seen similar issues on other platforms... I'll post more (and hopefully a CAR#) once I hear back.
  10. QUOTE(ned @ Feb 19 2007, 05:35 PM) While this VI looks like it would work, I don't understand why you're converting to a 1-D array. If it were me, I'd store all my internal data in a large 2-D array (channels x samples). Then, when you grab the data from the AI Read, do a Replace Array Subset with the 2-D data, putting it in the correct places. (This way, you also get rid of the extra buffer allocations for 'Reshape 1D Array'.) However, keep in mind that this will be a lot of data in memory: 300K points (5 minutes at 1 kHz) for 16 AI channels in DBL precision = 38.4 MB of data. If anything happens to the computer while acquiring the data, the entire set is lost. You may want to consider offloading to disk once in a while and maintaining a circular buffer on disk.
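A minimal numeric sketch of that suggestion, with numpy standing in for the LabVIEW 2-D array: the random chunk simulates a DAQmx Read, and the periodic flush is only recorded (a real app would write the slice to disk at that point):

```python
import numpy as np

N_CHANNELS = 16
N_SAMPLES = 300_000   # 5 minutes at 1 kHz
CHUNK = 1_000         # samples per simulated AI Read

# Preallocate once: 16 x 300000 doubles = 38.4 MB, matching the estimate above.
data = np.zeros((N_CHANNELS, N_SAMPLES))

flush_points = []
write_pos = 0
while write_pos < N_SAMPLES:
    chunk = np.random.rand(N_CHANNELS, CHUNK)       # stand-in for DAQmx Read
    # In-place slice assignment is the textual analogue of Replace Array Subset:
    # storing the new samples allocates no extra buffer.
    data[:, write_pos:write_pos + CHUNK] = chunk
    write_pos += CHUNK
    if write_pos % 60_000 == 0:
        # A real app would dump data[:, :write_pos] to disk here (e.g. np.save).
        flush_points.append(write_pos)
```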
  11. Here's a nasty bug I found: On LVRT, if you create a VI which has any type of Waveform Chart/Graph or Cluster on it, and have "Show Front Panel When Called" set in the VI Properties, connecting via Remote Front Panel to this VI after it has already been running will result in the Charts/Graphs/Clusters not updating at all while you view it. Note that this only occurs with a deployed RT EXE (because it only manifests when the VI with the indicators is called *before* any UI is displayed). I've attached a ZIP with a LV Project and VIs that demonstrate the problem. The workaround: Uncheck the "Show Front Panel When Called" checkbox. Note that you must then alter your build specification to not remove the front panel of the VI, so that you can still connect via Remote Front Panel. This has been confirmed by NI R&D on both FieldPoint and RT PXI. CAR# 46IG9S2A
  12. QUOTE(crelf @ Feb 18 2007, 09:59 PM) OK, I've done more testing, and found the strangest results. Here's a summary:
      - The 'DevNames' property is the only property I can find that gives different results based on a time delay after booting the RT controller.
      - The 'DevNames' property only exhibits this behavior when it is in the same property node as 'GlobalChans' and follows 'GlobalChans'. (Not necessarily immediately following -- it returns incorrect results even if another property sits between 'GlobalChans' and 'DevNames'.)
      - If 'DevNames' is alone in a property node, it returns the correct cards without any delay.
      - If 'DevNames' is preceded by properties other than 'GlobalChans', it returns the correct cards without any delay.
      - If 'DevNames' is followed by 'GlobalChans', it returns the correct cards without any delay.
      In the case where 'GlobalChans' precedes 'DevNames', a wait time before calling the property node fixes the issue. The pause itself is repeatable, at least on the one controller I'm using. I've tested multiple times with wait times from 0 to 20 seconds; in every instance where the wait was below 20 seconds, not all of the cards were returned. I also notice that when not all cards are returned by the property node, the cards it does return are the ones closest to the controller. I don't know if this is simply coincidence... I'm attaching the VI I've used for testing. I've documented in my disable structure which frames give correct results and which give incorrect results. I'll contact NI tomorrow regarding this bug and post again once I get more information. I'll add that if anyone tests my VI with their controller -- this behavior is only seen once you build, deploy, set as startup, reboot, then connect via Remote Front Panel. If you simply execute this code on RT without deploying and rebooting, it will work fine.
  13. Here's a discovery I made today: If you're using DAQmx Property Nodes in an RT application, they may not work correctly until about 5 seconds after the RT system has been running. In my case, I'm using a PXI RT controller and my application calls the "DevNames" DAQmx System property to return all the devices on the system when it begins. However, after a long search, I've discovered that this node gave back only 1 of my 7 cards on the system when it executed in a deployed startup application. If I put a 5 second wait before this property node, it works fine, returning all the cards. I don't know if this is a "bug" or just that LVRT needs to discover all the cards first, but I wanted to post this in case anyone experiences this problem.
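As a workaround pattern, polling until enumeration succeeds is more robust than a fixed 5-second wait. Here's a generic Python sketch of that idea (the fake enumerator simulates cards appearing late, as DAQmx on LVRT appears to do; it is not a real DAQmx call):

```python
import time

def wait_for_devices(query, expected=1, timeout=10.0, interval=0.5):
    """Poll a device-enumeration function until it reports at least
    `expected` devices, or until `timeout` seconds elapse -- rather
    than sleeping a fixed amount and hoping it was long enough."""
    deadline = time.monotonic() + timeout
    while True:
        devices = query()
        if len(devices) >= expected or time.monotonic() >= deadline:
            return devices
        time.sleep(interval)

# Simulated enumerator that only "sees" all 7 cards after a few polls.
calls = {"n": 0}
def fake_query():
    calls["n"] += 1
    return ["Dev1"] if calls["n"] < 3 else [f"Dev{i}" for i in range(1, 8)]

devs = wait_for_devices(fake_query, expected=7, interval=0.01)
```

This returns as soon as all expected cards show up, and still returns whatever it found once the timeout expires, so a missing card doesn't hang the startup sequence.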
  14. When I'm building my LVRT (8.20) project, I'm prompted 4 times for the block diagram passwords of four NI system VIs regarding shared variables. My project does use shared variables, so I'm not doubting that LV is putting in these system VIs to read/write, but I find it odd that it prompts me for their passwords. If I just click 'OK' 4 times (and ignore the "Incorrect Password!" responses), it builds without a problem. Has anyone else experienced this? I'm guessing that NI forgot to mark these as system VIs, which is why it asks.
  15. QUOTE(Guillaume Lessard @ Feb 13 2007, 04:23 PM) Thanks, Guillaume. Running the UI on the Windows machine I think is the only solution to this problem. Monitoring for the mouse isn't possible with RT (and is probably why the two mechanical actions aren't supported).
  16. It looks like your VI is not in edit mode. Go to Operate > Change to Edit Mode (or press Ctrl+M) to change it. You may have the option enabled that automatically opens read-only VIs in run mode (which is probably a good idea, since it reminds you that you can't save the VI). To check whether you have this enabled: Tools > Options > Environment > Treat read-only VIs as locked
  17. QUOTE(alameer @ Feb 18 2007, 02:03 PM) Alameer, what problem are you facing? Are you getting an error message? What error message? We need to know much more information before we can help you. Did you make your VI read-only?
  18. QUOTE(mross @ Feb 17 2007, 01:22 PM) Do you have Javascript enabled? When you click on the RSS button, a little submenu should appear showing "-All of LAVA-", but it's triggered using Javascript. In Firefox, Tools > Options > Content > Enable Javascript Hopefully that fixes it...
  19. QUOTE(mross @ Feb 17 2007, 09:30 AM) Ah ha -- The RSS icon has moved since the tutorial was written. It's in the extreme lower left of the page -- Looks like an orange square with white waves. I attached a picture. I'll PM Michael to update the RSS tutorial post.
  20. QUOTE(James P. Martin @ Feb 17 2007, 06:25 AM) I would suggest taking a look at the M-Series DAQ Manual here: http://www.ni.com/pdf/manuals/371022g.pdf Page 7-19 (Page 117 in the PDF) shows the counter outputs applications. Page A-6 (Page 172 in the PDF) shows the pinouts for your PCI-6221 DAQ card. So, if you're using counter 0, the pulse train output will be on pin "CTR 0 OUT", pin 2 of the DAQ card. Use any of the "DGND" pins for a ground connection.
  21. QUOTE(mross @ Feb 17 2007, 09:00 AM) This tutorial should give you a good start: http://forums.lavag.org/RSS-Feeds-for-the-...orums-t767.html
  22. QUOTE(bjarket @ Feb 16 2007, 08:59 AM) You'll want to put both counter virtual channels into one DAQmx Task. If you're doing this all in software, create both of the virtual channels, then use the "DAQmx Create Task.vi" to create a task. (You'll need to also use the "DAQmx Flatten Channel String.vi" to take an array of virtual channels to the string Create Task wants). Then, once you start the task, it will start both of the counters at the same time.
  23. QUOTE(Thang Nguyen @ Feb 15 2007, 07:31 PM) As Rolf said, there's not much you can do for serial other than to see if you get a response back. In the cases I wanted to detect a serial link disconnect, I'd find some sort of "status" command to send to the remote device (some command that didn't do anything other than return data) and send that every second or two. If I didn't hear back a response after a few tries, I considered the device disconnected. So, I'd probably try and find some command that you can send to the devices so you can detect if the device is still connected.
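The status-polling idea above, sketched in Python. The `flaky_send` transport is a made-up stand-in for a real serial write/read with a timeout (e.g. via pyserial); only the retry logic is the point here:

```python
def is_connected(send_status, retries=3):
    """Send a benign status command up to `retries` times and declare the
    device disconnected only if every attempt goes unanswered."""
    for _ in range(retries):
        if send_status():      # True means some response came back in time
            return True
    return False

# Simulated transport: drops the first request, answers the second.
attempts = {"n": 0}
def flaky_send():
    attempts["n"] += 1
    return attempts["n"] >= 2

alive = is_connected(flaky_send)        # recovers after one dropped reply
dead = is_connected(lambda: False)      # never answers -> disconnected
```

Requiring several consecutive misses before declaring a disconnect keeps one lost packet or slow reply from being misread as an unplugged cable.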
  24. I just found an answer. You can do this one of two ways: You can modify the IP address of the host in the project .aliases file. Shared variables use this file to resolve the host name to an IP address. However, I'm guessing that this file is only read once when LabVIEW starts up, so you'd have to modify it manually before you start the application. http://digital.ni.com/public.nsf/websearch...84?OpenDocument If you have LabVIEW DSC, you can programmatically change the host for each shared variable through property nodes. (It looks like you'd have to do this for every shared variable you're using...) Note that it mentions that if this is the only DSC feature you're using (you're not using the alarms, logging, etc.), you don't need the DSC run-time engine to deploy an EXE with this feature -- although you have to select "Enable Enhanced DSC Run-Time Support" at build time. http://digital.ni.com/public.nsf/websearch...44?OpenDocument
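For the .aliases route, a small sketch of rewriting the host entry before launch. Note that the `alias = IP` line layout and the sample contents are assumptions here, not taken from the linked document; inspect your own project's .aliases file for the exact format before using anything like this:

```python
def retarget_alias(text, alias, new_ip):
    """Rewrite one 'alias = IP' line in a project .aliases file,
    leaving every other line untouched. The line layout is assumed."""
    out = []
    for line in text.splitlines():
        key, sep, _ = line.partition("=")
        if sep and key.strip() == alias:
            line = f"{key.rstrip()} = {new_ip}"   # keep the alias, swap the IP
        out.append(line)
    return "\n".join(out)

# Hypothetical file contents, purely for illustration.
sample = "[My Computer]\nRT Target = 10.0.0.2"
updated = retarget_alias(sample, "RT Target", "10.0.0.99")
```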
  25. I have two targets reading/writing to a set of shared variables -- One is a PC, one is a PXI RT system (but that really shouldn't matter -- The same question applies with two desktop PC targets). I have the shared variable engine running on the RT system. In my LV project, I have the RT system added with its IP address; everything works great passing data between the two systems. However, when I deploy this system, move the RT system, and the RT IP address changes, will the PC target still be able to communicate? Does the PSP engine try to connect through hostname or through IP address -- and is there a way I can tell the PSP engine which specific IP address to connect to? Thanks!