RandyR

Members
  • Posts: 19
  • Joined
  • Last visited: Never

  1. I do love the shared variable and have had much success replacing my homegrown UDP client/server mechanism for getting data between a Windows host and an ETS desktop RT box. I do have a couple of questions, though... I can't get my shared variable clusters to slave to my strict type-defs. I point them to the type-def as I configure them, but a change in the cluster requires me to redefine the shared variable by re-browsing for it. Kinda clumsy -- am I doing this wrong? Also, what UDP ports does the shared variable system use? While I'm at it, what ports does LV use for panel updates and all the rest of the under-the-hood traffic to the RT box? I ask because I also have some UDP traffic of my own going, and I'm seeing slow reconnects and other slowdowns that make me wonder if I'm stomping on some LV UDP traffic. Unlikely, given the number of ports, but worth asking about, I think. Thanks, Randy
  2. I assume you're talking about large fonts... You can, in the .ini file for your app, set a specific font that overrides the Windows large-font setting. The best way I've found to handle this is to set the 'application', 'dialog' and 'system' font settings to custom and pick my desired font in the LV development environment while developing the app. Then just make sure you copy the font settings out of the LabVIEW.ini file into your app's ini file. The font settings look like this (a sketch of a complete app ini section appears after the last post):
     appFont="Tahoma" 13
     dialogFont="Tahoma" 13
     systemFont="Tahoma" 13
     A couple of hints: I don't set any font in my app (on controls, indicators, etc.) to anything other than application, dialog or system. In other words, never set a specific font on a front panel; always let the desired font propagate to the controls/indicators via the defaults. Along the same lines, when you're setting, say, the application font default in the options screen, never check either the 'panel default' or 'diagram default' box in the font browser. Doing so gives things specific font settings (other than application, dialog or system) and can drive you mad tracking them all down to get everything to match up. Obviously, there are exceptions when you need an indicator or two to stand out. Also, be sure the font you choose will be present on the other machines the app installs on. As best I can tell, 'Tahoma' is the default 2000/XP font, so that's the one I use. Randy
  3. The trouble with hard-coding a solution into a case statement is that it's tedious and the table can't be modified easily. And of course, with a case statement, you can't change it at all in an exe. Here's how I'd do it... Simple and easily configurable. Randy
  4. If I'm understanding you guys right, what you're wanting to do is set the XScale.Offset value to the current 'seconds since' and then write your data. Obviously, there are other issues depending on your particular application, but that's the general idea. Also, if your samples have a delta time not equal to 1 second, you're going to need to set XScale.Multiplier. Randy
  5. BTW, I should mention that, as I understand it, the tick count represents milliseconds since the computer was booted. I tell you this so you understand why it sometimes has small values and seems to work, and other times has large values. This timer rolls over in 49.7 days; however, if you always subtract tick-then from tick-now and take the result as a UINT, the rollover will not affect the delta time. It's cool how this works out. Like this: 2 - 255 still equals 3 in the case of U8s... So you can see, the delta time always comes out right unless you roll the timer twice -- unlikely, most times, with a U32 (see the C sketch after the last post). Randy
  6. The reason the time stamps are jumping is that you are coercing a really big U32 to a single-precision float. Just type one of your timestamps into a float control and try to add one millisecond to it; you'll see what I mean. Depending on how long your program runs, you may be able to make this work by making your timestamp a delta time... Get the tick before the loop and then subtract that original value from the current tick. Also, converting milliseconds to seconds will help, since you can then use some of the fractional (right-of-the-decimal) part of the float (there's a C sketch of this after the last post). Keep it coming, we'll get this going. Randy
  7. Your initial impression that 200 sam/sec means the sample delta time is 1/200 is correct. You let yourself get talked out of it because of the time stamps created during the *polling* of the AI buffer. Just because your polling of the buffer is non-deterministic does not mean that the determinism while *filling* the buffer is suspect. The continuous analog input task you have set up is hardware timed, meaning the board has its own clock running the A/D conversions. You can depend on the delta times of the samples being very accurate. Now, how you synchronize your drive signals with the AI data is another issue altogether and depends completely on how you write your program. Let me know if this does anything for you. Randy
  8. You're welcome... Glad I could help. Randy
  9. You were basically on the right track, although I think this is a bit more straightforward. Randy
     Download File: post-767-1106019390.vi
  10. Now I'm a bit confused... Time stamping the AI is implied... There's no reason to check your clock: you can tell from the sample number (which you keep track of) what time each sample happened at (relative time, that is). This makes me wonder if you're trying to timestamp the drive commands you send? That's an entirely different animal. Is this the case? Randy
  11. Hi James, If you don't know what RT is then, as you figured out, you don't have it. It is very cool, however, if you need it. Just look in the NI catalog or website for a description. You're already running in the fashion I suggested, that is, buffered input. When you set the AI Start to continuous at 200 sam/sec, you made the AI hardware timed by the board itself. This means that when you read the samples, you know they have a delta time of 1/200 s (5 ms). There's your timestamp right there. When you pluck out the first sample of the 50 (the amount you read per loop) and go off to the motor control section, you know the time between those picked samples is 250 ms (50 of the 200). BTW, you'd get better data if you averaged the 50 samples of channel 3 instead of just using the first element (there's a rough C sketch of this timing arithmetic after the last post). So, to answer your first question: to plot this data, just wire it to a waveform chart. Since you have evenly spaced data, there's no need for an XY graph. Set the chart X-axis delta time to match the data if you want accurate X-scale time values. Even if you don't, the data will plot correctly... except that the X axis will be in units of samples instead of time. Hope this helps, Randy
  12. Yeah, if you're in real time, then disregard my message... Derek's answer is appropriate. I realize this is the real-time sub-forum, but since LV 6.1 with a LabPC+ was mentioned, I assumed we were dealing with regular LV. Could this be RT? Randy
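
A minimal sketch of how the font keys from post 2 might sit in the built application's ini file. The [MyApp] section name is just a placeholder for whatever the executable is called; the three keys and values are the ones quoted in the post:

     [MyApp]
     appFont="Tahoma" 13
     dialogFont="Tahoma" 13
     systemFont="Tahoma" 13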
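
A minimal C sketch of the rollover arithmetic from post 5. The tick values are made up; the point is that unsigned subtraction wraps modulo 2^32, so the delta stays correct across a single rollover:

     #include <stdint.h>
     #include <stdio.h>

     int main(void)
     {
         uint32_t tick_then = 4294967290u;   /* read a few ms before the counter rolls over */
         uint32_t tick_now  = 5u;            /* read a few ms after the rollover            */

         /* Unsigned subtraction wraps modulo 2^32, so the delta is still the
            correct 11 ms even though the counter rolled over in between.    */
         uint32_t delta = tick_now - tick_then;
         printf("delta = %u ms\n", (unsigned)delta);

         /* Same idea with U8s, matching the "2 - 255 equals 3" example: */
         uint8_t small = (uint8_t)(2 - 255);
         printf("2 - 255 as a U8 = %u\n", (unsigned)small);   /* prints 3 */

         return 0;
     }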
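
And a short C sketch of the precision problem from post 6: a single-precision float has only about 24 bits of mantissa, so a millisecond tick count from a machine that has been up for weeks can no longer resolve 1 ms, while a delta time converted to seconds can. The tick values are hypothetical:

     #include <stdint.h>
     #include <stdio.h>

     int main(void)
     {
         uint32_t tick_ms = 3500000000u;                /* ~40 days of uptime, in ms */

         /* Coercing the big U32 to a single: adding 1 ms changes nothing. */
         float as_single = (float)tick_ms;
         float plus_one  = (float)(tick_ms + 1u);
         printf("%.1f vs %.1f  (the 1 ms is lost)\n", as_single, plus_one);

         /* The fix: take a delta from a start tick read before the loop and
            convert to seconds, so the fraction carries the sub-second part. */
         uint32_t tick_start = tick_ms;
         uint32_t tick_now   = tick_start + 12345u;     /* 12.345 s later */
         float elapsed_s = (float)(tick_now - tick_start) / 1000.0f;
         printf("elapsed = %.3f s\n", elapsed_s);

         return 0;
     }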
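
Finally, a rough C sketch of the timing arithmetic in post 11 (plain C standing in for the LabVIEW diagram, with made-up data): 200 samples/sec hardware timed, 50 samples read per loop, so each sample's relative timestamp follows from its index, each block spans 250 ms, and the block is averaged rather than using only its first element:

     #include <stdio.h>

     int main(void)
     {
         const double rate  = 200.0;          /* S/s, fixed by the board's clock at AI Start */
         const double dt    = 1.0 / rate;     /* 5 ms between samples                        */
         const int    block = 50;             /* samples read from the buffer per loop       */

         long samples_so_far = 0;             /* running sample count kept by the program    */

         for (int loop = 0; loop < 3; loop++) {
             double data[50] = {0};           /* stand-in for the 50 samples of channel 3    */

             double t_first = samples_so_far * dt;   /* relative time of the block's first sample */

             double sum = 0.0;                       /* average the block instead of taking       */
             for (int i = 0; i < block; i++)         /* just the first element                    */
                 sum += data[i];
             double avg = sum / block;

             printf("block %d: starts at %.3f s, spans %.0f ms, avg = %.3f\n",
                    loop, t_first, block * dt * 1000.0, avg);

             samples_so_far += block;
         }
         return 0;
     }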