
David Boyd

Members · Posts: 181 · Days Won: 6
Everything posted by David Boyd

  1. QUOTE(Aristos Queue @ Aug 19 2007, 02:15 AM) I can't speak to how reliable the use of the old ShowInplaceMenuItem token was, but I do recall one of its quirks - in fact I think I wrote this text that was up on Brian Renken's page: QUOTE Set to True to add "Show Inplace Set" to the popup menu you get when you right-click on a diagram terminal. Selecting this menu item makes the terminal, and all other terminals sharing the same data inplace, blink together. Pop up again and select "Hide Inplace Set" before moving to another set of terminals, since it won't automatically stop blinking the first set before showing a new one. Otherwise you got a wonderful bunch of little blinky things all over your BD.
  2. QUOTE(NormKirchner @ Aug 17 2007, 01:39 AM) I would've liked to have had one of those colorful tie-dyed T-shirts at NIWeek with the message "Keep Austin Wired". I actually mentioned this in jest to a couple of the NI folks at the registration booth, but of course it's a non-starter idea for the following reasons:
     - it's too local a joke
     - the joke doesn't translate
     - it would probably infringe on the original, which is copyrighted
     - it's too geeky even for them
     Dave
  3. I'd like to join the fray but this year the whole family is here, and I got the kids passes to come in and watch the Lego Robotics event. Chris, any chance we could join up later on? We've got our own wheels for this trip. Needless to say, I didn't pre-register, so if we come we'd be paying for our food on our own. As I write this, I'm waiting for the exhibit hall to open up. Been in class all day. I'll look for replies later on this evening. Best wishes, Dave
  4. QUOTE(John Rouse @ Jul 7 2007, 12:33 AM) One other option you could consider, if you're looking for 16+ ports, is a portserver device over ethernet. I recently built a system which used a Digi 32-port device (http://www.digi.com/products/serialservers/etherlite.jsp). A second, private ethernet port on the PC running LabVIEW isn't essential, but is helpful to keep the traffic off the corporate net. Digi's drivers manage the portserver so the application code and VISA only address local com ports. The name mapping from com port to physical socket stays constant, which seems less likely with some USB solutions I've seen. About 40 USD per port added, which seemed reasonable. Dave
  5. QUOTE(xtaldaz @ Jul 3 2007, 01:20 PM) Yes, it continues to work very well for me - as you said, it hasn't been rev'd since 2001 but is still useful.

     QUOTE(xtaldaz @ Jul 3 2007, 01:20 PM) I did get a chance to play around with the DB V2D and timestamps quite a bit last evening and reminded myself of why there were problems. A lot of it has to do with the differences in datatypes in general. A timestamp in LabVIEW is a cluster of 4 I32s, but the VT_DATE (vbDate) in COM is a double-precision floating point number that represents the date and the time.

     I thought the TDM developers used such a cluster, but that was from the days before it became a true datatype. I read that it's a 128-bit fixed-point number, with 64 bits each side of the radix, so I guess that's the same memory footprint.

     QUOTE(xtaldaz @ Jul 3 2007, 01:20 PM) I'm going to spare you the details of why the DB V2D would need to be completely rewritten to accommodate timestamps properly {unless you want me to go into it in a private message}.

     My gripe with the xnode is that the primitive version of DB V2D did work with timestamps, at least for me. I have plenty of code that works under LV8 but breaks under LV8.2 because of this.

     QUOTE(xtaldaz @ Jul 3 2007, 01:20 PM) Let me just say that the way you are doing it now with converting it to a string is actually the best way. I played around with all sorts of crazy conversions and the string approach always worked.

     Agreed, but solely because the interpretation of the string at runtime can be guaranteed to use the same localized date/time format that the variant-to-string conversion used. If this were not so, this would be a non-portable mess. I've seen other cases (in the Report Generation toolkit) where LV floating point values get an intermediate conversion to string before being passed to the Office automation methods, and some LV toolkit developer made a one-size-fits-all decision for how many digits are significant. This is why I avoid intermediate string conversions.

     QUOTE(xtaldaz @ Jul 3 2007, 01:20 PM) I thought LV variants and OLE variants were the same -- thus explaining the same behavior -- but I can't comment on the inner workings of variants.

     I think of them as separate entities based on the variant display in LabVIEW - if I set 'Show type' it displays 'OLE Variant' and 'Variant type->VT_DATE', etc., for variants returned by ADO or Office automation objects. If wired to the output of the LV 'To Variant' bullet, it lists the LV datatype.

     QUOTE(xtaldaz @ Jul 3 2007, 01:20 PM) You should report that Scan From String behavior to NI as a bug. That doesn't seem like the correct behavior.

     Done, but I reported the bug against a later version of LV.

     QUOTE(xtaldaz @ Jul 3 2007, 01:20 PM) I don't work for NI anymore, but since I still use NI products, I'm going to try and attend NIWeek. If I'm at NIWeek, I'll definitely be at the LAVA BBQ dinner.

     I knew you left NI a while back; it's great to have your expertise available here in the user community. Perhaps I'll see you at the Salt Lick; this year, the whole family will make the trip (to tour the Austin/Hill Country area while I take the CLA exam), so who knows where we'll go in the evenings. Again, thanks for all your insights. Dave
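Since the post contrasts the COM VT_DATE encoding (a double counting days since its own epoch) with LabVIEW's 1904-based timestamp, here is a rough Python sketch of the two conversions. The epoch constants are the standard ones, but the function names are my own illustration, and a plain float stands in for LabVIEW's 128-bit fixed-point representation:

```python
from datetime import datetime, timedelta

OLE_EPOCH = datetime(1899, 12, 30)  # epoch used by COM VT_DATE doubles
LV_EPOCH = datetime(1904, 1, 1)     # epoch used by LabVIEW timestamps

def ole_date_to_datetime(ole_days: float) -> datetime:
    # VT_DATE stores the date/time as (possibly fractional) days since
    # its epoch; this simple form is valid for dates after the epoch.
    return OLE_EPOCH + timedelta(days=ole_days)

def datetime_to_lv_seconds(dt: datetime) -> float:
    # LabVIEW's timestamp counts seconds since 1904-01-01; the real type
    # is 128-bit fixed point, modeled here with an ordinary float.
    return (dt - LV_EPOCH).total_seconds()

noon = ole_date_to_datetime(39266.5)
print(noon)  # 2007-07-03 12:00:00
print(datetime_to_lv_seconds(noon))
```

Going through a typed conversion like this, rather than an intermediate locale-formatted string, sidesteps the portability concern raised above.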
  6. QUOTE(i2dx @ Jul 3 2007, 01:17 AM) Thanks, Christian. I just checked your toolkit VI (under 8.2.1) and I can tell you that it does not work for me. I think as a general approach you want to avoid trying specific date/time text formats as your VI does. The approach I described (variant converts to string, then scan from string converts to LV timestamp using system's default format) works for me, but I can't be certain it will work under all OS/localization settings. Thanks for your input on this. I've attached a VI (I stole your icon :ninja: ) to illustrate; any testing you could do in your environment would be greatly appreciated. Dave
  7. Hello, Crystal, thanks for the reply. It was from you that I originally got my copy of the DB toolkit so long ago. :thumbup: The DBs I'm currently using are still Access (2003), though I'm planning to migrate to SQL Server Express 2005 soon. My first workaround idea is to use the DB Var to Data xnode to convert the datetime variant to a LV string, then pass the string to a Scan From String with a TS constant and a format specifier of %<%c>T. Hopefully the %c will cause the scan to parse the string in whatever format the target used to write the string in the first place. I should also note that the DB Variant To Data xnode fails for timestamps if the input is a LV variant, as well as an OLE variant. Another quirk I just noticed in Scan From String - if you don't wire a default timestamp value, and the scan fails, the previously converted value is returned. (Sort of reminiscent of the original problem with Variant to Data when passed an OLE null.) All of this is observed under 8.2.1. Any idea why the DBV2D primitive was removed from LV and replaced with an xnode? And why was the non-DB V2D node never 'fixed' w.r.t. OLE variants? See you in Austin next month, perhaps? Dave
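The %<%c>T idea above - format with the locale's default date/time representation, then scan it back with the same code - has a direct analog in Python's %c specifier. A minimal sketch under that assumption (the function name is mine):

```python
from datetime import datetime

def roundtrip_via_locale_string(dt: datetime) -> datetime:
    # Render the time with the locale's default date/time layout (%c),
    # then parse it back with the same specifier - the analog of using
    # %<%c>T in both the format and scan directions.
    text = dt.strftime("%c")
    return datetime.strptime(text, "%c")

t = datetime(2007, 7, 3, 13, 20, 0)
assert roundtrip_via_locale_string(t) == t  # survives the round trip
```

Because both directions use the same locale-dependent layout, the round trip is safe on one machine; a string written under one locale and scanned under another would be exactly the non-portable mess discussed earlier in this thread. Note also that %c carries only whole-second resolution.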
  8. Just discovered this while attempting to migrate a project from 8.0 to 8.2.1 which uses the Database Connectivity toolkit... Looks like the Database Variant to Data primitive was mutated into an xnode in LV8.2 (I frankly hadn't noticed). The xnode type-adapts to the LV timestamp datatype when wired, but I can't get it to work. Whether the input variant is a VT_DATE or a VT_BSTR, the output of the node seems to be stuck at zero timestamp. Anybody else notice this? Workarounds would involve an intermediate conversion to and from string representation of time, which always seems a bit iffy. Dave
  9. Ben, Of late, we've used several Vartech 19" rackmount LCDs (running 1280 x 1024 native resolution) which are fitted with capacitive touchscreens. They're used on a general production line environment, and they've been very well accepted by the folks who interact with them. The LV app runs fullscreen and has a minimal number of tabbed pages, using slightly outsized buttons, dropdown listboxes, etc. Not too expensive, though Vartech occasionally sends me marketing emails touting their latest 46" sunlight-readable touch-enabled hi-res NEMA4X-firehose-washdown models, which I don't even want to hear the cost of... ...but there are Vartech models larger than 19" which may fit your requirements without bankrupting your client. Did your question about "large screen sizes affecting performance" imply large desktop pixel dimensions? Even the jumbo (32" and above, widescreen format) monitors I've seen typically only have resolutions of 1920 x 1080 or so, which seems a typical hi-res mode for any good desktop graphics adapter. Good luck! Dave
  10. (Topic: LEDarray) QUOTE(David Boyd @ Mar 26 2007, 07:17 AM) OK, found it (Google is your friend)... Prince Ombra - Roderick MacLeish (http://www.amazon.com/Prince-Ombra-Roderick-MacLeish/dp/0312890249)
  11. (Topic: LEDarray) QUOTE(LV Punk @ Mar 26 2007, 06:40 AM) OK, now that we're pretty far off-topic... I remember a science-fiction story in which one of the characters was a girl whose name was mis-pronounced as 'Slally' because of a speech impediment... does anybody here remember the author/title? Chris, was that a typo or a sly reference to the story? Dave
  12. QUOTE(JFM @ Feb 23 2007, 03:07 PM) Nope, this is not a bug. It is IMO the only way timestamp math should work. Remember that timestamps are a LV datatype which represents one thing only: an absolute point in time, timezone-independent, in the LV epoch (January 1, 1904). Valid operations on a timestamp include:
     - Adding a numeric (seconds offset). Result is a new timestamp.
     - Subtracting a numeric (seconds offset). Result is a new timestamp. Note that the subtract primitive only allows the offset to be wired to the subtrahend, not the minuend.
     - Subtracting another timestamp. Result is a numeric (seconds offset).
     - You cannot add two timestamps (adding two absolute points in time is meaningless).
     - You can increment or decrement a timestamp (the unit is, of course, one second).
     - You cannot multiply or divide with a timestamp.
     - You can round a timestamp. This rounds to the nearest whole second.
     - You can compare timestamps with other numeric types, or test for zero value.
     - Transcendental functions, exponentiation, and negation are right out. (Neither shalt thou count to two, excepting as thou proceedest to three...)
     I'm not sure what 'pain' Herbert refers to. The primitives work just fine for me in all the use cases I just listed. Hope this helps. Dave
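The rules above amount to giving absolute times and second offsets distinct types; that can be mimicked in any language. A hypothetical Python sketch (the class and its names are my illustration, not LabVIEW's implementation):

```python
class Timestamp:
    """An absolute point in time: seconds since an epoch.

    Mirrors the rules in the post: timestamp + seconds -> timestamp,
    timestamp - seconds -> timestamp, timestamp - timestamp -> seconds,
    and timestamp + timestamp is rejected as meaningless.
    """

    def __init__(self, seconds: float):
        self.seconds = seconds

    def __add__(self, offset):
        if isinstance(offset, Timestamp):
            raise TypeError("adding two absolute points in time is meaningless")
        return Timestamp(self.seconds + offset)

    def __sub__(self, other):
        if isinstance(other, Timestamp):
            # timestamp - timestamp: a plain numeric difference in seconds
            return self.seconds - other.seconds
        # timestamp - numeric: the offset is only valid as the subtrahend
        return Timestamp(self.seconds - other)

t0 = Timestamp(100.0)
t1 = t0 + 25.0      # a new timestamp, 25 s later
print(t1 - t0)      # 25.0 - the numeric offset between them
```

The type distinction is the whole point: operations that would produce a meaningless result (like summing two absolute times) simply aren't defined.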
  13. You should not need to browse for DLLs... On the block diagram, drop a .NET constructor node (Connectivity->.NET->Constructor Node). From its configuration dialog, select the 'System' assembly from the dropdown. Find the 'System.Diagnostics' entry in the objects list, expand its tree and select 'PerformanceCounter'. From the list of constructors, I chose the constructor prototype which takes two parameters, categoryName and counterName. Wire string constants to these as I showed in the BD snippet. Then you'll need to create a .NET method node for NextValue() as shown. Dispose of the reference when through. If you're going to invoke this repeatedly, you should consider placing the constructor outside the loop, and maintain the reference wire for calling NextValue(). The .NET constructors seem to be pretty time-intensive to execute (in my experience). Hope this is clear. Dave
  14. This BD snippet shows one way to get this info, though it assumes you're running under Windows and have access to .NET methods. Hope this helps. Dave EDIT: The constructor is from System.Diagnostics.
  15. Michael, Are you talking about behavior at edit, or runtime, or both? I've been bothered by similar behavior with tables starting in LV8.0. I have an app which accepts user touchscreen input over a table indicator. To provide feedback for the user, I track the touch/liftoff/drag info and I rewrite the text for the highlighted row using a bold font. If the extent of the bolded text exceeds (ever-so-slightly) the size of the cell the user is pointing to, LV generates the tipstrip, which can interfere with the tracking process. I think there's also an obscure race condition - when I register the touch liftoff (mouse up event), I flip to a different tab of the tab control which owns the table, and sometimes the tipstrip window remains. My wish is that the feature could be disabled on a per-control basis. Best regards, Dave
  16. Chris, I think the opposite is true: if 'opsserver' is not listed, that implies that the server for Info-LV (and the Igor list too) is not down, at least in the sense that it's pingable w/in Scott's local network. So as Gavin surmised, there must be some other problem. As others have posted, I've had in-digest-ion since the 3rd of January. Best regards, Dave
  17. We PM'd back and forth a few times about this way back when you first posted. The 146C is a controller device for the MFCs. Are you sure you don't have access to the MFCs directly? Can you reply with any of the part numbers for the MFCs? Was this supposed to be a simple control project you intended to implement in LV, or is this part of some much larger system? Still curious... Dave
  18. OK, folks, what am I missing here? I have PDS 8.0 and 8.20 installed on my laptop, and I don't have a 'Graphics' folder under my 'Shared' folder. Does this come with a toolkit I don't have, like Vision? Somewhat puzzled, Dave
  19. OK, Michael, care to tell us where that image is located? Just curious. Dave P.S. I love looking at the member map and spotting the ones who are out in the Arctic Ocean or in the Patan
  20. (Topic: STA 300) This same poster also PM'd me a few hours ago asking for help, and sounding desperate. This is not a good way to get acquainted with the LAVA community. Just because there are listings of "who is online/active" doesn't give you the right to go door-to-door... I dunno, maybe I'm just grouchy today, this is hardly out of the norm anymore... Dave
  21. Chris, You may want to do a 'Save for Previous' and edit your post. The original poster doesn't have lAbViEw 8.20. Dang shift key... :laugh: Dave
  22. Roy, The most reliable method I know (using the Windows API) is attached; it uses two CLNs and returns the login name and the NetBIOS computer name. I've used this since LV5. The attached VI is saved back to 7.1 format for the widest audience. Probably Dirk J.'s VI does the same thing, but his is in 8.2 format and I noticed that you're using LV8. The VI Server property that Yen mentioned may work, but it doesn't always give you the Windows logon name - LV options (tools->options->revision history) may cause it to retrieve the LV registration name, or a 'prompt at startup' name. And I'm not at all sure that it works in built applications. Hope this helps. Dave Download File:post-195-1162995389.vi
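For comparison, the same two pieces of information (OS login name and machine name) can be fetched from Python's standard library. This is only an analogy, not the Windows-API method the attached VI uses:

```python
import getpass
import socket

# getpass.getuser() checks the LOGNAME/USER/USERNAME environment
# variables and falls back to the OS account database, so it returns
# the real login name rather than an application registration name -
# the same distinction the post draws against LV's VI Server property.
login_name = getpass.getuser()
computer_name = socket.gethostname()
print(login_name, computer_name)
```

Like the Windows-API approach, this works the same in a built/deployed environment as in a development one, since neither value depends on application settings.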
  23. I'm another one of those believers in the units feature of LV, I think it's often overlooked and a really clever concept. My understanding of the way LabVIEW handles units for temperature is that degC and degF imply temperatures on their respective scales (with their respective offsets), while Cdeg and Fdeg represent a difference in degrees on the specified scale. Kelvin, of course, is the same either way since it is an absolute scale. So, for example, a temperature gradient value could be described as a PQ with units of Cdeg/m, or K/m, but you wouldn't want to express it as degC/m. Or consider the case of subtracting two values in degF - the answer could be properly labeled in Fdeg, but not degF. Regrettably, since the value on the wire is really just a value in Kelvins, if you create an indicator or constant from the wire, LV has no way to know whether you want to display a temperature difference or a point on a scale. I find it mildly unnerving that when I create a constant on the BD from a control/indicator with temperature units, the constant always shows up as Cdeg - essentially the same as the base units of K. This caused me no end of confusion when I first started using units for temperatures - I didn't get the difference between Cdeg and degC, and assumed that LabVIEW's behavior was somehow broken. And it looks like even in LV 8.20, this bug hasn't been fixed. Dave
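The degC/Cdeg distinction described above is just the difference between affine scale conversions (offset applies) and linear difference conversions (only the scale factor applies). A small Python illustration - the function names are mine, standing in for LabVIEW's unit labels:

```python
def degC_to_K(temp_c: float) -> float:
    # A point on the Celsius scale: the 273.15 offset applies.
    return temp_c + 273.15

def Cdeg_to_K(delta_c: float) -> float:
    # A temperature *difference*: Celsius degrees equal kelvins, no offset.
    return delta_c

def degF_to_K(temp_f: float) -> float:
    # A point on the Fahrenheit scale: scale factor and offset both apply.
    return (temp_f - 32.0) * 5.0 / 9.0 + 273.15

def Fdeg_to_K(delta_f: float) -> float:
    # A Fahrenheit-degree difference: only the 5/9 scale factor applies.
    return delta_f * 5.0 / 9.0

# Subtracting two scale temperatures yields a difference, as the post notes:
boil_minus_freeze = degF_to_K(212.0) - degF_to_K(32.0)
print(boil_minus_freeze)  # about 100 K, i.e. 100 Cdeg or 180 Fdeg
```

This also shows why a bare value on the wire is ambiguous: 100.0 could be either a 100 K difference or a point on a scale, and nothing in the number itself distinguishes the two.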
  24. AFAIK, there's no such thing as an 'NI supported mass flow controller'. Sensors and instrumentation come from literally thousands of different vendors, with nearly an equal number of protocols, interface architectures, etc. If you're talking about stuff that ultimately wires up as 0-10V, 4-20mA, thermocouple, or digital I/O, there are plenty of hardware solutions from NI which span two orders of cost magnitude depending on channel count, conversion rate, resolution, and such. If it's a garden-variety scope, DMM, power supply, RF voltmeter, spectrum analyzer, etc. from one of dozens of major vendors, and it talks serial, IEEE-488, or Ethernet, probably somebody has already written a set of VIs for communicating with it (ranging in quality from excellent to atrocious). When you get to the more specialty use analytical or process control devices, you may a) luck out and find a vendor with a great package of VIs to control their device (rare in my experience), b) get some half-a$$ed attempt at a LV driver written by a summer intern the vendor had five years ago (far more common), c) feel lucky just to get a protocol document the vendor publishes and start coding your own (my personal favorite), or d) find out that it's your lucky day - somebody on LAVA or Info-LabVIEW has used this instrument and will send you some working code that gets you up and running in a day. So, is this an 1179A series MFC with the RS-485 serial option? 'Cause if it is, you get option (d). Let me know, Dave
  25. Scott, I am using 8.0, never having dealt with the patch/masscomp situation. My official 8.2 CDs arrived this past Friday. I don't know if I'll migrate the present app development to 8.2 anytime soon. Thanks again for all the testing you did on my behalf. Eventually I'll get a bug report into NI for this. Dave