PeterB
Members · Posts: 85 · Days Won: 1

Everything posted by PeterB

  1. QUOTE(Justin Goeres @ Sep 25 2007, 11:54 PM) Hi Justin, I think your problem is solved by reading this article from Keyspan: http://keyspan.custhelp.com/cgi-bin/keyspan.cfg/php/enduser/std_adp.php?p_sid=-S7T5JMi&p_lva=&p_faqid=403&p_created=1116093562&p_sp=cF9ncmlkc29ydD0mcF9yb3dfY250PTEwNyZwX3NlYXJjaF90ZXh0PSZwX3NlYXJjaF90eXBlPTMmcF9wcm9kX2x2bDECZwX3NvcnRfYnk9ZGZsdCZwX3BhZ2U9Mg**&p_li= In short, you need their high speed adapter; your Qi ain't designed to do what you want. regards Peter "Over the years, Keyspan has made several adapters in the USA-19 line. The USA-19 line is defined as [U]SB [S]erial [A]dapters with [1] DB[9] port (these characteristics make up the part number [USA-19]). The following information outlines the difference between each variant of the USA-19 line: Product Name: USB PDA Adapter. Model Numbers: USA-19, USA-19x, USA-19Q, USA-19Qi. The models in the USB PDA Adapter line (models USA-19, USA-19x, USA-19Q, USA-19Qi) were designed for connecting Palm Pilots, Wacom tablets, and other PDAs to Mac and Windows computers. These adapters only had enough serial capabilities to communicate with PDAs and graphics tablets and therefore using these adapters with serial devices that aren't PDAs or graphics tablets IS NOT RECOMMENDED (the USA-19HS is the recommended adapter for any serial device)."
  2. QUOTE(agonified @ Aug 8 2007, 08:21 AM) In my experience the Belkin F5U409 USB to serial adapters can be a huge time waster. The F5U409 is very similar to the F5U109. Their drivers are unstable and don't work under XP despite claims to the contrary, and they certainly won't work under 64-bit Vista (unlike NI and Keyspan devices). I've had a Belkin simply lock up an XP PC just by plugging it in. Under W2K there have been several cases of the Belkin driver causing a BSOD. For a while I was pursuing the option of using some well respected brands of USB-serial converters such as Keyspan and National Instruments. Those devices are certainly better designed (hardware and software wise) and more thoroughly supported than the Belkin brand, however there are shortcomings of even those devices which have been well documented here by Keyspan and here by NI. Not all aspects of the RS-232 protocol are implemented in the USB drivers. At least NI and Keyspan are honest enough to tell you that. Some of our applications at work require low latency serial communications, something which the USB-serial devices aren't able to achieve. The amount of time (i.e. $) required to validate the operation of Keyspan and NI USB-serial adapters for use with all applications we currently use the adapters for (w.r.t. latency, timing and synchronisation) would be better spent on purchasing a more adequate solution that by design will perform as well as a standard serial port. It was for this reason that we purchased dedicated serial cards for our testing to provide additional COM ports on the desktop PCs. For a laptop, I'd recommend buying serial ports that plug into the expansion slot (CardBus, PCMCIA, ExpressCard, etc; whatever bus you have [1]). This way you get a dedicated UART per COM port, so you are NOT at the mercy of a poorly written USB device driver running under a jittery operating system, coupled with the additional latency of being routed through an on-board USB hub. If for whatever reason you must go with a USB to serial converter, I did all the research and can only recommend two brands: either NI or Keyspan. Do yourself a favour, spend the extra couple of hundred $ and save yourself many times that when it works first time! I've been there and done that enough times to know better!! regards Peter [1] The Solution - use the laptop's extension port to add more serial ports. So how do you directly access the bus on a laptop? Laptops allow you to access their bus via one of the three following extension card ports (depending on the age of the laptop): (1) PCMCIA port (aka PC Card). This is the oldest technology, and is equivalent to the old ISA ports on desktop PCs. The bus speed is more than adequate for supporting an additional 2 to 4 serial ports via a PCMCIA serial card. (2) CardBus port. This is equivalent to the PCI extension sockets on desktop PCs. (3) ExpressCard. This option is now available on the latest laptops and is equivalent to the PCI Express extension sockets on desktop PCs. You can plug 1 into 2 (you can't however plug 2 into 1). You also can't plug 1 or 2 into 3. This means that we will need to purchase one of two possible types of serial cards for our laptops (when needed of course): either a PCMCIA serial card or an ExpressCard serial card. A decent brand 2-port PCMCIA serial card costs between A$350 and A$500. In comparison, a decent brand 2-port USB-serial adapter costs between $160 and $500.
  3. QUOTE(rolfk @ Jul 5 2007, 01:26 AM) Excellent ! I'm glad you came up with a solution for me :thumbup: . Thank you indeed for elucidating the situation Rolf. regards Peter
  4. QUOTE(rolfk @ Jul 4 2007, 03:37 PM) Thanks for your reply Rolf, but I still don't get it :question: In the LabVIEW project manager in the build specifications section (not from within Visual C etc), where do I specify to link in the VISSIM32.LIB file prior to compiling my LabVIEW shared library (DLL) under LabVIEW ? regards Peter
  5. Hi there, I would like to write a DLL with LabVIEW. The purpose of this DLL will be to act as a (shared memory) repository to allow data to be exchanged between my LabVIEW program and a VisSim program. (up until now I have used DDE to do this, but I would like to change the approach for a no. of reasons that I won't get into now) Before I get into discussions about the finer details (i.e. UI thread, reentrant, mutex protection etc) I need to know if I can actually compile this DLL using LabVIEW so that VisSim gets what it needs. Below you can see an extract from the VisSim user manual explaining how to write 3rd party DLLs that VisSim can call. It mentions .LIB and .OBJ files. When I have previously compiled DLLs using LabVIEW I have never needed to link in .LIB or .OBJ files before. Is that even possible with LabVIEW? If it isn't then can I just create an OBJ file from LabVIEW and link that to VISSIM32.LIB using another compiler ? (BTW the attached diagram shows a 'custom dialog box' which I don't need to use) regards Peter P.S. this post is referred to from Info-LabVIEW (digest) on 07/04/07 Criteria for writing DLLs You can write DLLs in any language, provided the language has the following capabilities: • 64-bit floating point array parameters • Pointers to 16-bit integers • _stdcall calling conventions (default for Microsoft Fortran and Delphi Pascal) Example DLLs written in C, Fortran, and Pascal are distributed with VisSim and reside in subdirectories under the \VISSIM60\VSDK directory.
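To make the three criteria quoted in the post above concrete, here is a minimal C sketch of the kind of DLL export they describe (64-bit floating point array parameters, a pointer to a 16-bit integer, and the _stdcall calling convention). The function name and parameter layout are assumptions made purely for illustration; the authoritative signatures are in the example DLLs under \VISSIM60\VSDK. Whether a LabVIEW-built DLL can expose exactly this shape of export (and link against VISSIM32.LIB) is the open question of the post.

```c
/* Hypothetical sketch only: names and parameter layout are invented for
   illustration; see the C examples shipped in \VISSIM60\VSDK for the
   signatures VisSim actually calls. */
#include <windows.h>

__declspec(dllexport) void __stdcall SharedRepositoryBlock(
    double inSig[],    /* 64-bit floating point array parameter (from VisSim)   */
    double outSig[],   /* 64-bit floating point array parameter (back to VisSim) */
    short *stateFlag)  /* pointer to a 16-bit integer                            */
{
    /* A real shared-memory repository would read/write the data exchanged
       with the LabVIEW side here; this stub just passes one value through. */
    outSig[0] = inSig[0];
    *stateFlag = 0;
}
```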
  6. QUOTE(crelf @ Apr 5 2007, 10:33 PM) But you sounded so convincing Chris. In fact when you later wrote that the experience endowed one with humility, I thought you were publicly pointing to the egg on your face. Yet it seems that you'd like us to believe you were joking. So now that you've come up with a good excuse perhaps I should call your bluff and check with Ruth. For another reason though, I still think with all that egg on your face that the yolk really is on you [1]. cheers Peter [1] "In some countries, including the United Kingdom, Australia and New Zealand, the April 1 tradition requires jokes to be played before midday: if somebody pulls an April Fools' trick after midday, then the person pulling the trick is actually considered the fool." ref Wikipedia: http://tinyurl.com/38y59q And the timestamp on your post in Australia is Apr 2 2006, 07:58 AM :laugh:
  7. QUOTE(crelf @ Apr 2 2006, 08:58 AM) Chris, does this mean you would really appreciate NI taking over LAVA? (you gotta remember that you assumed Michael was serious at the time, so you need a pretty good excuse to change your views on this one!) Personally I think that NI needs to be confronted with more independent outfits like LAVA to keep them on their toes. Additionally, if there was a viable 3rd party (or even open source) version of G, then NI would be bending over backwards to make sure LabVIEW was a competitive product in as many ways as possible. Your reply to Michael's post makes it sound like you are on NI's payroll. cheers from downunder. Peter
  8. QUOTE(Tomi Maila @ Mar 16 2007, 06:37 AM) A cheaper option that I use is to roll up a small towel and slide it inside the pillow cover so that it rests under the front edge of the pillow - closest to the bed side of the pillow (as opposed to the top side of the pillow). The technique is briefly referred to here: http://www.abc.net.au/rn/talks/8.30/helthrpt/stories/s526629.htm (2nd last paragraph in the interview transcript) regards Peter
  9. The plate motor.VI is a snippet of a much larger user interface. We are going to have 3 separate motors, 6 stepper motors, a temp controller and two air controllers. The ID, Min & Max are there to enable whoever is designing the user interface to add Min/Max speed values for the connected motor and input an ID that is drive specific. If we didn't add these items here, the re-entrant method would not work right for the method that we are using. <snip> Hi baptie, unfortunately you didn't answer my question; however, after I took a closer look at the code you posted, I was able to understand what you are doing. You are using a notifier as a local variable, and you are POLLING its values (speed, direction and 'enable') once every second. The solution you have implemented does work, but it has the following drawbacks: the lag time of up to 1 second could be annoying for the operator, and with the number of motors you have to simultaneously control, a scalable polling architecture could begin to place an unnecessary burden on CPU usage. I would like to suggest that if you have the time (now or in the future) you consider using the full capability of notifiers (or even an event structure) to implement an EVENT based architecture rather than a POLLING based one. Such a solution would be scalable without wasting additional CPU time when idling. By 'idling' I refer to the time when the user is not changing any controls on the front panel. If you are interested in knowing more, I am happy to write some details on the topic. regards Peter
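To make the polling-versus-event distinction in the post above concrete for readers without the VI to hand, here is a minimal plain-C (pthreads) sketch of the two styles. It is only an analogy for the LabVIEW notifier / Event structure approach; all names and the 1-second poll period are illustrative assumptions.

```c
/* Illustrative analogy only: a 1 s polling loop versus a blocking wait.
   In LabVIEW terms, the blocking wait corresponds to "Wait on Notification"
   with no timeout, or to an Event structure case. */
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

static pthread_mutex_t lock    = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t  changed = PTHREAD_COND_INITIALIZER;
static double speed = 0.0;   /* the "front panel" value being watched   */
static int    dirty = 0;     /* set when the operator changes the value */

/* Polling style: wake every second and check the flag.  Simple, but it
   burns CPU while idle and reacts up to one whole period late. */
void *poll_consumer(void *arg)
{
    (void)arg;
    for (;;) {
        sleep(1);
        pthread_mutex_lock(&lock);
        if (dirty) { printf("poll: new speed %g\n", speed); dirty = 0; }
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

/* Event style: sleep inside the wait and react the instant the producer
   signals, using no CPU while idle. */
void *event_consumer(void *arg)
{
    (void)arg;
    for (;;) {
        pthread_mutex_lock(&lock);
        while (!dirty)
            pthread_cond_wait(&changed, &lock);
        printf("event: new speed %g\n", speed);
        dirty = 0;
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

int main(void)
{
    pthread_t t;
    pthread_create(&t, NULL, event_consumer, NULL);  /* or poll_consumer */

    sleep(1);                        /* operator changes the speed control */
    pthread_mutex_lock(&lock);
    speed = 42.0;
    dirty = 1;
    pthread_cond_signal(&changed);   /* the "Send Notification" step       */
    pthread_mutex_unlock(&lock);

    sleep(1);
    return 0;                        /* demo only: exits without joining   */
}
```

Compile with `cc -pthread`; swapping event_consumer for poll_consumer in main shows the up-to-one-period lag and idle wakeups described in the post.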
  10. Hi Derek, I'm coming in late on this thread as I've only just caught up on reading a lot of posts. I have a question for you. In order for the motors' speed (or other parameters such as max, min) to be set, do the commands need to be sent regularly to the motor in a timed loop even if the operator has not adjusted the speed, min or max control setting? regards Peter
  11. Yes, NI has humorous employees, although having said that, I'm assuming that one of them (Colin) probably wasn't laughing when he tried out the experimental diet cola. (hint: Ctrl+F) Cool app. I never knew it was there. Peter
  12. Ah yes, you mean (has video as well). Now my memories are flooding back as I recall playing this song on a dual register Yamaha organ when I was like 11 years old and singing it for my folks as I practised it. Those ABC Song Books were great fun to sing along to at home and in primary school. Those were the days.... Peter
  13. You old nostalgic you, perhaps you have a case of Peter Allen-itis. The only song sample I found online is missing the bit where he sings "I Still Call Australia Home". cheers Peter
  14. I loved this feature so much ('cause I despised the 'hot pink' colour of error clusters) that I've been using it ever since it came out in LV 7.0 :thumbup: My least favourite colours are reds, pinks and yellows, whereas my most favourite colours are blues, greens and purples. Now there's an idea: how about a purple wire for something? regards Peter
  15. Now that's geeky :laugh: regards Peter
  16. Oops, that 1st line is a typo (I'll have to check if that is in the hard copy). No <CTRL> is required (although it won't hurt), so it should read "<double click> = On a clear point, places text (otherwise known as a free label)" BTW this Engineering Handbook is the biggest book I have ever owned. It's at least 13-14cm thick ! Peter
  17. This is true if one has the Automatic tool locked on (thanks for reminding me). If one doesn't, then <SHIFT+TAB> is necessary to enable/disable the Automatic tool. Another thing to remember with the Automatic tool enabled is that double clicking on text will edit it, and double clicking on open space will place down 'free text'. I think that this may be something that new users don't realise, and so they think they need to temporarily disable the Automatic tool, select the text tool, type their text, then re-enable the Automatic tool. This would put me off using it pretty quickly. Basically, except for colouring things, you should be able to program with the Automatic tool enabled for 99% of the time. If this isn't happening for you then please ask us why and we will help you out - because you may not know about all the options (I'm reserving 1% for all those people who will come back with a valid reason to temporarily disable it - if none are forthcoming I will happily change that number to 100%). On the topic of reducing keyboard interaction (while programming LabVIEW with the autotool), maybe we could collate a list of our most frequently used keyboard shortcuts (e.g. <CTRL+Z> for undo) and then figure out a way that NI could integrate these actions into the autotool (or certain mouse gestures a la the 2nd LAVA Coding Challenge http://forums.lavag.org/index.php?showtopic=3423). After all, the autotool sits idle when over blank spaces as it just shows a cross (+). regards Peter
  18. While I was composing an email to my LabVIEW colleagues at work to espouse the virtues of the autotool ('cause I'm an avid fan of it :thumbup: ), it dawned on me that I still actually need to disable the autotool when I want to select the paint brush to change the colour of anything. I do this by pressing <Shift>+<right-click> to bring up the tool palette right under my mouse and then selecting the paint brush; when I'm finished, I press <Shift>+<right-click> again and set the autotool back to 'on'. So short of NI developers plugging into my synapses, does anybody have a good idea for how a future release of LV should autodetect our need to change the colour of something? regards Peter
  19. Nah, 'cause all of the shortcuts are listed in the LV help file in one place anyway. Just type in "keyboard shortcuts" for the index keywords. regards Peter
  20. If you have Automatic Wire Routing enabled (in the BD options), the LV help file says to: Press the <A> key. LabVIEW temporarily disables automatic wire routing for the current wire. But did you also know that pressing the spacebar while the dotted wire shows will rotate the right angle in the route by 180 degrees? (works for both auto and manual wire routing options) regards Peter
  21. There is a non-cosmetic reason for this too. If you are moving an object that is not wired up to anything, pressing the spacebar to reveal its image again makes LV ready to automatically wire it up if a type-matched source or sink is within the designated number of pixels. regards Peter
  22. I have some data regularly arriving in an array which can vary in length (e.g. 16, 32, 64 points, etc.); the array length just depends on how much data is available at that particular instant. I wanted to decimate the incoming data by a factor of, say, d=100. None of the existing decimate or resample VIs in LV seemed to be able to do this, that is, cater for when the decimation factor (d) > the length of the smaller arrays. After initially posting a request for such a VI on Info-LabVIEW, I have now decided to write my own. To that end I am posting my solution here: Download File:post-1272-1143009789.vi regards Peter
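For anyone who can't open the attached VI, here is a rough C sketch of the kind of stateful, chunk-aware decimation the post describes (the type and function names are made up for illustration, not taken from the VI): the skip counter is carried between calls, so the decimation factor d can be much larger than any individual incoming chunk.

```c
/* Sketch of a stateful decimator: keeps samples 0, d, 2d, ... of the
   concatenated stream, regardless of how the stream is chopped into chunks. */
#include <stddef.h>
#include <stdio.h>

typedef struct {
    size_t d;     /* decimation factor, e.g. 100                       */
    size_t skip;  /* samples still to skip before the next kept sample */
} Decimator;

static void decimator_init(Decimator *s, size_t d)
{
    s->d = d;
    s->skip = 0;              /* keep the very first sample of the stream */
}

/* Feed one incoming chunk of any length; kept samples are written to out
   (which must hold at least in_len elements) and the count is returned. */
static size_t decimator_feed(Decimator *s, const double in[], size_t in_len,
                             double out[])
{
    size_t kept = 0;
    for (size_t i = 0; i < in_len; i++) {
        if (s->skip == 0) {
            out[kept++] = in[i];
            s->skip = s->d;   /* next kept sample is d samples away */
        }
        s->skip--;
    }
    return kept;
}

int main(void)
{
    /* Chunks of 16, 32 and 64 samples with d = 100: only samples 0, 100,
       200, ... of the concatenated stream are kept, even though every
       chunk is shorter than d. */
    double chunk[64], out[64];
    Decimator dec;
    decimator_init(&dec, 100);
    for (int n = 0, total = 0; n < 10; n++) {
        size_t len = (n % 3 == 0) ? 16 : (n % 3 == 1) ? 32 : 64;
        for (size_t i = 0; i < len; i++) chunk[i] = (double)(total + (int)i);
        size_t kept = decimator_feed(&dec, chunk, len, out);
        for (size_t k = 0; k < kept; k++) printf("kept %g\n", out[k]);
        total += (int)len;
    }
    return 0;
}
```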
  23. Biren has now solved the unexpected behaviour for me by pointing out that "when you change the offset you should be supplying an offset into the future and not 0. If I am changing the offset I usually add the expected start to period to get the new offset". This now makes sense and works OK: Download File:post-1272-1142563359.vi If only NI had shown me this by way of example it would have clicked much sooner for me. Even though Biren had suggested that examples are available in LV 7.1, none of them contain a timed loop with the mode and/or offset wired up from inside the loop. There's also no indication of how to properly effect a mode change in the LV user manual - other than to say "To set the mode, right-click the Mode input terminal of the Right Data Node and select Create
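As a quick illustration of the arithmetic Biren describes (names below are invented for illustration; the actual values come from the timed loop's data node terminals), the new offset must point into the future, so it is computed from the expected start time plus the period rather than being left at 0.

```c
/* Illustrative only: compute the offset to wire (with the new mode) into the
   timed loop's Right Data Node when changing the timing on the fly. */
static unsigned int next_offset_ms(unsigned int expected_start_ms,
                                   unsigned int period_ms)
{
    /* An offset of 0 is already in the past by the time it takes effect;
       expected start + period is the next achievable point in the future. */
    return expected_start_ms + period_ms;
}
```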
  24. It does. Hold down <SHIFT> while nudging with the arrow keys and it snaps to the grid. regards Peter
  25. If you work with a grid on your BD (or FP) then the following may prove handy to know. Do you get frustrated when you want to move things around but in doing so all your lovely straight wires on the BD become crooked again? Well with a single keystroke, this bothersome behaviour can be avoided. Simply hold down the <SHIFT> key before clicking on an object or a selection and move with confidence! Your movements will be constrained to Left/Right or Up/Down. "Before moving a selection.png", "After moving a selection - shift unused.png", "After moving a selection while holding down shift" regards Peter