
ShaunR

Members
  • Posts

    4,939
  • Joined

  • Days Won

    306

Posts posted by ShaunR

  1. This would actually need to be checked. If a computer comes with a parallel port it's likely to be 3.3V, not 5V. I know the old Dells we still have in the lab are 3.3V parallel ports.

    Why is 3.3 V a problem?

    In the end, it's probably better to go with an off-the-shelf cheap USB-based digital I/O module. There's tons of these on the market.

    Chicken :lol: Seriously though. This is kindergarten stuff. But if you've never used a screwdriver as a chisel, then maybe it's better to just throw money at it. :D

    melman.jpg

    Not one of mine by the way :P

  2. Can Labview control the LEDs via the PC parallel port (old printer port)?

    Much better than serial (8-bit bi-directional, i.e. digital inputs OR outputs :) ). My favourite low-cost digital IO. Fantastic for foot-switches and system monitoring, and essentially free. Unfortunately, not many PCs come with them nowadays.

    http://digital.ni.com/public.nsf/allkb/B937AC4D8664E37886257206000551CB

    There are also a couple of examples in the "Example" finder.

    You have to check whether your motherboard already has pull-up resistors (most do, some don't). Then you can connect 5V LEDs directly or just short them to ground (if using as digital in). Note that logic is reversed since you sink to ground THROUGH the IO line to light an LED. I always stick a transistor in there too to be on the safe-side, since if you get it wrong...you blow the port. It also inverts the logic so I don't get confused (happens regularly).

    http://www.beyondlogic.org/spp/parallel.htm
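    The active-low behaviour described above can be sketched in Python rather than LabVIEW. This is illustrative only: `led_data_byte` is a made-up helper name, and actually driving the port on a real machine would need something like the `pyparallel` package on top of this.

```python
def led_data_byte(leds, transistor_buffered=False):
    """Compute the parallel-port data byte for up to 8 LEDs.

    leds: sequence of 8 booleans, True = LED lit.
    Wired directly, an LED is lit by sinking current THROUGH the
    I/O line, so the pin must be driven LOW (active-low logic).
    A buffering transistor inverts this back to active-high.
    """
    byte = 0
    for bit, lit in enumerate(leds):
        if lit:
            byte |= 1 << bit
    # Direct wiring: invert, because a 0 on the pin lights the LED.
    return byte if transistor_buffered else (~byte) & 0xFF

# All LEDs off, wired directly: every pin held high.
print(led_data_byte([False] * 8))                                     # 255
# Light LED 0 only, through an inverting transistor buffer.
print(led_data_byte([True] + [False] * 7, transistor_buffered=True))  # 1
```

    The inversion in the last line is exactly why Shaun adds the transistor: it protects the port pin and puts the logic back the "intuitive" way round.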

  3. I've been using 2011x64 a bit lately, and the speed hasn't been an issue at all. Now, the functions don't always work, but that's a separate issue. What have you found?

    Not an issue, comparatively. x64 is slower than x32 in most (all?) of my use cases, regardless of LV version, so I was simply implying that we should compare apples with apples.

    Don't get me started on the "don't always work". It still can't back-save to <8.5, and 2011 x32 has just crashed on start-up ever since it was installed :shifty:

  4. *********************************** ERROR *************************************

    ERROR: 3.1 kernels are not supported! **

    Running a 2.4.x or 2.6.x kernel is required to continue this installation

    ************************************ ERROR ************************************

    That's as far as I got on 12.1 before switching to 11.4.

    I don't have a 2009 linux distro (rocking-horse droppings :P) so unfortunately cannot try it to see if I get the same problem. All I can say is that 8.5 started ok and even installed everything properly on 11.4.

    Perhaps a call to NI?
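    The installer's complaint looks like a plain major.minor comparison against a whitelist. A rough Python equivalent of that check (the function name and the supported-series tuple are my guesses, not NI's actual install script):

```python
def kernel_supported(release, supported=("2.4", "2.6")):
    """Mimic the installer's check: accept only 2.4.x / 2.6.x kernels.

    release: a kernel release string such as "2.6.37.6-0.5-desktop",
    i.e. what `uname -r` prints.
    """
    major_minor = ".".join(release.split(".")[:2])
    return major_minor in supported

print(kernel_supported("2.6.37.6-0.5-desktop"))  # True  (openSUSE 11.4)
print(kernel_supported("3.1.0-1.2-desktop"))     # False (openSUSE 12.1)
```

    Which matches the behaviour above: 11.4's 2.6.x kernel passes, 12.1's 3.1 kernel is rejected outright.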

  5. Thanks Shaun

    I tried installing gcc but still it didn't work :-(

    I followed this guideline:

    1. Goto YaST Control Center → Software → Software Management

    2. In search box type gcc then it will show packages

    3. Select gcc then accept

    4. Then gcc will be installed

    //Mike

    this is what I did.

    1. Goto YaST Control Center → Software → Software Management

    2. View → Patterns

    3. Select "Base Development" check box.

    4. Install.

    After that I went to "Search" and typed in "kernel". Selected "kernel source" (which is 3.1 for 12.1 and 2.6.37 for 11.4) and then installed that.

    Like I said, I had problems with the NI-VISA on 12.1 (which is one of the ones that needed the source and gcc) but no problems on 11.4.

  6. This is probably a case of the blind leading the blind but....

    You've got a few errors, e.g.

    gcc Not found in current path
    *** ERROR: Some required tools are missing or were not found. ***

    I got a little further (since I have gcc and the kernel source installed) but then got this error

    *********************************** ERROR *************************************

    ERROR: 3.1 kernels are not supported!

    ** Running a 2.4.x or 2.6.x kernel is required to continue this installation.

    ************************************ ERROR ************************************

    So I'm guessing it won't work (properly?) with 12.1. I am just going through installing 11.4 (which uses kernel version 2.6.x).

    NB:

    These errors were only for the GPIB and the VISA, but I'm not going to faff around with seeing which "parts" might work, although Labview did run without quitting (version 8.5).

  7. I was hoping you would post this; it will be useful in my next project. Thanks.

    Actually this is a slightly older one than I use currently. The latest one passes the errors via a queue to an asynchronous process that then writes them to the DB, but the main bulk of it is still the same.
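    The queue-to-asynchronous-writer pattern described here could look something like the following Python sketch (not Shaun's LabVIEW code; the table and column names are invented for illustration):

```python
import queue
import sqlite3
import threading

def db_writer(log_queue, db_path):
    """Asynchronous process: drain error records from the queue into SQLite."""
    con = sqlite3.connect(db_path)  # connection owned by this thread only
    con.execute("CREATE TABLE IF NOT EXISTS errors"
                "(ts TEXT DEFAULT CURRENT_TIMESTAMP, code INTEGER, msg TEXT)")
    while True:
        item = log_queue.get()
        if item is None:          # sentinel: shut down cleanly
            break
        code, msg = item
        con.execute("INSERT INTO errors(code, msg) VALUES (?, ?)", (code, msg))
        con.commit()
    con.close()

# The main application never touches the DB; it just enqueues and carries on.
q = queue.Queue()
writer = threading.Thread(target=db_writer, args=(q, "errors.db"))
writer.start()
q.put((5000, "Something went wrong"))
q.put(None)
writer.join()
```

    Keeping the connection inside the writer thread is the key design point: the main loop never blocks on disk I/O, and SQLite's one-writer model is respected by construction.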

  8. I totally agree with you about your comments regarding errors, but to me there is far more to logging than that. There can be problems in code that do not result in errors but in incorrect results, or a new feature or application that you want to actively debug. I think the blog sort of covers that aspect quite well.

    On my previous system, if we turned on full logging and ran a test, the result was a several-MB test file with lots of useful information, but not stuff that could suitably be placed into a DB; for example, we could see all telnet conversations both to the UUT & their replies, or all GPIB conversations.

    We ran our test software in a foreign manufacturing plant and sometimes when there were problems we would ask them to turn logging on (a simple menu option) and get them to send back the log file as we could not debug on the remote executable.

    I do the same and insist on result data as well. I think you've just picked up on the error bit because of my last comment (my bad), but previously I did say log file with info, warnings and debug, so I think we are on the same page. If the log table is in the same DB as the results, then you get them by default when they send the file. A few MB is nothing really in the scheme of things, and it makes no difference to performance for a database of a couple of GB. Of course, with text files you would really be struggling even with tens of MB.

    As to what you save in the log table, well, that's just down to your category partitioning. The sort of info (comms etc.) that you describe would, for me, be "debug" and only as and when required. Maybe you would just have an extra category "Comms", since categories are not mutually exclusive. But I would still want errors, warnings and info logged during normal operation and over extremely long periods.

    Because you can handle such large data files, you can leave error, warning and info logging enabled permanently and just switch in the "debug" for all the low-level stuff as and when required. You then get useful things like how often they restarted the machine, what operators were logged in when the errors happened, whether there were any warnings before the errors occurred, any alarms emitted etc. And all filterable :)

    Of course, errors should be minimal if the software is working as intended. So it's really info and usage I would primarily be interested in, and I request customers send me the DB file every month for the first 6 months so I can see how it is being used/abused and what can be done to improve it. Quality departments love the info too, since you are logging calibration and tool-change info over time and they can run the data through their 6 sigma software ;)

    We're utilizing TDMS for results, but I really like the idea of SQLite for error/warning/whatever logging. Has anybody tried to tie the two together? I think you can stuff a blob in a TDMS, so you could include your database in the TDMS if you wanted, but that seems a little hacky.

    I'm not sure I like the idea of including a database in a database. I don't really see the point, since it wouldn't be searchable from the TDMS. Like with most things, I prefer to stick with one technology rather than mix. If I were to consider it, I think I would just keep the SQLite file separate or include the errors/info in the TDMS (SQLite cannot beat TDMS for streaming).

  9. I think the database idea is great for errors; we did something similar in adding error messages to our test results report, and they were imported into our DB with all the other results.

    However, that does not work for general logging, of which error logging is only a small subset.

    Not sure I quite follow you here.

    If you are already using a DB for results, then just adding an error table is a no-brainer. The only difference is the DB name that you log the error to. You also get the advantage that you can link a specific row (or test, if you like) with one or more errors, info, warnings etc., giving you greater granularity.
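    The "link errors to a specific row" idea can be sketched with Python's built-in `sqlite3` module. The schema below (table and column names) is invented for illustration, not taken from Shaun's toolkit:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")
con.execute("CREATE TABLE results(id INTEGER PRIMARY KEY, test TEXT, value REAL)")
# One extra table; each entry points back at the result row it belongs to.
con.execute("""CREATE TABLE errors(
    id INTEGER PRIMARY KEY,
    result_id INTEGER REFERENCES results(id),
    severity TEXT,          -- 'error', 'warning', 'info', ...
    msg TEXT)""")

con.execute("INSERT INTO results(id, test, value) VALUES (1, 'leak test', 0.02)")
con.execute("INSERT INTO errors(result_id, severity, msg) "
            "VALUES (1, 'warning', 'pressure drifted during settle')")

# Granularity: fetch every warning/error attached to a given test.
rows = con.execute("""SELECT r.test, e.severity, e.msg
                      FROM errors e JOIN results r ON e.result_id = r.id""").fetchall()
print(rows)
```

    The foreign key is what buys the granularity: a single test run can carry any number of errors, warnings and info entries, all recoverable with one join.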

  10. I think we all have something similar in our toolkit (although probably with not as many interfaces). However, a while ago mine got a face-lift to use a SQLite database instead of text files. The fact that you cannot open it in a text editor is far outweighed by the extra features, like being able to filter the log to show only errors, info, and/or entries containing certain text. It also means you can have much larger log files; after a program has been in the field for a while, text editors struggle to open plain-text logs. It also makes long-term statistical analysis of the files much more agreeable.
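    A minimal version of such a SQLite log, showing the kind of filtering a flat text file can't give you. The table layout here is my own sketch; Shaun's actual schema isn't shown in the thread:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE log(ts TEXT DEFAULT CURRENT_TIMESTAMP,"
            " category TEXT, msg TEXT)")
entries = [
    ("info",    "Operator 'jane' logged in"),
    ("warning", "Pump pressure near limit"),
    ("error",   "GPIB timeout on instrument 12"),
    ("debug",   "Telnet RX: OK"),
]
con.executemany("INSERT INTO log(category, msg) VALUES (?, ?)", entries)

# Only errors and warnings...
sel = con.execute("SELECT category, msg FROM log "
                  "WHERE category IN ('error', 'warning')").fetchall()
print(sel)
# ...or anything mentioning GPIB, regardless of category.
gpib = con.execute("SELECT msg FROM log WHERE msg LIKE '%GPIB%'").fetchall()
print(gpib)
```

    With a text file, each of those queries would mean re-parsing the whole log; here they stay fast even when the file has grown to gigabytes.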

