Posts posted by hfettig

  1. I have an automated build server (Jenkins) that uses the LabVIEW IDE to build my code every time I commit it into SVN.

    This works well most of the time but every so often it gets stuck on start-up of LabVIEW because the recovery dialog pops up and waits for user input.

    Is there a way this can be suppressed?

     

    I was wondering if starting the automated build version with a separate INI file would solve this problem. Any ideas?
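    A sketch of the separate-INI idea (unverified: the -pref switch is reported to point LabVIEW at an alternate preference file, and the autosave token name below is an assumption — confirm it by toggling the autosave option in Tools > Options and diffing your labview.ini):

```shell
# Build-only INI with autosave disabled, so no recovery dialog can appear.
# "AutoSaveEnabled" is a guess at the token name -- verify for your version.
echo "AutoSaveEnabled=False" > /c/jenkins/build.ini

# Have Jenkins launch LabVIEW pointing at the build-only preferences:
"/c/Program Files/National Instruments/LabVIEW 2009/LabVIEW.exe" -pref "C:\jenkins\build.ini"
```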

     

    Heiko

  2. This was taken at the CLA summit.

    Heiko may remember better than I but we were either watching FRC videos or he was showing me his pig farm application.

    Actually this was taken during the NI Week 2009 'Challenge the Champions' game :-)

    I am still itching for a rematch. But I think next year we'll be on opposite teams ;-)

  3. Did you ever find a way around this problem?

    I can convert the horizontal coordinate without any problems, but the vertical one seems to depend on whether the toolbar is visible, etc., as well as on the Windows theme.

    I am trying to convert the screen coordinates that I get from Acquire Input Data.vi, i.e. reading the mouse position, into Panel and then Pane and then XY Graph coordinates.

    Heiko

  4. I have an application that requires me to record mouse events (xy location vs time) and synchronize that with DAQ data that is acquired in parallel.

    I get the mouse events with the system tick counter as a time base and the DAQ data with the system timestamp as a time base.

    To synchronize the two I call the GetTickCount and the GetDateTime functions in a sequence frame and use those values as the reference points for my relative time base.

    Does that make sense? Is there an easier way?

    We are seeing a 500ms delay between DAQ and mouse data and have no idea where it comes from.
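    A rough sketch of that reference-point scheme, with Python stand-ins for GetTickCount and GetDateTime (the two samples are taken back to back, so any gap between the calls becomes a fixed offset in the merged data — worth checking against that 500 ms):

```python
import time

def capture_reference():
    """Sample both time bases back to back; the gap between the two calls
    is the residual alignment error of this scheme."""
    tick_ms = time.monotonic_ns() // 1_000_000   # stand-in for GetTickCount
    epoch_s = time.time()                        # stand-in for GetDateTime
    return tick_ms, epoch_s

def tick_to_absolute(event_tick_ms, ref_tick_ms, ref_epoch_s):
    """Convert a tick-counter event time into the timestamp time base."""
    return ref_epoch_s + (event_tick_ms - ref_tick_ms) / 1000.0
```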

  5. I have a strange problem with an Agilent 8504B instrument.

    Here is what I do:

    • Open VISA resource
    • Clear VISA buffers
    • Write two commands in one message (Data Format, and Send Plot Data)
    • Read until all plot data has been received
    • Clear VISA buffers
    • Close VISA resource

    This works perfectly the first time I execute this.

    Sometimes it even works multiple times.

    Mostly, though, it hangs the second time I execute it. Specifically, it hangs in the VISA Write command. On the instrument I can see that it is stuck in listening mode. I can wait forever and nothing happens; the VISA Write does not time out. When I try to close my program, LV hangs. If instead I power cycle the instrument, the VISA Write function exits with an error. After the power cycle, everything works again, but only once.

    I replaced the VISA API with the GPIB API and got the same results.

    Any ideas what could be happening here?
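    Not a fix for the hang, but the "read until all plot data has been received" step is less likely to block forever if every iteration can bail out. A Python sketch of such a loop, independent of any instrument bus (read_chunk and done are placeholders for the actual VISA Read call and the end-of-data test):

```python
def read_until(read_chunk, done, max_iters=10_000):
    """Accumulate chunks from read_chunk() until done(buffer) reports the
    transfer is complete; bail out instead of blocking forever."""
    buf = bytearray()
    for _ in range(max_iters):
        chunk = read_chunk()
        if not chunk:                      # empty read: treat as a timeout
            break
        buf.extend(chunk)
        if done(bytes(buf)):
            return bytes(buf)
    raise TimeoutError("transfer did not complete")
```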

  6. The way I have designed my application now may seem a little convoluted, but it serves multiple purposes:

    • I have a main controller QMH (Queued Message Handler) on the RT, which keeps all the information needed by the control system.
    • I have a TCP/IP Server loop which just sits there waiting for connections.
    • When it gets a connection it spawns a worker task (another QMH) that takes the TCP data, formats it into the QMH message type, and puts it into the controller queue along with the name of its own queue.
    • The controller QMH interprets the message, and puts the response onto the TCP Worker queue, which sends it via TCP/IP back to the client.

    This allows for concurrent connections from multiple web services. In addition the TCP/IP interface can also be used to interact with the controller from the host PC. All through one simple interface.
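    The message flow in the list above can be sketched in a few lines of Python (the queues stand in for LabVIEW named queues, and the "ack" handling is a placeholder; in the real system the worker's request and response travel over TCP):

```python
import queue
import threading

def controller(ctrl_q):
    """Main controller QMH: interpret each message, reply on the sender's queue."""
    while True:
        msg, reply_q = ctrl_q.get()
        if msg == "shutdown":
            break
        reply_q.put(f"ack:{msg}")          # stand-in for real message handling

def worker(ctrl_q, request):
    """TCP worker QMH: forward the request together with its own reply queue."""
    my_q = queue.Queue()
    ctrl_q.put((request, my_q))
    return my_q.get(timeout=1)             # response goes back over TCP here

ctrl_q = queue.Queue()
threading.Thread(target=controller, args=(ctrl_q,), daemon=True).start()
resp = worker(ctrl_q, "read_setpoint")     # each connection gets its own worker
ctrl_q.put(("shutdown", queue.Queue()))
```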

  7. I find myself re-doing the same steps very often when I create new projects, libraries, and classes. E.g. when I create a new library the first thing I do is add a private and a public folder and set the access scope. It would save me a lot of time if I could change the template used in the 'New Library' action to conform to what I want.

    Yes, I do know that I can create my own templates and store them in the templates directory, but then I have to go through the 'New...' dialog. I don't mind that for more elaborate templates that I might use from time to time but for things that I use a lot it is more convenient to right-click in the proper location in the project and select 'New -> Library'.

    Any ideas?

  8. I am designing a control system that will run on a cRIO (or sbRIO). The Main Instance will run the control system and communicate with the Host PC, however, there will also be an iPod Touch based user interface that will communicate with the cRIO using a Web Service that is deployed on the cRIO.

    Since the Web Service will have to modify some of the data used in the Main Instance I was wondering what the best way of exchanging data between these two application instances would be.

    If this were two VIs in the same Application Instance I would use queues and notifiers but is that possible across instances?

    I guess I could use network shared variables deployed on the cRIO but I was wondering if there was another method.

  9. Hi folks,

    I have used LabVOOP before, mainly to implement instrument drivers with base and child classes.

    Now I am looking into creating my first LabVOOP architecture and I have a few questions about some implementation details regarding the by-value functionality of LabVOOP.

    Say I have an object called Farm that contains an object called ListOfPigs, which handles an array of Pig objects. Now the Pig class has a method called RecordPiglets(BornAlive, StillBorn, Mummified).

    In C# I would call that function like this: Farm.Pig[tag].RecordPiglets(7,0,0)

    I have no problem getting to the Farm.Pig[tag] point in LabVOOP, i.e. extracting the Pig object in question. But if I now run the RecordPiglets method on that object I will have to update the Pig object in the ListOfPigs object, and then update that object in the Farm object.

    I could do that by creating an UpdateObject method for every GetObject method in every class. However, that seems a bit unwieldy. Is there an easier way?
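    For comparison, the get/modify/update copy-back pattern looks like this in text form — a Python sketch using immutable objects to mimic LabVIEW's by-value semantics (class and method names are taken from the example above; the record layout is made up):

```python
from dataclasses import dataclass, field, replace

@dataclass(frozen=True)
class Pig:
    tag: str
    piglets: tuple = ()       # (born_alive, still_born, mummified) records

    def record_piglets(self, born_alive, still_born, mummified):
        # By-value: returns a NEW Pig rather than mutating in place.
        return replace(self, piglets=self.piglets + ((born_alive, still_born, mummified),))

@dataclass(frozen=True)
class Farm:
    pigs: dict = field(default_factory=dict)   # tag -> Pig (the ListOfPigs role)

    def with_pig(self, pig):
        """The 'UpdateObject' step: write the modified Pig back into the Farm."""
        pigs = dict(self.pigs)
        pigs[pig.tag] = pig
        return replace(self, pigs=pigs)

farm = Farm({"p1": Pig("p1")})
# Get the Pig, run the method, then copy the result back up the hierarchy:
farm = farm.with_pig(farm.pigs["p1"].record_piglets(7, 0, 0))
```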

    Thanks for your help,

    Heiko

  10. That is not what this right-click pop-up option is about. Scale Fit, if I'm not mistaken, is the autoscaling. Auto Adjust Scales refers to the scale area (and, implied by that, the plot area) being resized when the scale needs more space to be displayed. This is on by default but can be switched off with the right-click pop-up menu. Up to 8.6 there is apparently no property to switch this on or off programmatically.

    Rolf Kalbermatter

    Sorry, I thought you were talking about the auto scaling. If the auto-scaling is turned off the scale markers should not change and therefore the left edge of the plot should not move.

    Never knew about the 'Auto Adjust Scales' option. And you are correct, there is no property for it in LV2009 either.

  11. Yeah, that's the only thing that works so far. It'd be nice if there was a property for that though.

    George

    You can do that programmatically. You can set the format string for the scale markers: Y Scale > Format String

    In newer versions of LabVIEW there should be a property called something like "Auto Adjust Scales". This is a Boolean and can be set on or off. You can also set that property in the graph's right-click pop-up menu under Advanced.

    EDIT: Seems I can't find that property. Probably must have dreamed about it. :blink:

    You can find the autoscale property here: Y Scale > Scale Fit

  12. I am working a lot with multi-column listboxes and tables lately and was wondering whether somebody knew of a way to set cell colour (or other cell properties) for all 'body' cells at once.

    If I use -2,-2 for ActiveCell I can set properties for all cells including row and column headers, but is there a way to select all cells except the headers?

    Currently I read the properties for the headers, set the new properties for all cells, and then re-set the header properties.

    Just looking for a simpler way :rolleyes:

  13. Thanks for the suggestions.

    cFP is definitely too expensive for our remote monitoring.

    We are considering RT-ETS on a PC, however, once you have paid for the RT license and the PC you are in the same range as a sbRIO.

    We will not go wireless for any of the acquisition modules. We might go wireless for an iPod web app user interface. We have done that for a different application and it works well.

    Thanks for the Acromag link. I have found a few similar products in the $400-$500 range.

    There really seems to be a lack of DAQ equipment for low sample rates. I don't need MS/s; I could do with one sample every second. I am measuring the level of feed in a system feeding livestock. They don't feed that fast :-)

    If the remote stations were not as far away I would consider the low-end NI USB DAQ like the 6008. But once you put a range extender for USB onto it, you're back in the $500 range.

    The reason we are looking towards LV RT is that the customer should have as little chance as possible to mess things up, i.e. close the program, reboot the PC, etc. We don't really need the real-time part of it.

    If we abandoned the idea of remote acquisition over Ethernet, would there be a way to condition a voltage signal (more or less constant) so that it can be reliably read from a distance of 600 ft (200 m)?

    Thanks,

    Heiko

  14. I am working on a project that will require a central controller with some analog inputs and digital IO, but also two to four remote 8- to 20-channel AI devices. For the central controller we are considering cRIO or sbRIO. The remote DAQ devices could be up to 200 m from the central controller. All they need to do is read voltages on demand. No high speed required. However, I have to be able to talk to them from LV RT.

    I would welcome any suggestions.

    Cheers,

    Heiko

  15. QUOTE (menghuihantang @ Feb 25 2009, 12:29 PM)

    Basic I & II and some practice =====> CLAD

    Intermediate I&II and experience =====> CLD

    This is the way NI has set up the exams. When I took my CLD exam a few years back they wanted you to program something similar to the course project in Intermediate I, albeit not quite as complex.

    Just remember to program your code the way NI wants people to code, which is not necessarily the way most people (even good ones) actually code.

    I am still not over their taking points from me for using the Tick Counter in my timing function. And yes, I do know that it will roll over. But that only matters once it has rolled over twice, which gives you about 50 days to work with. Much more than was required in the exam. :angry:
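    For what it's worth, the rollover-safe subtraction is one line of modular arithmetic. A Python sketch: with unsigned wraparound, a single rollover between the two reads is handled automatically, and only intervals longer than one full wrap of a 32-bit millisecond counter (about 49.7 days) break — which matches the "about 50 days" figure:

```python
def elapsed_ms(start_tick, end_tick, width=32):
    """Elapsed milliseconds between two unsigned tick-counter reads.
    Modular subtraction stays correct across a single rollover; only
    intervals longer than 2**width ms (~49.7 days at 32 bits) are ambiguous."""
    return (end_tick - start_tick) % (1 << width)
```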

    Same goes for the multiple choice questions. The correct answer is the one NI wants to hear, not necessarily the one you or other people think is right.

    The Advanced course is supposed to prepare you for the CLA exam.

    Hope that helps.

    Heiko

  16. Here is what I want to do:

    I have 240 records (cluster of data), which I want to randomly access and modify.

    The accessing part is easy, but every time I try to change one of the records a new record is added to the end of the file. I do set the file position before calling the write function, but I just noticed that the help for the write function says the file pointer is always set to the end of the file before writing.

    Is there a way to avoid that or will I have to go with my own custom binary format?
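    One way around the append behaviour is the custom-format route: a flat binary file of fixed-size records, where any record can be overwritten in place by seeking to index × record size. A Python sketch of the idea (the record layout is invented for illustration):

```python
import struct

# Hypothetical fixed-size record: id (i32), level (f64), 16-byte name = 28 bytes.
REC = struct.Struct("<id16s")

def write_record(f, index, rec):
    """Overwrite record `index` in place -- nothing is ever appended."""
    f.seek(index * REC.size)
    f.write(REC.pack(*rec))

def read_record(f, index):
    """Random access: seek straight to the record and unpack it."""
    f.seek(index * REC.size)
    return REC.unpack(f.read(REC.size))
```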

    Cheers,

    Heiko

  17. Hi,

    I was wondering if anybody had written an implementation of an NTP (Network Time Protocol) client in LabVIEW.

    I am trying to figure out the best way to synchronize the RTC on the Blackfin processor to the network time.

    Naturally I could just send a request to my host PC and get the time from there but there will always be the delay involved.

    From what I've read in the RFC for NTP this delay is taken into account.
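    For reference, the delay compensation in the RFC boils down to four timestamps and two formulas. A Python sketch (t1..t4 as defined in RFC 2030; the offset estimate assumes a roughly symmetric network path):

```python
def ntp_offset_delay(t1, t2, t3, t4):
    """Clock offset and round-trip delay from the four SNTP timestamps:
    t1 = client send, t2 = server receive, t3 = server send,
    t4 = client receive, all in seconds."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0   # how far the local clock is off
    delay = (t4 - t1) - (t3 - t2)            # round trip minus server turnaround
    return offset, delay
```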

    Cheers,

    Heiko
