Posts posted by John Lokanis

  1. OK, tried this in my actual application.  It didn't work.  It might be because my application has this VI as a floating window, or it might be some other effect.

    I tried making the example VI floating as well and found that sometimes the key focus did not work.  It seems to be flaky.

    Does it matter if I set this from a subVI by passing in the reference to the string element of the array?

    Also, does it matter if I initialize the array with empty strings after I set the key focus?

  2. Thanks, Darren.  I didn't know you had to get a property node for the string in the array; I figured that Array Element.Key Focus would be the same thing.

    BTW: the panel activation is not needed, apparently.  I got it to work without it.

    One issue: it seems to put the focus on the last element you clicked in before running the VI.  So, if you click in element 3 while in edit mode and then end text entry without typing anything, the third element will have the cursor in it when you run the VI.

    Is there any way to control which element in the array gets the key focus?

     

    And is any of this documented anywhere (the bit about getting the reference to the element control)?

  3. I am trying (and failing) to set the key focus to the first element of an array control.  I want my dialog window to appear and have the cursor placed in the first element of a specific array control so the user can just start typing without first having to click on the element in the array.

    This seems like it would be a simple thing to do, so I am hoping that I am just making some bonehead mistake.

    I have tried setting Key Focus, Array Element Key Focus, Selection Start, and Selection Size.  None of them worked.

     

    Anyone know if this is possible?

     

    I have attached an example VI with my attempts.

    array key focus.vi

     

    thanks for any wisdom...

     

    -John

  4. Just to be clear, I am using the "Mouse Down?" event to perform the sort.  I do not have custom code to implement the resize.  That is provided by the MCLB control.

    So, if I were to use Mouse Move, how would I accomplish that?  The act of clicking anywhere on the control triggers the "Mouse Down?" event, and the sort runs if the mouse is in the header.  I need to suppress this if I am doing a resize.  If I use "Mouse Move", that will trigger every time the mouse changes position over the control, regardless of clicking.  How could I isolate that to suppress the sort while still allowing a click on the header to trigger the sort?

  5. I am stumped.  I created an MCLB that allows you to sort the data based on the column the user clicks on.  This is pretty simple: just trap the mouse down event and check if it is in the header row.  If so, find the column clicked, sort the data, then discard the event.  But then I wanted to add the ability to resize the columns; however, this did not work because the event would get discarded after doing the sort.  So I stopped discarding the events, but now I end up sorting the data every time I try to resize a column.  So, I need some way to detect whether I am hovering over a column separator (with the resize cursor displayed) and use that fact to suppress the sort on mouse down events.
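
    In text form, the logic I have now is roughly this (a minimal Python sketch of the event-structure logic; the names and the over_separator flag are made-up stand-ins, since that flag is exactly what I do not know how to get from the MCLB):

        HEADER_ROW = -1                      # stand-in for however the header row is identified
        data = [["b", "2"], ["a", "1"]]      # stand-in for the listbox contents

        def on_mouse_down(row, col, over_separator):
            """Return True to discard the event (the sort ran), False to pass it through."""
            if row == HEADER_ROW and not over_separator:
                data.sort(key=lambda r: r[col])   # sort by the clicked column
                return True                       # discarding here is what breaks the built-in resize
            return False                          # pass the event so the MCLB can handle resizing

        # A header click on column 0 that is not over a separator runs the sort.
        print(on_mouse_down(HEADER_ROW, 0, over_separator=False), data)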

    Any idea how to solve this?  Is it even possible?

     

    thanks for any insights...

     

    -John

  6. I hope to keep the memory footprint down, but since the application is a test system that simultaneously tests hundreds of DUTs in parallel (each DUT getting its own instance of a test executive), the data consumption can add up.

    The current system uses ~9 MB per DUT plus 66 MB of overhead for the whole system.  I suspect the new system will exceed this a bit.  So, assuming 100 MB of overhead and 10 MB per DUT, that puts me at 5.1 GB for 500 DUTs (that is my target maximum).
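
    (Worked out: 100 MB + 500 × 10 MB = 5,100 MB ≈ 5.1 GB, which is already beyond what a 32-bit process can address.)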

    So, it is possible that I could benefit from a larger memory space.  I need to get the new system completed and do some testing to confirm this.

  7. I currently develop my application on Windows 7 using 32-bit LabVIEW 2014.  The IT department wants me to deploy to VMs going forward, and they want the VM OS to be Windows Server 2012 R2 (64-bit).

     

    Does anyone use the 64-bit version of LabVIEW?  If so, what OS do you use?

     

    Are there any issues with developing in the 32-bit version of LabVIEW but compiling with the 64-bit version for releases?

     

    I want to stick with 32-bit for dev work because some things, like the Desktop Execution Trace Toolkit, Unit Test Framework, and VI Analyzer, are not available for the 64-bit version.

     

    My I/O is limited to NI-VISA for TCP/IP communication, PSP for talking to cRIO over Ethernet, and .NET calls for database and XML communication.  I do have some FieldPoint hardware that I talk to via DataSocket, but that could be moved to cRIO via PSP.  From what I can tell, all of that should work with 64-bit LabVIEW.

     

    The application has hundreds of parallel processes but does not collect large amounts of data, just lots of small chunks.  Would it benefit from a 64-bit environment?

     

    Also, the application is broken into two parts, a client and a server, and I use VI Server to communicate between the two across the network.  If the client is a 32-bit LabVIEW application, can it use VI Server to talk to a different 64-bit LabVIEW application?

     

    Thanks for any tips or feedback,

     

    -John

     

     

     

  8. Yes, I typically have 40-100 parallel subsystems, each with several threads (and at least one dedicated to .NET calls to a DB), running at the same time.  Normally a call to the DB via .NET executes in milliseconds, so there is no issue.  But lately the DB has been having issues with slow response and deadlocking, which I suspect is causing the .NET calls to hang for a long time and starving my LV code of clock cycles.

    And yes, all the timer code is LV.  Actually, it is a pure LV system outside of the .NET calls for DB access and some occasional XML reading.

     

    So, bumping the thread count seems like a good band-aid for the short term, without having to recompile the EXE.  Might changing the execution system from 'same as caller' to something else for the .NET code also help?  At some point I remember hearing that LV will use the other execution systems automatically if one is overloaded, but perhaps I am remembering that incorrectly.

     

    Anyone know the ini strings to adjust the execution system threads?

     

    Oh, and does it matter how many cores the machine has, or will the OS manage the threads across the cores on its own?

     

    thanks,

     

    -John

  9. I am seeing strange behavior in some of my code that looks like thread starvation.  I have some code that checks the timer, enters a loop, and then at each iteration of the loop (which should take 1 second or less) checks the timer to see if it has exceeded the allowed time.  Occasionally, it will time out with a duration that exceeds the limit by several minutes.  It is as if the VI was frozen and could not iterate the loop for a long time.
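
    In text-language terms, the pattern is roughly this (a minimal Python sketch standing in for the G code; the timeout value and the "work" are illustrative stand-ins):

        import time

        TIMEOUT_S = 5.0                          # allowed time (illustrative; the real limit is larger)

        def do_one_iteration_of_work():
            time.sleep(1.0)                      # stand-in for the real work, which takes ~1 s or less

        start = time.monotonic()                 # read the timer once up front
        while True:
            do_one_iteration_of_work()
            elapsed = time.monotonic() - start   # re-check the timer every iteration
            if elapsed > TIMEOUT_S:
                print(f"timed out after {elapsed:.1f} s")   # should overshoot by at most ~1 s
                break

    What I actually see is the equivalent of that elapsed value coming back minutes past the limit, as if the loop never got a chance to run in between.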

    The same application has a lot of .NET calls to a database going on in parallel operations.  I am suspecting that those calls are taking a long time and stealing all the threads.

    I don't have any ini settings that change the thread count to something other than the default and all my VIs use the 'Same as Caller' execution system.

     

    So, my question is: how many simultaneous .NET calls can I execute before all my VIs stop getting any CPU cycles?  From the research I have done, it looks like 8, but I am not sure if that is correct for LV2011.

    Would it help to set the threads per execution system to a larger number in the ini file?

    Would it help to make my VIs that call .NET execute in a different execution system?

     

    This application was built in LV2011 so I need to stick to that environment for now.

     

    thanks for any ideas.

     

    -John

  10. It's not that I didn't like it.  I am still trying to decide what path to take overall.  I just want maximum simplicity and maximum functionality.  Somewhere they intersect and I hope to figure out where for my needs.

    I did discuss the different options with Allen Smith at NI Week, and the conclusion was that, for the command pattern across a network, the abstraction (or interface) design was the cleanest, or at least the easiest to understand.

  11.  

    "Hi John, are you designing a generic framework, or a specific application?"

    Both, actually.  I created a framework that is generic, but has features I need for a few specific applications, like network messaging and subscriptions.  Then I started implementing a specific application with the framework.  That is where I have observed the dependency issue.

     

    "Yes, the Class A tests will also cause Class B to be loaded and launched, but your test harness for Class A doesn't know that Class B is involved at all.  Some might classify the Class A test as an Integration test rather than a Unit test, but the bottom line is that you can still treat Class A as an isolated black box."

    I agree, and if I could limit it to just a few classes, that would be fine.  But these links build up quickly, to the point now where if I load Actor B to test it, it loads A and C-Z!  To paraphrase that old STD PSA from the '80s and '90s: if you are statically linked to one actor, you are linked to every actor they are linked to, and to every actor those other actors are linked to.

    I can actually open a new project, add one of my actor classes and watch as LabVIEW loads 80% of the Actors (and many of their messages) into the dependency list.

     

    I guess the real question is: should I care?  Is this dependency issue just annoying, due to IDE slowness and the difficulty of testing and developing in isolation, or is it a fundamental flaw in the command-pattern message approach that makes it unworkable for anything other than small, simple systems?

    "You're right, the total number of classes increases.  I should have written '..reduce the number of classes in your current project..'.  You'll of course have to implement the Actor in another project."

    This is basically how I solve the issue when sending messages between two applications.  But the level of complexity and the extra classes it introduces would make the project even more difficult to deal with.  In reality, I would have to create an interface for every Actor and every message to isolate them.  It has already become difficult to follow the execution flow, with no way to visualize the overall application other than how I arrange the project.  Adding abstract classes on top of everything would make that worse and lead to maintenance issues down the road.
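
    For what it's worth, the interface idea in text form would be roughly this (a minimal Python sketch standing in for the LVOOP classes; all names are hypothetical):

        from abc import ABC, abstractmethod

        class StartTestMsg(ABC):
            """Abstract message the sender depends on; it knows nothing about Actor B."""
            @abstractmethod
            def do(self) -> None: ...

        class ActorA:
            def run(self, msg: StartTestMsg) -> None:
                msg.do()                 # A is statically linked only to the abstract class

        class StartTestOnB(StartTestMsg):
            """Concrete message that lives with Actor B, so loading A does not drag in B's tree."""
            def do(self) -> None:
                print("Actor B starts a test")

        ActorA().run(StartTestOnB())

    Doing that for every Actor and every message is the part that worries me.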

     

    I am beginning to think there are many solutions, but maybe no good ones.  I am surprised that no AF people have stepped up and defended the use of the command pattern.  I would like to hear their thoughts.
