
Re-establishing TestStand Database Connections: Does anyone know exactly how TestStand maintains its database connection?



TestStand Version(s) Used: 2010 through 2016
Windows (7 & 10)
Database: MS SQL Server (v?)

Note: The database connection I'm referring to is what's configured in "Configure > Result Processing", (or equivalent location in older versions).

Based on some issues we've been having with nearly all of our TestStand-based production testers, I'm assuming that TestStand opens the configured database connection when the sequence is run, and maintains that same connection for all subsequent UUTs tested until the sequence is stopped/terminated/aborted.  However, I'm not sure of this and would like someone to either confirm or correct this assumption. 

The problem we're having: nearly all of our TestStand-based production testers have been intermittently losing their database connections, returning an error (usually after the PASS/FAIL banner). I'm not sure if it's a TestStand issue or an issue with the database itself. The operator, seeing and caring only about whether the UUT passed or failed, ignores the error message and soldiers on. Testers at the next higher assembly then fail their own test because they can't find a passed record of the sub-assembly's serial number in the database.

We've asked the operators to let us know when the error occurs, re-test the UUT, or restart TestStand (the option that usually works), but these instructions are often forgotten or ignored.

The operators do not stop the test sequence when they go home for the evening/weekend/etc., so TestStand is normally left waiting for them to enter the serial number of the next device to test.  I'm assuming the connection to the database is still open during this time.  If so, it's almost as though MS SQL Server has been configured to terminate idle connections, or something happens on the SQL Server side and the connection isn't properly closed or re-established.

Our LabVIEW-based testers don't appear to have this problem unless there really is an issue with the database server.  The majority of these testers, I believe, open > write > close their database connections at the completion of a unit test.

I'm currently looking into writing my own routine, called from the "Log to Database" callback, which would open > write > close the database connection.  But I wanted to check whether anyone more knowledgeable had any insight before I spend time on something that may have an easy fix.
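For what it's worth, the open > write > close pattern those LabVIEW testers use can be sketched in a few lines (Python with SQLite as a stand-in for SQL Server; the table and column names here are made up purely for illustration):

```python
import sqlite3

def log_result(db_path, serial, status):
    """Open a fresh connection, write one result row, and close.

    Because the connection is opened per UUT, a stale or idle
    connection can never carry over into the next test run.
    """
    conn = sqlite3.connect(db_path)
    try:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS uut_result (serial TEXT, status TEXT)"
        )
        conn.execute(
            "INSERT INTO uut_result (serial, status) VALUES (?, ?)",
            (serial, status),
        )
        conn.commit()
    finally:
        conn.close()  # the connection never outlives the write
```

The trade-off is a small per-UUT connect cost in exchange for never holding a handle that the server (or a network "hiccup") can silently invalidate overnight.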

Thanks all!

Edited by Bryan

I figured I would post a resolution to my own problem for anyone who may run into the same thing. 

While searching for something unrelated today, I stumbled across my answer.  TestStand creates a connection to the configured database when the first UUT is tested and maintains that connection while UUTs are continuously tested.  The connection is released only when the sequence is stopped.  I was able to come up with my solution by essentially compiling what I found in my searches.

In my situation, "hiccups" in network connectivity or with the SQL server cause that connection to become invalid.  This is why we get database logging errors and lose logged data.  I had read about the offline processing utility, but I wanted something simpler, such as a LabVIEW test step I could drop into a sequence to handle things gracefully, instead of setting up offline processing configurations on all of the testers.

After the first time the "Log to Database" callback runs, the datalink object becomes available at "RunState.Execution.TSDatabaseLoggingDatalink{GUID}" (the GUID will vary).

My solution (in my testbed so far): I created a VI that locates that object and writes a null value (an empty variant constant) to it.  This appears to force TestStand to re-establish the database connection when each subsequent UUT is tested.  It ensures that when the test sequence sits in limbo for hours or days (plenty of time for DB/network "hiccups"), the data will still be logged the next time a UUT is run, assuming the DB and network are up when called.  I have yet to validate this method on one of the actual testers, but it seems promising in my "simulated environment" (we all know how that sometimes goes).  I should also mention that this should go at the end of the "Log To Database" sequence callback.
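For anyone outside TestStand wanting the same behavior, the idea can be sketched as a lazy-reconnect wrapper (Python with SQLite standing in for SQL Server; the class and method names are mine, purely for illustration, not anything from the TestStand API):

```python
import sqlite3

class ResultLogger:
    """Caches a connection while UUTs run, but can be invalidated.

    Setting the cached handle back to None (the rough analogue of
    writing Nothing to the TestStand datalink) forces the next
    write to re-establish the connection from scratch.
    """
    def __init__(self, db_path):
        self.db_path = db_path
        self._conn = None  # no connection until the first write

    def log(self, serial, status):
        if self._conn is None:  # (re)connect on demand
            self._conn = sqlite3.connect(self.db_path)
            self._conn.execute(
                "CREATE TABLE IF NOT EXISTS uut_result (serial TEXT, status TEXT)"
            )
        self._conn.execute(
            "INSERT INTO uut_result VALUES (?, ?)", (serial, status)
        )
        self._conn.commit()

    def invalidate(self):
        """Drop the cached connection; the next log() reopens it."""
        if self._conn is not None:
            self._conn.close()
        self._conn = None
```

Calling invalidate() after each write is the rough equivalent of nulling the datalink at the end of the callback: the connection is rebuilt on the next log, so an idle overnight "hiccup" can't leave a dead cached handle behind.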

I wish I could share the VI that I created, but I'm not sure of my company's policy on doing so.  My description above should be enough to get people on the right path.


I just found out this morning that "RunState.Execution.TSDatabaseLoggingDatalink{GUID}" isn't a valid location in TestStand 2010 (and I'm not sure about other versions).  For 2010 at least, it is "RunState.Execution.TSDatabaseLoggingDatalink{Database Schema Name}".

After a lot of futzing around yesterday, I simply added a statement step with an expression after the "Log to Database" step in the "Log to Database" callback sequence.  Note: I didn't want to create a new local variable in the sequences of each tester, so I used Step.Result.Status to temporarily hold my value (probably not good practice, haha):

Expression that works for TestStand 2010 (not sure about other versions), using Parameters.DatabaseOptions.DatabaseSchema.Name and replacing all spaces with "_":

// Assign the schema name to our step status as a temporary local variable (Evaluate won't work unless this is done, for some reason)
Step.Result.Status = Parameters.DatabaseOptions.DatabaseSchema.Name,

// Replace all spaces with "_" in the schema name and build the path to the datalink object... then set its value to Nothing (destroys the DB reference, forcing TestStand to re-open it on the next UUT run)
Evaluate("RunState.Execution.TSDatabaseLoggingDatalink" + SearchAndReplace(Step.Result.Status," ","_") + "= Nothing" ),

// Restore the step status to what it would normally be if we hadn't used it as a temporary local variable
Step.Result.Status = "Done"


Expression that works for at LEAST TestStand 2013 and 2016, using Parameters.ModelPlugin.Base.GUID and replacing all "-" with "_":

// Assign the GUID to our step status as a temporary local variable (Evaluate won't work unless this is done, for some reason)
Step.Result.Status = Parameters.ModelPlugin.Base.GUID,

// Replace all "-" with "_" in the GUID and build the path to the datalink object... then set its value to Nothing (destroys the DB reference, forcing TestStand to re-open it on the next UUT run)
Evaluate("RunState.Execution.TSDatabaseLoggingDatalink" + SearchAndReplace(Step.Result.Status,"-","_") + "= Nothing" ),

// Restore the step status to what it would normally be if we hadn't used it as a temporary local variable
Step.Result.Status = "Done"


Edit: I found today that one of our TS2016 installations uses the same method I posted for TS2010 above.  The DatabaseSchema.Name property had a combination of periods, spaces, and dashes, all of which I had to SearchAndReplace with underscores.  I'm not currently sure what's causing the inconsistencies between versions (GUID vs. schema name), but so far I've had to tweak each one on a case-by-case basis.
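Based on the cases above, building the lookup path amounts to swapping periods, spaces, and dashes for underscores and appending the result to the datalink prefix. A quick sketch of the equivalent string handling (the replacement set is inferred from my observations above, not from any NI documentation, and the sample inputs are made up):

```python
def datalink_property_name(raw):
    """Build the TSDatabaseLoggingDatalink property name from a schema
    name or GUID by swapping '.', ' ', and '-' for underscores."""
    cleaned = raw
    for ch in (".", " ", "-"):
        cleaned = cleaned.replace(ch, "_")
    return "RunState.Execution.TSDatabaseLoggingDatalink" + cleaned
```

Doing all three replacements unconditionally covers both the GUID case (dashes only) and the schema-name case (any mix of the three), so one expression could serve every tester regardless of which property a given TestStand version uses.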



