I am looking to determine the time it takes to insert each batch (500 rows) of data into a database, i.e. the loop execution time. See the block diagram below.
I've placed a timer outside the while loop and subtracted its value from the timer inside the loop, using shift registers to carry forward the start time. The outer timer sits inside a sequence structure to make sure it executes before the rest of the code runs.
When I tried this method on a simple example (a while loop containing only a Wait function), the loop execution time matched the wait time as expected. But in the database application, the loop execution time keeps increasing with every iteration. Where am I going wrong?
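In case it's clearer in text, here is roughly what my block diagram does, sketched in Python. time.monotonic() stands in for the Tick Count (ms) primitive, and insert_batch is just a placeholder for the real database write:

```python
import time

def insert_batch(rows):
    # Placeholder for the real database insert of one 500-row batch
    time.sleep(0.1)

batches = [list(range(500)) for _ in range(5)]  # dummy data, 5 batches

start = time.monotonic()        # timer outside the loop (in the sequence structure)
for batch in batches:           # the while loop
    insert_batch(batch)
    # timer inside the loop minus the carried-forward start time
    elapsed = time.monotonic() - start
    print(f"loop execution time: {elapsed:.3f} s")  # this value keeps increasing
```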
Thanks in advance,
Lisa