
MySQL and LabVIEW performance issues



Hi!

I have built a VI to acquire and save measurement data to a MySQL database. The data consists of 60 ints and floats per sample, saved using the DB Tools Insert Data.vi. There is no problem with the functionality, except for the speed: when I enable saving to the database, my software loop time increases from 8 ms to ~50 ms; when I turn saving off, the loop time drops back to 8 ms. The CPU meter never goes above 15% of maximum. MySQL and the LabVIEW program run on the same computer. I have LabVIEW 8.2.1 and the Database Connectivity Toolkit.

So, I have a couple of questions.

1. Do you think I have overestimated the feasibility of using LabVIEW and MySQL for this type of application (60 ints/floats @ 100 Hz)?

2. Do you have any tips for increasing the performance of my SQL INSERT statement?

I will run the control system for several hours, so there will be quite a lot of data in the end. That's why I would like to use a database. The database also gives me other advantages compared to writing to a text file, e.g. searching and extracting data from other computers during runtime.

I'm quite new to this type of application, so any advice will be gratefully accepted.

Best regards,

Mattias Ottosson

Link to comment

Hmm, high-speed streaming to a database. I've never been down that path; I haven't been using the database toolkit either. I just use optimized SQL statements for the task at hand and call the ADO functions to execute a query. I think there is some overhead in the database toolkit, but I can't check what the VI you are calling does at the moment. I thought I had the toolkit installed, but it appears to be gone from my palette.

Link to comment

Your database performance will be a function of several factors, including:

1) Database structure -- Indices slow down writes but speed up reads, so for the highest write speed you want as minimal a set of keys as possible (a primary key is important; foreign keys add overhead and should be avoided). The table structure itself can be optimized for your needs. Also, disk space allocation can be set up to auto-grow as the table grows; make sure your settings don't cause the table to be constantly growing. You want to grow in chunks, not continuously; continuous growth is inefficient.

2) Writes -- There are faster and slower ways of writing to the database. In my experience, gains realized when writing data are typically lost when reading it back. Since writing happens once while reading happens over and over, focusing only on the write is usually a bad design decision.

3) DB Toolkit usage -- The LV toolkit has some optimization steps you can take, such as executing with 'forward only' cursors. You need to get into the nitty-gritty, low-level VIs and understand which settings apply to your application.

4) Application structure -- Keep in mind that one way to maintain your acquisition speed is to acquire data in one loop and write it to the database asynchronously in another loop (or program, or CPU, etc.). So application structure matters for the 'performance' of your application (both writing data and maintaining a high DAQ speed).

Link to comment

Usually the words "high-speed" and "database" don't go in the same sentence. I'd highly recommend splitting the control/deterministic code and the database-writing code into separate loops. Use some sort of queuing/caching/data-buffering mechanism to collect the data while the database code writes at its own pace. Another possibility is to stream the data to disk in binary form and then have a separate loop read that file and write it to the database.

As others have pointed out, you can try to streamline your SQL code or write a stored procedure for writing the data to the database, and you can design the tables to be more efficient. However, I think the best and quickest route is to separate the two operations and let each go as quickly as it can. The database writing will be slower, but at least it won't slow down the rest of your code.
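The loop split described above can be sketched as a producer/consumer pair. This Python sketch is illustrative only (the names, the fake samples, and the batch policy are not from the thread): the fast loop only enqueues, while the slow loop drains whatever has accumulated and "writes" it as one batch, mimicking a bulk INSERT.

```python
# Producer/consumer sketch: the acquisition loop stays fast because it only
# enqueues samples; the slower "database" loop drains the queue in batches.
import queue
import threading

sample_queue = queue.Queue()
STOP = object()  # sentinel used to shut the consumer down

def acquisition_loop():
    # Stand-in for the 8 ms DAQ loop: just enqueue 100 fake samples.
    for i in range(100):
        sample_queue.put((i, float(i) * 1.5))
    sample_queue.put(STOP)

batches_written = []

def database_loop():
    # Drain everything currently queued and record it as one batch,
    # mimicking one bulk INSERT instead of one INSERT per sample.
    done = False
    while not done:
        batch = [sample_queue.get()]  # block until at least one item arrives
        while True:
            try:
                batch.append(sample_queue.get_nowait())
            except queue.Empty:
                break
        if STOP in batch:
            batch.remove(STOP)
            done = True
        if batch:
            batches_written.append(batch)

producer = threading.Thread(target=acquisition_loop)
consumer = threading.Thread(target=database_loop)
producer.start()
consumer.start()
producer.join()
consumer.join()

total = sum(len(b) for b in batches_written)
print(total)  # all 100 samples reach the "database" side
```

The number of batches varies with timing, but every sample arrives exactly once, which is the property the DAQ side cares about.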

Link to comment

QUOTE(tmot @ Oct 23 2007, 12:57 AM)


You can speed up the database write by using bulk INSERT statements. This can be done in two ways.

Method 1:

=======

1) Keep two loops. One loop logs the data into a text file; write all your values in a comma- or delimiter-separated format.

2) In the other loop, bulk-insert the CSV file into the database using a bulk-insert-from-file command (LOAD DATA INFILE in MySQL). This loop can run at a very slow rate.

Method 2:

=======

1) Keep two loops, one for your control/data acquisition. In that loop, enqueue the required data into a queue.

2) In the other loop, dequeue all the elements in the queue and form an INSERT statement like the one below:

INSERT INTO myTable VALUES (1,'Y',36.00),(2,'N',57.00),(row3),(row4),...,(row n)

Then execute this statement. This loop can run at a very slow rate.

Note: Method 2 is supported by MySQL but not by SQL Server Express.

In my system, 500 individual INSERT statements take 33 seconds to execute, but a single INSERT statement with 500 rows takes less than 300 milliseconds.
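Method 2's multi-row INSERT can be sketched in a few lines. This uses Python's built-in sqlite3 as a stand-in for MySQL (the multi-row VALUES syntax is the same in both); the table layout is made up for the example.

```python
# Build one INSERT with a VALUES list covering all queued rows, instead of
# executing one INSERT per row. sqlite3 stands in for MySQL here.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE myTable (id INTEGER, flag TEXT, val REAL)")

rows = [(1, 'Y', 36.00), (2, 'N', 57.00), (3, 'Y', 12.50)]

# One statement, many rows: VALUES (?,?,?),(?,?,?),... as in the post above.
placeholders = ",".join(["(?,?,?)"] * len(rows))
flat = [v for row in rows for v in row]
conn.execute("INSERT INTO myTable VALUES " + placeholders, flat)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM myTable").fetchone()[0]
print(count)  # 3
```

The speedup reported above comes from paying the per-statement overhead (parsing, round trip, commit) once per batch instead of once per row.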

Link to comment

QUOTE(siva @ Oct 23 2007, 10:12 AM)


If the table contains key constraints and the data grows at a fast rate, then it is better to run the database software on another machine. That way the database can use its resources efficiently, checking the constraints on row insertion without affecting the control/data-acquisition process. However, the stability of this approach depends on the network traffic.

Link to comment

Another way to speed up the inserts a little is to prepare the INSERT statement once with "DB Tools Create Parameterized Query.vi". Then, each time you want to execute the insert, use the command reference of this parameterized query: set the values with "DB Tools Set Parameter Value.vi" (for each value) and then call "DB Tools Execute Query.vi".

The insert statement for "DB Tools Create Parameterized Query.vi" has to be like "INSERT INTO TABLEXY (COLA, COLB, COLC) VALUES (?, ?, ?)".

The advantage is that the INSERT statement is prepared only once and kept in the database server in a binary representation. Hence, the server can execute the statement faster, because it doesn't have to validate and compile it each time.

I agree with the others that it is necessary to separate the DAQ task from the database insert task in some way.
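The prepare-once, execute-many pattern above can be sketched with Python's sqlite3 module standing in for the Database Connectivity Toolkit VIs (the "?" placeholder syntax matches the INSERT shown above; table and column names are taken from that example).

```python
# Prepared-statement sketch: build the parameterized INSERT once, then
# execute it repeatedly with new values instead of rebuilding SQL text.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TABLEXY (COLA INTEGER, COLB TEXT, COLC REAL)")

insert_sql = "INSERT INTO TABLEXY (COLA, COLB, COLC) VALUES (?, ?, ?)"

# Reusing the same statement text lets the driver cache the compiled
# statement, so only the parameter values change per execution.
for row in [(1, "a", 0.5), (2, "b", 1.5), (3, "c", 2.5)]:
    conn.execute(insert_sql, row)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM TABLEXY").fetchone()[0]
print(count)  # 3
```

Parameterized statements also avoid quoting/escaping bugs that come from concatenating values into SQL strings.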

Link to comment

One other suggestion: Since your app and the db are on the same computer, use a "shared memory" protocol instead of TCP/IP if you haven't already.

Also as mentioned above,

1) Separate DAQ and DB Loops

2) Initialize connections to DB outside loop

3) Use stored procedure for inserts

Can you post a snapshot of the code, and a SQL script of the CREATE DATABASE commands?

Link to comment

First of all, thank you very much for your time. All of you have been a really good support to me.

I have now attached the front panel and block diagram of my test VI. I have implemented the queue system with separate producer and consumer loops. However, since the producer enqueues values faster than the consumer loop can write them, the queue builds up quite fast and the disk starts working.

The test database table that I add data to is created by a simple:

create table test(aa int, bb char(15));

...I'm sure that this can be improved in some way.

I always open and close the connection to the database outside the loop. However, it still takes some 40-50 ms to save the data to the database table, so unfortunately no progress so far. I currently just want to save the data.

Any more advice will be gratefully accepted.

Regards,

Mattias

Link to comment

I hate to ask the obvious, but what are your PC specs? You may just be maxing out your processor and/or the bandwidth to your hard drive.

What kind of CPU usage do you see in Task Manager when running this application? What about physical memory?

A screenshot of this data during a VI run would probably give us enough information to eliminate this as a possible snag.

EDIT.

I just re-read your post... 15%... I'd still like to know what your memory usage is like.

Link to comment

QUOTE(tmot @ Oct 23 2007, 08:30 AM)

Any more advice will be gratefully accepted.

Regards,

Mattias

Can you please post the connection string for the "mattias" database? Have you tried a stored procedure yet?

If you have anti-virus software running, you may want to turn that off and see if things speed up any.

Brian

Link to comment

Queueing will unfortunately not solve my problem, since the queue is growing faster than the write-to-DB VI can consume it, and I'll be writing data for hours in my case. I've now tested a lot of things and finally made a new VI consisting of only...

- Open DB

- Write one int and one string of seven letters

- Close DB

...the time it takes to write the data to the table is still 40-60 ms. So I now wonder: is this normal for you too? I mean, can one say that saving data to a MySQL table over ODBC takes about 50 ms in general?

Thanks for your time.

Regards,

Mattias

Link to comment

QUOTE(tmot @ Oct 25 2007, 04:56 PM)

...the time it takes to write the data to the table is still 40-60 ms. So I now wonder: is this normal for you too? I mean, can one say that saving data to a MySQL table over ODBC takes about 50 ms in general?

Maybe. Without knowing your computer's specs, none of us can say what the throughput might be. I suggest you reread the suggestions in more detail. Writing once to the database might take 50 ms; however, a single call to a stored procedure that stores 1000 records might also take 50 ms. I really think you should look at optimizing this from the database side. Write a stored procedure that writes 1000 records and call it instead of DB Tools Insert Data. Does the queue still grow unbounded? How long does the stored procedure take to execute?

Link to comment

QUOTE(bbean @ Oct 26 2007, 09:05 AM)

You are right. I was just trying to show him a typical insert with SQL Server that I could throw together quickly. I downloaded MySQL and the ODBC driver and ran my program again, and it is definitely slower with MySQL. My hunch is that the ODBC driver actually uses HTTP instead of shared memory, but that is just a guess. Maybe tweaking the connection string, the MySQL setup, or stored procedures would help. The weird thing is that my CPU usage hovered around 2%, whereas with SQL Server it hovered around 40%. Here are the results:

Something we do where I work is a similar but different alternative to what has been mentioned.

Create a separate file (or files) that gets digested by another program (read: a LV exe) as the data points are taken.

That lets you prioritize the DAQ or the logging, and it also provides robustness if the DAQ program crashes before all the data is logged.

My $0.02.

Link to comment

I have now also added a stored procedure called AddData to my database. I have managed to decrease my write time to about 33 ms by using this procedure instead of an "INSERT INTO..." statement.

I have also added the possibility to save data to a simple text file. I have no problems at all with that method; the write time is as low as 1 ms or less. But when saving to the database, disk activity increases very much compared to writing only to the text file. Do you know why disk activity increases when using the database instead of a simple text file? I have measured my drive with HD Tune and see no problems with the disk performance.
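One plausible explanation for this (an assumption, not verified against this setup): a database typically flushes each committed transaction to disk for durability, while a plain text-file write is buffered by the OS, so commit-per-row means one forced flush per sample. Batching many inserts into one transaction cuts the number of flushes. A minimal sketch, using Python's built-in sqlite3 as a stand-in for MySQL and the `test(aa, bb)` table from earlier in the thread:

```python
# Batch many INSERTs into one transaction so the database flushes to disk
# once per batch rather than once per row. sqlite3 stands in for MySQL.
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "test.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE test (aa INTEGER, bb TEXT)")

# The 'with' block wraps all 500 inserts in a single transaction,
# committed once on exit, instead of autocommitting each statement.
with conn:
    for i in range(500):
        conn.execute("INSERT INTO test VALUES (?, ?)", (i, "sample"))

count = conn.execute("SELECT COUNT(*) FROM test").fetchone()[0]
print(count)  # 500
```

The trade-off is durability granularity: a crash mid-batch loses the uncommitted rows, which is usually acceptable for streamed measurement data.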

Thanks in advance for any advice!

Regards,

Mattias

Link to comment
