
Database Connectivity Toolkit Multi Row Insert


Recommended Posts

Hi all. I recently got the Database Connectivity Toolkit (DBCT) on LabVIEW 2016 and I'm trying to insert multiple rows into a database.

Using a sine signal input, I attempted to insert 50 rows in one iteration, but every method I tried returned 50 values in one row. Using the end of line constant and Array To Spreadsheet String returned what looked like 50 rows, but when I queried "SELECT COUNT(*) FROM.." the number of rows was equal to the number of iterations.

In another forum post, someone suggested that it is not possible to insert multiple rows using the DBCT. Can anyone confirm this?

I have written some code which allows me to insert 3 rows in one iteration, under a single connection. However, this method is very tedious and wouldn't work for large amounts of data.
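One thing I have been wondering is whether I could simply build the whole statement as a string and run it with DB Tools Execute Query.vi instead of DB Tools Insert Data.vi, something like the sketch below (the table and column names are just placeholders, and as far as I know the multi-row VALUES syntax is supported by servers such as MySQL, PostgreSQL and SQL Server, but not by Access/Jet):

INSERT INTO sine_log (sample_index, sample_value)
VALUES
    (0, 0.000),
    (1, 0.125),
    (2, 0.249);
    -- ...one tuple per sample, so 50 tuples per loop iteration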

 

DB Connectivity (Multi Row Insert).png

 

Can anyone think of a better way to do this?

 

Thanks in advance,

 

Lisa


Hi Mikael, thank you for your suggestion and help!

I managed to connect to my db using open and close connection invoke nodes and execute a query using ADO execute. Thanks very much indeed. My only question: do you know if ADO is compatible with Linux Real-Time cRIO?

Thanks for your time, it is much appreciated.

Lisa

 

 


Damn, that seemed like a great method. Do you know of any method that will work on Linux?

I've been following a tutorial for LabVIEW to PostgreSQL on Linux RT using the System Exec VI (https://decibel.ni.com/content/docs/DOC-30308), which works, but it opens and closes the connection every time data is inserted (unless I string all the values together first, which isn't very efficient). Another suggestion is using the Call Library Function Node, but I can't find any examples with a PostgreSQL API.
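From the libpq documentation, the C calls I would presumably have to wrap with the Call Library Function Node look something like the sketch below (this is just my understanding; the host, table and credentials are made up, and I haven't confirmed that libpq builds for the cRIO's ARM target):

#include <stdio.h>
#include <libpq-fe.h>   /* PostgreSQL C client library */

int main(void)
{
    /* Open one connection and keep it alive for many inserts */
    PGconn *conn = PQconnectdb("host=192.168.1.10 dbname=test user=lv password=secret");
    if (PQstatus(conn) != CONNECTION_OK) {
        fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
        PQfinish(conn);
        return 1;
    }

    /* One round trip inserts several rows at once */
    PGresult *res = PQexec(conn,
        "INSERT INTO sine_log (sample_index, sample_value) "
        "VALUES (0, 0.0), (1, 0.125), (2, 0.249);");
    if (PQresultStatus(res) != PGRES_COMMAND_OK)
        fprintf(stderr, "insert failed: %s", PQerrorMessage(conn));
    PQclear(res);

    PQfinish(conn);   /* close only when the acquisition is finished */
    return 0;
}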

 

  • 2 weeks later...
On 1-11-2016 at 2:01 PM, lisam said:

Hi Mikael, thank you for your suggestion and help!

I managed to connect to my db using open and close connection invoke nodes and execute a query using ADO execute. Thanks very much indeed. My only question: do you know if ADO is compatible with Linux Real-Time cRIO?

Thanks for your time, it is much appreciated.

Lisa

Of course not! ADO stands for ActiveX Data Objects, and ActiveX is a Windows-only technology.

Depending on the actual database server you want to access, there are several possibilities, but not all of them are readily doable in LabVIEW for Linux.

If your database driver implements the whole communication directly at the LabVIEW VI level, such as the MySQL driver here, which accesses the MySQL server directly through TCP/IP communication, then you are fine.

Accessing the unixODBC driver is another possibility, which keeps the LabVIEW part independent of the actual database driver implementation. This project does provide such a LabVIEW library; however, it is not always easy to get a working ODBC driver for a specific database server. Microsoft officially supports Linux clients with their latest SQL Server, but I have not tried that at all. And if you talk about the NI Linux Real-Time targets, an additional problem is the architecture (ARM based for the low-cost targets and x64 based for the high-end targets), plus the fact that NI Linux RT isn't a normal standard Linux system but in several aspects a slimmed-down Linux kernel, so some precompiled binaries may not work on it. To expect Microsoft to give you the source code of their SQL Server client libraries so you can compile your own binaries for a specific target is of course pretty hopeless.
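To give an idea of what such a wrapper ends up doing, the bare unixODBC call sequence that a LabVIEW library would have to map through Call Library Function nodes looks roughly like this (a sketch only; error checking is omitted and the DSN, credentials and statement are made up):

#include <sql.h>
#include <sqlext.h>

int main(void)
{
    SQLHENV env;
    SQLHDBC dbc;
    SQLHSTMT stmt;

    /* Environment and connection handles */
    SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &env);
    SQLSetEnvAttr(env, SQL_ATTR_ODBC_VERSION, (SQLPOINTER)SQL_OV_ODBC3, 0);
    SQLAllocHandle(SQL_HANDLE_DBC, env, &dbc);

    /* "mydsn" has to be defined in odbc.ini for whatever driver the target actually has */
    SQLDriverConnect(dbc, NULL, (SQLCHAR *)"DSN=mydsn;UID=lv;PWD=secret;",
                     SQL_NTS, NULL, 0, NULL, SQL_DRIVER_NOPROMPT);

    /* Statement handle, reused for every query */
    SQLAllocHandle(SQL_HANDLE_STMT, dbc, &stmt);
    SQLExecDirect(stmt, (SQLCHAR *)"INSERT INTO sine_log VALUES (0, 0.0)", SQL_NTS);

    /* Tear down */
    SQLFreeHandle(SQL_HANDLE_STMT, stmt);
    SQLDisconnect(dbc);
    SQLFreeHandle(SQL_HANDLE_DBC, dbc);
    SQLFreeHandle(SQL_HANDLE_ENV, env);
    return 0;
}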



  • Similar Content

    • By kosist90
      Dear Community,
      let me present our new ANV Database Toolkit, which has been recently released at vipm.io.
      A short introduction to the toolkit is posted at this link, and it also describes the steps needed to start using the toolkit.
      ANV Database Toolkit helps developers design a LabVIEW API for querying various databases (MS SQL, MySQL, SQLite, Access). It allows you to create VIs which can be used as an API, with the help of a graphical user interface. When using these VIs, the toolkit handles the connection with the database, relieving developers of this burden in their applications.
      It has the following features:
      • Simplifies handling of databases in LabVIEW projects
      • Allows you to graphically create API VIs for databases
      • Supports Read, Write, Update and Delete queries
      • Supports various database types (MS SQL, MySQL, SQLite, Access)
      The overall idea is that a developer can create a set of high-level API VIs for queries using a graphical user interface, without actually writing SQL queries. Those API VIs are then used in the application and handle the database communication in the background. Moreover, a SQL query can be applied to any of the supported database types; it is simply a matter of selecting the database type. Changing the target database does not require changes in the API VI which executes the query.
      After installing the toolkit, a sample project is available which demonstrates the toolkit's capabilities by executing different types of queries.
      Note that in order to install the toolkit, VI Package Manager must be launched with Administrator privileges.
      The toolkit is paid, and the price is disclosed via a price quotation. There is, however, a 30-day trial period during which you can try out the toolkit and decide whether it is helpful for your needs (and we hope that it will be).
      In case of any feedback, ideas or issues please do not hesitate to contact me directly here, or at vipm.io, or at e-mail info@anv-tech.com.
       
        
    • By Bryan
      TestStand Version(s) Used: 2010 thru 2016
      Windows (7 & 10)
      Database: MS SQL Server (v?)
      Note: The database connection I'm referring to is what's configured in "Configure > Result Processing", (or equivalent location in older versions).
      Based on some issues we've been having with nearly all of our TestStand-based production testers, I'm assuming that TestStand opens the configured database connection when the sequence is run, and maintains that same connection for all subsequent UUTs tested until the sequence is stopped/terminated/aborted.  However, I'm not sure of this and would like someone to either confirm or correct this assumption. 
      The problem we're having is that nearly all of our TestStand-based production testers have intermittently been losing their database connections, returning an error (usually after the PASS/FAIL banner). I'm not sure if it's a TestStand issue or an issue with the database itself. The operator, seeing and only caring about whether the UUT passed or failed, often ignores the error message and soldiers on, mostly ignoring every error message that pops up. Testers at the next higher assembly that look for a passed record of the sub-assembly's serial number in the database will then fail their test because they can't find a passed record of that serial number.
      We've tried asking the operators to either let us know when the error occurs, re-test the UUT, or restart TestStand (usually the option that works), but this is often forgotten or ignored.
      The operators do not stop the test sequence when they go home for the evening/weekend/etc., so TestStand is normally left waiting for them to enter the next serial number of the device to test. I'm assuming that their connection to the database is still open during this time. If so, it's almost as though MS SQL has been configured to terminate idle connections, or something happens with the SQL Server and the connection isn't properly closed or re-established.
      Our LabVIEW based testers don't appear to have this problem unless there really is an issue with the database server.  The majority of these testers I believe open > write > close their database connections at the completion of a unit test. 
      I'm currently looking into writing my own routine to be called in the "Log to Database" callback which will open > write > close the database connection.  But, I wanted to check if anyone more knowledgeable had any insight before I spend time doing something that may have an easy fix.
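      In the meantime, one thing I plan to try (assuming I can get read access to the server's DMVs, which I haven't confirmed yet) is to watch from the SQL Server side whether the testers' sessions are still alive while the stations sit idle overnight:

      -- Lists user sessions and when each last did anything;
      -- if a tester's session disappears overnight, something is dropping it.
      SELECT session_id, host_name, program_name, status, last_request_end_time
      FROM sys.dm_exec_sessions
      WHERE is_user_process = 1;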
      Thanks all!
    • By Gepponline
      Hi,
       I'm trying to insert some NULL values in a datetime field.
      In the example, the DATA_INSERIMENTO field has a non-empty value and it works correctly, but DATA_INTERVENTO doesn't accept NULL.
      If I use an empty string instead of NULL, the VI runs without any errors, but it fills the database field with 1900-01-01, which is not what I want.


       
      If I use the DB Tools NULL VI, it gives me another type of error, maybe because I'm connecting a variant to a cluster of strings.
      If I use a Variant to Data VI for the NULL value, it returns an empty string, so again not the result I need.
       

      If I use the string you see in the label at the bottom of my diagram directly in SQL Server manager, it works correctly.
      How can I obtain the same result with LabVIEW?
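      For reference, the plain SQL I'm ultimately trying to make LabVIEW produce is something like this (the table name here is made up, the column names are the real ones); note that NULL has to appear unquoted, because an empty string gets converted by SQL Server to 1900-01-01:

      INSERT INTO interventi (DATA_INSERIMENTO, DATA_INTERVENTO)
      VALUES ('2016-11-01 10:30:00', NULL);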
    • By ATE-ENGE
      Background:
      I've been using LabVIEW for a few years for automation testing tasks and until recently have been saving my data to "[DescriptorA]\[DescriptorB]\[test_info].csv" files. A few months ago, a friend turned me on to the concept of relational databases. I've been really impressed by their response times, and I am reworking my code, following the examples with the Database Connectivity Toolkit (DCT), to use "[test_info].mdb" with my provider being a Microsoft Jet OLE DB database.
      However, I'm beginning to see the limitations of the DCT, namely:
      • No support for auto-incrementing primary keys
      • No support for foreign keys
      • Difficult to program stored procedures
      ...and I'm sure a few more that I don't know yet.
      Now I've switched over to architecting my database in MySQL Workbench. Suffice to say I'm a bit out of my depth and have a few questions that I haven't seen covered in tutorials
       Questions (General):
      Using Microsoft Jet OLE DB, I made a connection string "Data Source= C:\[Database]\[databasename.mdb]" in a .UDL file. However, the examples I've seen for connecting to MySQL databases use IP addresses and ports.
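      For example, the sort of string the MySQL examples show (the driver version, server name and credentials below are just placeholders) looks quite different from the file path I'm used to with Jet:

      Jet/Access:    Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Database\mydata.mdb;
      MySQL (ODBC):  Driver={MySQL ODBC 5.3 ANSI Driver};Server=servername;Port=3306;Database=mydb;Uid=user;Pwd=pass;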
      Is a MySQL database still a file? If not, how do I put it on my networked server \\[servername\Database\[file]? If so, what file extensions exist for databases, and what is the implication of each extension? I know of .mdb, but are there others I could/should be using (such as .csv's vs .txt's)?
      My peers, who have more work experience than me but no experience with databases, espouse a 2 GB limit on all files (I believe from the era of FAT16 disks). My current OLE DB database is about 200 MB in size, so 2 GB will likely never happen, but I'm curious: do file size limits still apply to database files? If so, how does one have the giant databases that support major websites?
      Questions (LabVIEW Specific):
      I can install my [MainTestingVi.exe], which accesses the Jet OLE DB database, on a Windows 10 computer that is fresh out of the box. When I switch over to having a MySQL database, are there any additional tools that I'll need to install as well?
    • By GregFreeman
      I think I have found a fundamental issue with the DB Toolkit Open connection. It seems to not correctly use connection pooling. The reason I believe it's an issue with LabVIEW and ADODB ActiveX specifically is because the problem does not manifest itself using the ADODB driver in C#. This is better shown with examples. All I am doing in these examples is opening and closing connections and benchmarking the connection open time.
      ADODB and Oracle driver in LabVIEW.

       
      ADODB in C#
       
      using System;
      using System.Diagnostics;

      namespace TestAdodbOpenTime
      {
          class Program
          {
              static void Main(string[] args)
              {
                  // Open and close a fresh ADODB connection 30 times, logging each open/close cycle time
                  for (int i = 0; i < 30; i++)
                  {
                      ADODB.Connection cn = new ADODB.Connection();
                      int count = Environment.TickCount;
                      cn.Open("Provider=OraOLEDB.Oracle;Data Source=FASTBAW;Extended Properties=PLSQLRSet=1;Pooling=true;",
                              "USERID", "PASSWORD", -1);
                      cn.Close();
                      int elapsedTime = Environment.TickCount - count;
                      Debug.WriteLine("RunTime " + elapsedTime);
                  }
              }
          }
      }
      Output:
      RunTime 203
      RunTime 0
      RunTime 0
      RunTime 0
      RunTime 0
      RunTime 0
      RunTime 0
      RunTime 0
      RunTime 0
       
      Notice the time nicely aligns between the LabVIEW code leveraging the .NET driver and the C# code using ADODB. The first connection takes a bit to open; then, for the rest, connection pooling takes over nicely and the connect time is 0.
       
      Now cue the LabVIEW ActiveX implementation and every open connection time is pretty crummy and very sporadic. 
       
      One thing I happened to find out by accident when troubleshooting was that if I add a property node on the block diagram where I open a connection, and I don't close the reference, my subsequent connect times are WAY faster (between 1 and 3 ms). That is what leads me to believe this may be a bug in whatever LabVIEW does to interface with ActiveX.
       
      Has anyone seen issues like this before or have any idea of where I can look to help me avoid wrapping up the driver myself?
       