
SQLite Library 1.6 beta: Attributes lookup table


Recommended Posts

Attached is a beta version of the latest 1.6 version of SQLite Library, for anyone who would like to give feedback.  A major addition (not yet well tested) is “Attributes”, modeled on Variant Attributes or Waveform Attributes, but stored in any SQLite db file.  The idea is to make it easy to store simple named parameters without much effort.  See the example “SQLite Attributes.vi”.
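(For anyone who hasn't used attribute-style storage before, here is a rough sketch of the kind of name-value lookup table being described, written in Python's sqlite3 module rather than LabVIEW; the table name and helper functions are invented for illustration and are not the library's actual API.)

```python
import sqlite3

conn = sqlite3.connect("run_data.db")
# Hypothetical attribute table: one row per named parameter.
conn.execute("CREATE TABLE IF NOT EXISTS attributes (name TEXT PRIMARY KEY, value)")

def set_attribute(name, value):
    # INSERT OR REPLACE keeps exactly one row per attribute name.
    conn.execute("INSERT OR REPLACE INTO attributes (name, value) VALUES (?, ?)",
                 (name, value))
    conn.commit()

def get_attribute(name):
    row = conn.execute("SELECT value FROM attributes WHERE name = ?",
                       (name,)).fetchone()
    return None if row is None else row[0]

set_attribute("Operator", "J. Smith")
print(get_attribute("Operator"))  # -> J. Smith
```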

 

A more minor upgrade is making “Execute SQL” polymorphic, so it can return data in a number of forms in addition to a 2D array of strings.  See the upgraded example “SQLite Example 1 — Create Table.vi”, which uses the new polymorphic VI and shows how to return results as a Cluster.
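(As a loose text-based analogy only, not the library's LabVIEW API: in Python's sqlite3 the "same query, different return form" idea looks like this, with plain tuples standing in for the 2D string array and sqlite3.Row standing in for a Cluster per row.)

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE measurements (id INTEGER PRIMARY KEY, name TEXT, value REAL);
    INSERT INTO measurements (name, value) VALUES ('temperature', 21.5), ('pressure', 101.3);
""")

# Default form: each row comes back as a plain tuple (roughly a row of a 2D array).
as_tuples = conn.execute("SELECT name, value FROM measurements").fetchall()

# Record form: the same query, but each row now has named fields,
# loosely analogous to returning results as a cluster per row.
conn.row_factory = sqlite3.Row
as_records = conn.execute("SELECT name, value FROM measurements").fetchall()
print(as_records[0]["name"], as_records[0]["value"])  # temperature 21.5
```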

 

For Attributes, I had to make some choices in how to store the various LabVIEW types in SQLite’s limited number of types.  The format I decided on is (a rough illustration follows the list):

1) All simple types that already have a defined mapping (i.e. a “Bind” property node) are stored as defined (so strings and paths are Text, DBLs and Singles are Float, and integers (except U64) are Integers).

2) Timestamps are stored as ISO-8601 Text (the most standardized format of the four possibilities).

3) Enums are stored as their item text, as Text, rather than as the integer value.  This seems the most robust against changes in the enum definition.

4) LVOOP objects are stored flattened in a Blob.

5) Any other LV type is contained in a Variant, then flattened and stored in a Blob.  Using a flattened Variant means we store the type information and LabVIEW version.
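(A rough illustration of those mappings, in Python rather than LabVIEW; the exact flattened formats are the library's own, so treat the blob value below as a stand-in.)

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE attributes (name TEXT PRIMARY KEY, value)")

def store(name, value):
    conn.execute("INSERT OR REPLACE INTO attributes VALUES (?, ?)", (name, value))

store("Comment", "first light test")                      # string    -> Text
store("Gain", 1.25)                                       # DBL/SGL   -> Float
store("Averages", 16)                                     # integer   -> Integer
store("Started", datetime.now(timezone.utc).isoformat())  # timestamp -> ISO-8601 Text
store("Trigger Mode", "Continuous")                       # enum      -> item text as Text
store("Setup", b"...flattened LabVIEW data...")           # object / variant -> Blob
conn.commit()
```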

 

post-18176-0-04139400-1448742257.png

 

drjdpowell_lib_sqlite_labview-1.6.0.51.vip

LabVIEW 2011-2015

 


The Attribute stuff grew out of a project where SQLite files held the data, one for each “Run”, and the Runs had lots of small bits of information that needed to be stored in addition to the bulk of the data.   When and where the measurement was taken, what the equipment setup was, who the Operator was, etc.  I purpose-made a name-value look-up table for this, but realized that such a table could be made into reusable “attributes”.

  • Like 2

Out of curiosity, does storing a path as a string present platform problems? That is, if I store "foo\bar.txt" in Windows, are the underlying primitives smart enough to change it to "foo/bar.txt" on a Mac?

  • Like 1

Out of curiosity, does storing a path as a string present platform problems? That is, if I store "foo\bar.txt" in Windows, are the underlying primitives smart enough to change it to "foo/bar.txt" on a Mac?

 

No, but I’ll add the same NI off-palette VIs used elsewhere in the library to convert to/from Mac paths: “Path to Command Line String” and “Command Line String to Path”.  Thanks.
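(Not the library's implementation, just a sketch of the general idea those conversion VIs serve: store the path as Text in one canonical separator style, and rebuild a native path on whichever platform reads it back.)

```python
from pathlib import PureWindowsPath, PurePosixPath

def path_to_db_text(path_text):
    # Normalize to forward slashes before storing the path as Text.
    return PureWindowsPath(path_text).as_posix() if "\\" in path_text else path_text

def db_text_to_native(db_text, on_windows):
    # Rebuild a platform-native path string on the reading side.
    return str(PureWindowsPath(db_text)) if on_windows else str(PurePosixPath(db_text))

stored = path_to_db_text(r"foo\bar.txt")            # -> "foo/bar.txt"
print(db_text_to_native(stored, on_windows=False))  # -> "foo/bar.txt" on Mac/Linux
print(db_text_to_native(stored, on_windows=True))   # -> "foo\bar.txt" on Windows
```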

 

Do you use LabVIEW on the Mac?  I haven’t in a long while and I could do with someone testing it.

 

— James


Glad I asked then, it was a genuine curiosity... I'm platform-locked to Windows, so I've never been sure how that is handled (other than that, under the hood, paths aren't a simple string).

Sorry I can't help with the Mac testing.

  • 5 weeks later...

Problem found using LabVIEW for Mac OS X:

 

Comparing the behavior of an "empty path" between:

 

LabVIEW 2012 and LabVIEW 2013 - 32-bit

vs

LabVIEW 2014 - 64-bit   

 

In LabVIEW 2014 64-bit, an empty path is not really empty... it has the value "/".

 

As a consequence, connection.lvclass:Open (string).vi will never execute the empty case that falls back to the default OS X installation of the SQLite dylib.
 
My suggestion is to change the case condition to: "", "/"
 
Easy to work around, but it would be nice to have this fix as part of the package. Also, I can help with some future validation on the OS X platform, as I spend most of my time working with LabVIEW for Mac OS X.
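(The real fix is to a LabVIEW case structure's selector values, but in text form the suggested check amounts to something like this.)

```python
def path_means_use_default_library(path_text):
    # LabVIEW 2014 64-bit on OS X reports an "empty" path as "/",
    # so treating only "" as empty misses it.
    return path_text in ("", "/")

print(path_means_use_default_library(""))   # True (LabVIEW 2012/2013 32-bit)
print(path_means_use_default_library("/"))  # True (LabVIEW 2014 64-bit)
```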
 
-Sergio
  • Like 1

Hi,

 

Any thoughts on incorporating the JSON1 functionality included with the latest release of SQLite (http://sqlite.org/json1.html)? This new functionality is quite intriguing to me, as it seems to allow SQLite to be a possible "best of both worlds" solution for relational and document-oriented data management situations, along the lines of what the latest version of PostgreSQL has.

 

Thanks,

-Dino

 

btw: great job on the existing code - works like a charm in our development sandbox at the moment!


Any thoughts on incorporating the JSON1 functionality included with the latest release of SQLite (http://sqlite.org/json1.html)? This new functionality is quite intriguing to me, as it seems to allow SQLite to be a possible "best of both worlds" solution for relational and document-oriented data management situations, along the lines of what the latest version of PostgreSQL has.

The JSON SQL functions should be working in this beta (I think it includes the 3.9.0 dll with those functions).   I’ve only played around with them a little, but they do work.   
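(For anyone who wants to try them, this is the sort of thing the JSON1 SQL functions make possible, shown here through Python's sqlite3; it only works when the underlying SQLite build, 3.9.0 or later, includes the JSON1 extension. The table and fields are invented for the example.)

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE runs (id INTEGER PRIMARY KEY, meta TEXT)")  # JSON kept as Text

# Build a JSON document inside SQL with json_object().
conn.execute("INSERT INTO runs (meta) "
             "VALUES (json_object('operator', 'J. Smith', 'averages', 16))")

# Query inside the document with json_extract().
operator = conn.execute("SELECT json_extract(meta, '$.operator') FROM runs").fetchone()[0]
print(operator)  # -> J. Smith

# Modify one field in place with json_set().
conn.execute("UPDATE runs SET meta = json_set(meta, '$.averages', 32)")
```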



  • Similar Content

    • By kosist90
      Dear Community,
      Let me present our new ANV Database Toolkit, which was recently released at vipm.io.
      A short introduction to the toolkit is posted at this link, and it also describes the steps needed to start using the toolkit.
      ANV Database Toolkit helps developers design a LabVIEW API for querying various databases (MS SQL, MySQL, SQLite, Access). It allows you to create API VIs with the help of a graphical user interface. When these VIs are used, the toolkit handles the connection to the database, relieving developers of this burden in their applications.
      It has the following features:
      • Simplifies handling of databases in LabVIEW projects
      • Allows you to graphically create API VIs for databases
      • Supports Read, Write, Update and Delete queries
      • Supports various database types (MS SQL, MySQL, SQLite, Access)
      The overall idea is that a developer can create a set of high-level API VIs for queries using a graphical user interface, without actually writing SQL queries. Those API VIs are used in the application and handle database communication in the background. Moreover, an SQL query can be applied to any of the supported database types; it is just a matter of selecting the database type. Changing the target database does not require changes in the API VI which executes the query.
      After the toolkit is installed, a sample project is available which shows the toolkit's possibilities in terms of executing different types of queries.
      Note that in order to install the toolkit, VI Package Manager must be launched with Administrator privileges.
      This toolkit is paid, and the price is disclosed on quotation. There is, however, a 30-day trial period during which you can try out the toolkit and decide whether it is helpful for your needs (we hope it will be).
      In case of any feedback, ideas or issues, please do not hesitate to contact me directly here, at vipm.io, or by e-mail at info@anv-tech.com.
       
        
    • By drjdpowell
      View File SQLite Library
      Introductory video now available on YouTube: Intro to SQLite in LabVIEW
       
      SQLite3 is a very light-weight, server-less, database-in-a-file library. See www.SQLite.org. This package is a wrapper of the SQLite3 C library and follows it closely.

      There are basically two use modes: (1) calling "Execute SQL" on a Connection to run SQL scripts (and optionally return 2D arrays of strings from an SQL statement that returns results); and (2) "Preparing" a single SQL statement and executing it step-by-step explicitly. The advantage of the latter is the ability to "Bind" parameters to the statement, and get the column data back in the desired datatype. The "Bind" and "Get Column" VIs are set as properties of the "SQL Statement" object, for convenience in working with large numbers of them. (A rough sketch of these two modes follows this entry.)

      See the original conversation on this here.
      Hosted on the NI LabVIEW Tools Network.
      JDP Science Tools group on NI.com.
      ***Requires VIPM 2017 or later for install.***
      Submitter: drjdpowell | Submitted: 06/19/2012 | Category: Database & File IO | LabVIEW Version: 2013 | License Type: BSD (Most common)
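(A rough Python parallel of the two use modes described in the entry above; the library itself is LabVIEW, so this is only an analogy, with an invented table.)

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Mode 1: hand a whole SQL script to a single "execute script" call.
conn.executescript("""
    CREATE TABLE samples (t REAL, value REAL);
    INSERT INTO samples VALUES (0.0, 1.0);
""")

# Mode 2: prepare a single parameterized statement and execute it
# step by step, binding new values each time.
stmt = "INSERT INTO samples (t, value) VALUES (?, ?)"
for t, v in [(0.1, 1.1), (0.2, 0.9)]:
    conn.execute(stmt, (t, v))
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM samples").fetchone()[0])  # -> 3
```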
    • By Thang Nguyen
      Hi,
      I would like to know whether I can write stream data into two different tables with SQLite at the same time. Currently my system has one camera. I store the captured images and particle measurements in one table. I have just added one more camera to the system and would like to do the same thing and write to a second table in a parallel process. I wonder if this is possible or not.
      I use the SQLite Library.
      Thanks in advance.
       
    • By McQuillan
      Hi Everyone,
      I (re)watched James Powell's talk at GDevCon#2 about Application Design Around SQLite. I really like this idea as I have an application with lots of data (from serial devices and software configuration) that's all needed in several areas of the application (and external applications) and his talk was a 'light-bulb' moment where I thought I could have a centralized SQLite database that all the modules could access to select / update data.
      He said the database could be the 'model' in the model-view-controller design pattern because the database is very fast. So you can collect data in one actor and publish it directly to the DB, and have another actor read the data directly from the DB, with a benefit of having another application being able to view the data.
      Link to James' talk: https://www.youtube.com/watch?v=i4_l-UuWtPY&t=1241s
       
      I created a basic proof of concept which launches N processes to generate data (publish to the database) and others to act as a UI (read data from the database and update configuration settings in the DB, like a set-point). However, after launching a couple of processes I ran into 'Database is locked (error 5)', and I realized two things: SQLite databases aren't magically able to have n concurrent readers/writers, and I'm not using them right... (I hope).
      I've created a schematic (attached) to show what I did in the PoC (that was getting 'Database is locked (error 5)' errors).
      I'm a solo-developer (and SQLite first-timer*) and would really appreciate it if someone could look over the schematic and give me guidance on how it should be done. There's a lot more to the actual application, but I think once I understand the limitations of the DB I'll be able to work with it.
      *I've done SQL training courses.
      In the actual application, the UI and business logic are on two completely separate branches (I only connected them to a single actor for the PoC) 
      Some general questions / thoughts I had:
      • Is the SQLite-based application design something worth pursuing / is it a sensible design choice?
      • Instead of creating lots of tables (when I launch the actors), should I instead make separate databases, to reduce the number of requests per DB? (I shouldn't think so... but worth asking.)
      • When generating data, I'm using UPDATE to change a single row in a table (current value), and I'm then reading that single row in other areas of code. (Then, if logging is needed, I create a trigger to copy the data to a separate table.) Would it be better if I INSERT data and have the other modules read the max RowId for the current value, and periodically delete rows? (A sketch of this current-value pattern follows at the end of this post.)
      • The more clones I had, the slower the UI seemed to update (it should have been 10 times/second, but it dropped to updating every 3 seconds). I was under the impression that you can do thousands of transactions per second, so I think I'm querying the DB inefficiently.
      The two main reasons why I like the database approach are:
      • External applications will need to 'tap into' the data; if they could get to it via an SQL query, that would be ideal.
      • Data logging is a big part of the application.
      Any advice you can give would be much appreciated.
      Cheers,
      Tom
      (I'm using quite a few reuse libraries so I can't easily share the code; however, if it would be beneficial, I could re-work the PoC to just use 'Core-LabVIEW' and James Powell's SQLite API.)
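(Not an official answer, just a minimal sketch of the "current value in one row" pattern described above, using WAL journal mode and a busy timeout, which are my own assumptions about how to reduce 'Database is locked' errors with one writer and several readers.)

```python
import sqlite3

def open_db(path):
    # timeout makes a blocked writer wait instead of failing straight away
    # with "database is locked"; WAL lets readers run while a writer commits.
    conn = sqlite3.connect(path, timeout=5.0)
    conn.execute("PRAGMA journal_mode=WAL")
    return conn

# Data-generating process: keep one row per channel holding the current value.
writer = open_db("model.db")
writer.execute("CREATE TABLE IF NOT EXISTS current_values (channel TEXT PRIMARY KEY, value REAL)")
writer.execute("INSERT OR REPLACE INTO current_values VALUES ('setpoint', 21.5)")
writer.commit()

# UI process: periodically poll the current value.
reader = open_db("model.db")
row = reader.execute("SELECT value FROM current_values WHERE channel = 'setpoint'").fetchone()
print(row[0])  # -> 21.5
```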
