John Lokanis Posted December 23, 2008

I currently use .NET calls to read and write data to a SQL Server database. This works well most of the time, with rare errors from .NET that cannot be explained. What I am looking for, however, is a faster way to accomplish the same tasks. LabVIEW calls to .NET have a lot of overhead, and even employing all the tricks from Brian Tyler's old blog still doesn't fix all the speed issues. What I am looking for is a different transport to get the data between my LabVIEW app and the server.

So, has anyone else used a Microsoft or third-party DLL or other interface to connect and execute SQL calls against a SQL Server database? Can I bypass .NET somehow and access a lower-level set of functions that are faster? I know ADO can be used (via ActiveX), but is there any reason to think it would be faster?

Thanks for your thoughts and experience on this issue.

-John
Mark Smith Posted December 23, 2008

John,

It's been a while since I've used ADO.NET (and then I used it from a .NET application), but it performed seamlessly when connecting to a SQL Server database. Indeed, ADO.NET and SQL Server are both MS's latest technology and are behind a whole bunch of data-driven applications that seem to perform well. I don't know what the current development cycle is on ADO (the ActiveX one), but I doubt it gets a whole lot of attention from MS these days - for example, this page http://support.microsoft.com/kb/183606 is almost two years old now and looks like one of the latest MS support pages for ADO.

But you seem to have narrowed your problem down to LabVIEW/.NET overhead, which I find reasonable given my experience. I don't think the .NET interface gets the attention from NI that it used to now that Brian T's not there (I would be happy to be wrong about this!). I have successfully solved similar problems by defining exactly what I expect .NET to do for me and using the most loosely coupled interface possible. This almost always means writing a .NET DLL to reduce the calls to and from LabVIEW. In this case, I wouldn't access the ADO.NET namespace from LabVIEW at all; instead, I would pass the data to and from a method defined by my own DLL, which then uses all of the .NET magic to read from and write to the database, reducing the LabVIEW/.NET interaction to an absolute minimum.

I also used (years ago!) Jeffrey Travis' LabSQL and it worked as advertised - it's a set of wrappers around the ADO ActiveX objects, so it would save you a lot of time since he's already exposed LabVIEW interfaces. You could create some quick and dirty tests and decide whether it would increase the performance of your app.

Lastly, there's NI's Database Connectivity Toolkit. I don't have any experience with it or any idea what's under the hood.

Mark
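A minimal sketch of the kind of wrapper DLL I'm describing might look like this in C#. The class name, method name, and the decision to hand everything back as a 2D string array are just placeholders for illustration, not code from a real project:

```csharp
// Sketch: do all the ADO.NET work inside one .NET method so LabVIEW only
// crosses the .NET boundary once per query.
using System.Data;
using System.Data.SqlClient;

public class SqlHelper
{
    // Run a query and return the entire result set in a single call.
    public static string[,] ExecuteQuery(string connectionString, string sql)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            conn.Open();

            // Pull the whole rowset into a DataTable on the .NET side.
            DataTable table = new DataTable();
            using (SqlDataAdapter adapter = new SqlDataAdapter(cmd))
            {
                adapter.Fill(table);
            }

            // Flatten to a plain 2D string array, which LabVIEW handles natively.
            string[,] result = new string[table.Rows.Count, table.Columns.Count];
            for (int r = 0; r < table.Rows.Count; r++)
                for (int c = 0; c < table.Columns.Count; c++)
                    result[r, c] = table.Rows[r][c].ToString();

            return result;
        }
    }
}
```

From LabVIEW you would then make exactly one static method call per query instead of one .NET call per row and column.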
jdunham Posted December 23, 2008

Which ADO functions are you using? Last time we used the NI toolkit it was extremely slow, but we have our own wrappers around ADO and our stuff is working well enough. The main thing is to use the ADO GetRows method. I'm on a deadline today, but I'll see if I can get more details on how to make it fast.

Jason
alukindo Posted December 24, 2008

Hi John:

(1) When you say that you use .NET for connectivity, what are you actually referring to? Is it ADO.NET or some other driver library?

(2) Performance measures are relative to some kind of benchmark, so what are your performance metrics right now? How many records can you save per second? How many can you download per second? You may have already reached the practical performance limit for database access if your upload rate is ~300 rows per second and your download rate is ~20,000 rows per second. Reaching those ideal numbers depends on other factors as well:

(a) Whatever method of database access you use, the drivers expose different programming methods for getting at the data. Some use server-side cursors, some lock database rows, some are so-called 'firehose' forward-only methods, etc. Each of these comes with different overhead that impacts your application's performance. Can you shed some light on your high-level programming steps?

(b) How is your database designed? Is it relational? Does it have indexed primary keys? Are you using table joins? Are you using stored procedures with nested queries? These factors affect how fast you can save or recall data. For example, indexing a table column can speed up data access by a factor of 10 to 20 - a query that used to run for 30 seconds can come down to less than 3 seconds.

(c) If you are comparing ADO.NET with ADO, then for LabVIEW you are better off using ADO, simply because the 'GetRows' method is not available in ADO.NET. That is the one critical method that makes it extremely fast to move retrieved data from the driver into a LabVIEW 2D array of strings. With ADO.NET you have to iterate row by row to move data into LabVIEW, which is much slower. Programmers using ADO.NET with WinForms in Visual Studio can quickly 'bind' data to data grids using the DataSet and DataSource properties, which performs as fast as GetRows, but as far as I am aware LabVIEW cannot handle these data types natively, so ADO.NET suffers a large performance penalty under LabVIEW (unless, of course, you use .NET data grids in LabVIEW that can bind to DataSets).

In any case, there are enough unknowns about your implementation that it's hard to say for sure how you could improve your LabVIEW app's performance.

Anthony
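For reference, here is roughly what the GetRows pattern looks like written out as text, shown through the ADODB COM interop in C# (in LabVIEW the same Recordset.GetRows call is made through ActiveX invoke nodes). The connection string and query are placeholders:

```csharp
using System;
using ADODB;  // COM interop for the classic ADO (ActiveX) library

class GetRowsSketch
{
    static object[,] FetchAll(string connectionString, string sql)
    {
        Connection conn = new Connection();
        conn.Open(connectionString, "", "", -1);

        Recordset rs = new Recordset();
        rs.Open(sql, conn, CursorTypeEnum.adOpenForwardOnly,
                LockTypeEnum.adLockReadOnly, (int)CommandTypeEnum.adCmdText);

        // GetRows hands back the whole result set in one call
        // (indexed [column, row]) instead of iterating MoveNext/Fields per row.
        object[,] rows = (object[,])rs.GetRows(-1, Type.Missing, Type.Missing);

        rs.Close();
        conn.Close();
        return rows;
    }
}
```

The point is simply that there is a single bulk-transfer call, which is exactly what ADO.NET does not offer out of the box.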
Rolf Kalbermatter Posted December 24, 2008

I have personally only ever used my own ODBC-based library. I can't really say whether it is fast, since I have no comparison, but it was never slow for what I needed it for. I implemented some specific features in that library to fetch one or more rows at a time, to reduce the network round trips the ODBC driver has to make for a query. There is still room for improvement in that part, but the entire interface is written in C, and that code gets very messy very fast once you try to support not just one specific ODBC driver but stay compatible with as many as possible. If you are doing lots of updates, however, this library wouldn't help much, since those still have to be executed as SQL statements; I haven't found time to add support for parameterized SQL updates, and that is way down the priority list.

The fastest way to access a SQL Server is quite likely through its native API. There is a DLL interface to the Microsoft SQL Server API that avoids any intermediate standardization layers such as ODBC, ADO, DAO, or ADO.NET. I have no idea how easy it would be to interface to that API using the Call Library Node, but I suspect some intermediate code would be required, or at least very helpful, to translate between the SQL Server API and a more friendly LabVIEW API. The disadvantage of this approach is that you will be tied to SQL Server for the rest of your life if you use its proprietary API.

Some remarks on other things mentioned so far: LabSQL, the Database Toolkit, and just about every other LabVIEW database support library out there uses the ADO ActiveX interface. They may implement certain things differently, such as using the mentioned GetRows method or iterating through the rows instead, and they may or may not support certain operations, but the basis of all of them is in fact ADO ActiveX.

As for the missing GetRows method in .NET: I think John is using a technique he got from the mentioned blog posts by Brian Tyler, where the actual extraction is done in an external .NET component written in VB or VC .NET. That avoids exactly the huge overhead of having to iterate through the entire rowset from LabVIEW, which, due to its dynamic interface to .NET, incurs a certain overhead for each invocation of a .NET method.

Rolf Kalbermatter
John Lokanis Posted December 31, 2008 (Author)

Sorry for not replying sooner. Xmas vacation and all...

Yes, I use the trick Brian posted on his blog that allows block transfer of data from the .NET space to the LabVIEW space. This avoids the 'GetRows' issue but required a very small .NET DLL to be written.

Yes, the NI toolkit is very slow, due to its use of the ADO.NET interface and the aforementioned iteration issue.

Most of my communication is via SQL API calls (stored procedures). These SPs aggregate many SQL statements and other logic on the server to provide the data I request (or insert the data I provide). We have a whole department writing these SPs and setting up the SQL Server database, with all the indexes, relational design constructs, keys, etc. I try to leave all that alone as long as I have an SP for everything I need to do.

I do not use ADO.NET; I use SQL.NET, which is like a special version of ADO.NET just for SQL Server and is part of the 2.0 Framework. For this application I am only interfacing to MS SQL Server, and that will likely never change.

I keep the connection open all the time until it times out or is dropped due to some issue. In that case, I close it, log the error, wait, reopen, and try again. I have ~30 open connections on each machine (it's a multithreaded app) and 14 machines running all the time, so I have a potential maximum of ~420 active calls to the database at any given time. Usually the number is much lower, since all the threads are asynchronous and not reading/writing at the same time.

I guess my concern is still the overhead of the LabVIEW -> .NET interface. As was stated, this appears to have been ignored since Brian's departure. I am not up to the task of writing my own C interface at this time. So it sounds like the only option is to try out the LabSQL interface and see if it is faster than what we have now. If not, then I will have to live with my current performance.

Thanks for all the ideas.

-John
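The basic pattern John describes - call a stored procedure, keep the connection open, and on a dropped connection close it, log, wait, reopen, and retry - might look something like this in C#, assuming "SQL.NET" refers to the System.Data.SqlClient provider from the 2.0 Framework. The procedure name, parameter, and single-retry policy below are placeholders, not his actual code:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.Threading;

class SpCaller
{
    SqlConnection conn;
    readonly string connectionString;

    public SpCaller(string connectionString)
    {
        this.connectionString = connectionString;
        conn = new SqlConnection(connectionString);
        conn.Open();   // held open for the life of the worker thread
    }

    public DataTable CallProc(string procName, int testId)
    {
        try
        {
            return Execute(procName, testId);
        }
        catch (SqlException ex)
        {
            // Connection dropped or timed out: log, wait, reopen, retry once.
            Console.Error.WriteLine(ex.Message);
            conn.Close();
            Thread.Sleep(1000);
            conn = new SqlConnection(connectionString);
            conn.Open();
            return Execute(procName, testId);
        }
    }

    DataTable Execute(string procName, int testId)
    {
        using (SqlCommand cmd = new SqlCommand(procName, conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@TestID", testId);  // placeholder parameter

            DataTable table = new DataTable();
            new SqlDataAdapter(cmd).Fill(table);
            return table;
        }
    }
}
```

As with the earlier wrapper sketch, the result would be flattened to LabVIEW-friendly types before crossing back over the .NET boundary.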
xtal Posted January 13, 2009

QUOTE (jlokanis @ Dec 30 2008, 03:09 PM): "Yes, the NI toolkit is very slow, due to its use of the ADO.NET interface and the aforementioned iteration issue."

Hey John, I just wanted to make a quick clarification. The NI Database Connectivity Toolkit uses ADO, the ActiveX version, not .NET, and it does have VIs that make the faster GetRows calls mentioned by others. ADO is also what LabSQL uses, so I doubt LabSQL would be any faster than the NI toolkit. That said, I believe ADO has less overhead than ADO.NET or any of the .NET interfaces; I'd be interested to know for sure whether that's the case. I've been using the .NET interface quite a bit lately with SQL Server, but we are much more interested in data security than speed, so I haven't done any benchmarking. And, like you, we use lots and lots of custom SPs.

Good luck, and please let us know how this turns out.