dblk22vball Posted October 9, 2009

I am working on an application that will run a series of tests on a part (8 tests right now), and manufacturing/quality would (in an ideal world) like all of the data to be searchable, with weekly reporting. In the past I have just used a text or datalog file to save the results, since I had only 10 values per UUT, but this application will need to save at a minimum 60 results per UUT (even if I only save the identification information and the "evaluated" values: 7 sensors, 8 tests). We will also be testing thousands per year, maybe 100 of them per day, so my text/datalog file would get huge very rapidly.

I am currently saving my information to one TDMS file, with the group name being the timestamp of the start of the test to differentiate each UUT (we are working on serial numbers, since there will be multiple fixtures in the future). This works OK, but I am not sure it is the best solution for a searchable, data-mining application WITHOUT USING DIADEM. I am not opposed to DIAdem (I actually think it is great), but since there will be 10+ people who would like to see the data, the company does not want to shell out hundreds of dollars per computer for them to have access to DIAdem. And I do not feel like being the "search person" who has to get the data for everyone.

I thought of going with an SQL database, but this seemed very cumbersome for saving large amounts of data, i.e. streaming a test. I am also a novice at SQL, and IT does not use it often and does not want me to have access to their server, so I would have to maintain and control my own server. We use the DSC system on another application, so we do have that option, but again, that is another $800 run-time license for each test station we have.

Can I get your 2 cents (or dollar) on which one you think might be the best overall solution?
asbo Posted October 9, 2009

> Can I get your 2 cents (or dollar) on which one you think might be the best overall solution?

Based on your description of how much data there's going to be and the interface requirements, I would agree that SQL would be a good solution - if you think it won't suck up too much of your time getting acquainted. Not that I have any experience with it in LabVIEW, but you could have a backend set up for this in less than an hour. I'd use MySQL (unless you find that another flavor is more suited to your task) and throw up a package like phpMyAdmin on a basic Apache install - there are guides everywhere for Apache+PHP installs. If you don't want people to be able to modify the database, you can create a credential for just viewing the data. phpMyAdmin is designed for use by people who know SQL, but I believe the interface does allow more basic use. It's worth checking out if you want more of a turn-key interface.

Like I said, I've yet to actually use SQL in LabVIEW, so hopefully someone else can chime in on how troublesome that will be. I think more than anything it's going to come down to just how much motivation you have to not shell out for licenses.
ShaunR Posted October 9, 2009

I had to make this decision a while ago... and never looked back. I set up my own "test" network which I got IT to link to, so that anyone with a browser and a login on their network could view test results and other test information. This is still in use and is incredibly scalable.

I set up an old PC as a web server using XAMPP. It's a fully functioning web server including Apache, PHP, CGI, SQL, mail, SSL and, well, everything. But more importantly it is really easy to install and FREE. About 10 minutes and you're up and running.

Part of the XAMPP installation is an SQL server. I set up a set of "test" databases (we had 3 product lines at the time, so I decided to have a different one for each - a personal choice to partition the data better. You could have one, or you could have one for each station; it's up to you how you set it up). The test stations (18 of them, 6 per line) all pushed their test data into the various databases using the LabVIEW SQL toolkit. I just made a drop-in module for the existing test stations so that instead of writing the results to a file, they wrote them to the database. I then wrote a few web pages (started as two, now about 80 and quite sophisticated, with graphs, process capabilities, searchable queries, etc.) that queried the databases so that any user could view the results in a web browser.

The thing about this setup was that it operated perfectly fine on its own as a standalone system, but once it was up and running, IT really took an interest (especially after the big guns wanted to see the pretty graphs on their laptops in the office) and they have now taken responsibility for managing and updating the web server. From my point of view (and IT's) it creates a definable partition between the test network and theirs, and it is a technology they understand. The big guns loved it because there was no capital cost (well, apart from the SQL), although I did persuade them that they should make a donation.
dblk22vball Posted October 9, 2009

Thank you to both. My question with SQL is: how did you push all your data to the database, i.e. in my case 60 data points for each test? From my reading, there is not an easy way to send an array of data to the database.

The basic data that I am putting in the database for each UUT is: Fixture #, Employee #, Result, Serial # (a future add-on, but I am planning for it), and the data (a minimum of 56 points).

I do like the web page idea, but I would have to get IT involved in that - I am an HTML idiot. Thanks guys.
asbo Posted October 9, 2009

> My question with SQL is: how did you push all your data to the database, i.e. in my case 60 data points for each test? From my reading, there is not an easy way to send an array of data to the database.

I see two ways to approach that:

1) Insert the data into the database as you read those points (maybe write the data to a buffer which commits it asynchronously, at a lower priority than your test, in the background).
2) Write a wrapper which expects an array of data points and have it loop through the array, committing each point (a rough sketch of what that could produce is below).

There are ways to save arrays to a database (serialization comes to mind), but I have never found a good reason to do so. It adds a lot of overhead and there is always a better approach to storing the information.

> I do like the web page idea, but I would have to get IT involved in that - I am an HTML idiot.

Like I mentioned, you could use phpMyAdmin, but that necessitates the people who are using it being fairly intuitive and competent. Definitely not the solution to use if big wigs are going to try and review the data.
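For illustration only, here is what the wrapper in option 2 might emit as it loops over the array. The table and column names (measurements, test_id, sensor, value) and the numbers are placeholders invented for this sketch, not anything agreed in the thread; the syntax shown is MySQL-flavored:

    -- One INSERT per array element...
    INSERT INTO measurements (test_id, sensor, value) VALUES (42, 'Cold PSI', 45.2);
    INSERT INTO measurements (test_id, sensor, value) VALUES (42, 'Mixed PSI', 2.6);

    -- ...or the same points batched into a single multi-row statement:
    INSERT INTO measurements (test_id, sensor, value)
    VALUES (42, 'Cold PSI', 45.2),
           (42, 'Mixed PSI', 2.6),
           (42, 'Hot PSI', 44.3);

Batching the rows into one statement keeps the number of round trips to the server down, which matters if the commit happens while the test is still running.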
ShaunR Posted October 9, 2009

> Thank you to both. My question with SQL is: how did you push all your data to the database, i.e. in my case 60 data points for each test? From my reading, there is not an easy way to send an array of data to the database. The basic data that I am putting in the database for each UUT is: Fixture #, Employee #, Result, Serial # (a future add-on, but I am planning for it), and the data (a minimum of 56 points). I do like the web page idea, but I would have to get IT involved in that - I am an HTML idiot. Thanks guys.

SQL is all string manipulation, e.g.

    INSERT INTO TestDB (Fixture, Employee, Serial) VALUES ('No1', 'Brown', '12345')

will insert a new entry and populate the "fields" Fixture, Employee and Serial with the values No1, Brown and 12345. (See, it's not that hard.)

For the data, I would use a TEXT field (65,536 chars max, which should be plenty for most apps) and format my array as comma-delimited floating-point values. This makes it much easier to get back out and much, much easier to debug, e.g.

    INSERT INTO TestDB (RawData) VALUES ('10.123456,3.1212134,12.11223344')

If any calculations are to be made on the data (e.g. mean values, standard deviation, noise, max-min, etc.) you can either post-process it in the report query (harder) or save it to the DB as a separate field (easy).
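Putting those two snippets together, a complete statement for one UUT in this style could look something like the sketch below. The column list is only a guess at the fields described above (Fixture, Employee, Serial, Result, plus the delimited raw data), not a schema anyone in the thread has actually settled on:

    -- Sketch: one flat table, sensor data kept as a comma-delimited TEXT column.
    CREATE TABLE TestDB (
        Fixture  VARCHAR(16),
        Employee VARCHAR(16),
        Serial   VARCHAR(32),
        Result   VARCHAR(32),
        RawData  TEXT
    );

    INSERT INTO TestDB (Fixture, Employee, Serial, Result, RawData)
    VALUES ('No1', 'Brown', '12345', 'Pass',
            '10.123456,3.1212134,12.11223344');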
dblk22vball Posted October 9, 2009

I am with you so far. For the comma-separated part, everything would be in one giant one-line string, correct? So it would be 1,2,32,4,5, and then you take another data point and it becomes 1,2,32,4,5,6,7,8. So when I want to display it in a table in LabVIEW (or on a graph), I would have to parse the string, and every 4th value is in the same column, i.e. 1 and 5, 2 and 6.
Chris Davis Posted October 9, 2009

You might want to check out the w3schools tutorial on SQL. They also have one on HTML, if you're interested. You may also want to study up on relational databases (sorry, no tutorial link here); square tables are just the most basic use of a database.
dblk22vball Posted October 9, 2009

Never mind, just use the Array To Spreadsheet String VI...
ShaunR Posted October 9, 2009

> You might want to check out the w3schools tutorial on SQL. They also have one on HTML, if you're interested. You may also want to study up on relational databases (sorry, no tutorial link here); square tables are just the most basic use of a database.

Indeed. But for an introduction to DBs and SQL it will be perfectly adequate and will achieve what is required. You gotta open the door before running through it.
jzoller Posted October 9, 2009

Problems to solve in advance:

- What are you going to do when the database is down?
- What are you going to do if the test station doesn't have a network connection?
- How are you going to back up (and restore, if necessary) the database?

Don't get too caught up in the SQL side before you work these out. I know from personal experience that it's painful.

Joe Z.
asbo Posted October 9, 2009

> For the data, I would use a TEXT field (65,536 chars max, which should be plenty for most apps) and format my array as comma-delimited floating-point values. This makes it much easier to get back out and much, much easier to debug.

This is a horrible abuse of SQL datatypes. If there are four values per point, you should have four columns of data per record, one record per point. Tie all the points in a set together with an additional column; it could be as simple as a TIMESTAMP. (Caveat: you should pick a timestamp before inserting any records and then use that pre-generated timestamp in each record; otherwise, the default behavior for a TIMESTAMP field is to use the time of insertion.) As for ease of debugging, just SELECT * FROM `table` WHERE `timestamp`=(the data set you're interested in). A sketch of that layout is below.

If you clarify your data structure a bit more, I could give you a better recommendation.
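To make that concrete, here is a minimal MySQL-flavored sketch of the layout being suggested: one row per point, several value columns, and all rows of a run sharing one pre-generated timestamp. Every table and column name, and the sensor columns chosen, are invented for the example rather than taken from the poster's actual setup:

    -- Sketch: one record per reading; every row of a test run carries the
    -- same run_ts value, generated once before the first insert.
    CREATE TABLE results (
        run_ts     TIMESTAMP,
        cold_psi   DOUBLE,
        mixed_psi  DOUBLE,
        hot_psi    DOUBLE,
        flow_rate  DOUBLE
    );

    INSERT INTO results (run_ts, cold_psi, mixed_psi, hot_psi, flow_rate)
    VALUES ('2009-10-08 15:13:16', 105.0, 45.2, 2.6, 44.3);

    -- Pull back everything belonging to one test run:
    SELECT * FROM results WHERE run_ts = '2009-10-08 15:13:16';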
jzoller Posted October 9, 2009

> This is a horrible abuse of SQL datatypes.

Agreed. Because of how databases work, it destroys your query efficiency.

---

I got a bit rushed in my earlier answer, so here are my personal database rules of thumb. Maybe they'll be helpful:

- Write data twice: once as simple raw text to the local hard drive, and once to the database. Each test has a timestamp or ID that uniquely identifies it.
- Split text files by day or week so they don't get too large.
- Every so often, the database compares its records to those on my tester(s) and matches up any that were missing.
- The database schema should not exceed two tables on the first-pass design. Make it complicated sometime later (a sketch of what two tables might cover is below).

For specific tools, I would probably use MySQL (with plans for an Apache web server) for a low-cost version, and SQL Server (with plans for an IIS web server) if I could afford it.

Joe Z.
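As an illustration of the two-table rule of thumb, a first pass for this kind of tester might split into a header table and a readings table, roughly as below. This is purely a sketch in MySQL syntax; every name is made up for the example, and the weekly-report query is only one way such a report could be written:

    -- Sketch: one header row per test run, one child row per sensor reading.
    CREATE TABLE test_runs (
        test_id   INT PRIMARY KEY,
        run_ts    DATETIME,
        fixture   VARCHAR(16),
        employee  VARCHAR(16),
        serial    VARCHAR(32),
        result    VARCHAR(32)
    );

    CREATE TABLE test_readings (
        test_id   INT,          -- matches test_runs.test_id
        step      VARCHAR(32),  -- which of the 8 tests
        sensor    VARCHAR(32),  -- which of the 7 sensors
        value     DOUBLE
    );

    -- Example weekly report: failures from the last 7 days with their readings.
    SELECT r.serial, r.result, d.step, d.sensor, d.value
    FROM test_runs r
    JOIN test_readings d ON r.test_id = d.test_id
    WHERE r.result <> 'Pass' AND r.run_ts >= NOW() - INTERVAL 7 DAY;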
jcarmody Posted October 9, 2009

> My question with SQL is: how did you push all your data to the database?

I tend to dislike TestStand, but it's UNBEATABLE when it comes to putting your data into a database. I did it once and it was so easy (almost trivial).
Shaun Hayward Posted October 9, 2009

> SQL Server (with plans for an IIS web server) if I could afford it.

One thing to bear in mind is that SQL Server now comes in a free "Express" version. It has some limitations, such as a maximum database size of 4 GB, but in our test systems I have never hit the limits. What's also nice is that if those limits are hit in the future, it is seamlessly compatible with the other, not-so-free versions of SQL Server, meaning that you can easily upgrade the back end and your test system software will not even know the difference.

Shaun
ShaunR Posted October 9, 2009

> This is a horrible abuse of SQL datatypes. If there are four values per point, you should have four columns of data per record, one record per point. Tie all the points in a set together with an additional column; it could be as simple as a TIMESTAMP. As for ease of debugging, just SELECT * FROM `table` WHERE `timestamp`=(the data set you're interested in). If you clarify your data structure a bit more, I could give you a better recommendation.

It is. But it keeps things simple, easy to understand, and trivial to migrate to the "proper" way (using stored procs) once a little more experience is gained. PostgreSQL has variable-length array datatypes (and then you could use the correct datatype), but MySQL doesn't.

> Problems to solve in advance: What are you going to do when the database is down? What are you going to do if the test station doesn't have a network connection? How are you going to back up (and restore, if necessary) the database? Don't get too caught up in the SQL side before you work these out.

These are very, very important considerations.
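For reference, this is roughly what the PostgreSQL array approach looks like; it is only a sketch with invented table and column names, and it has no equivalent column type in MySQL:

    -- PostgreSQL only: raw readings stored in a native array column.
    CREATE TABLE test_results (
        serial   VARCHAR(32),
        readings DOUBLE PRECISION[]
    );

    INSERT INTO test_results (serial, readings)
    VALUES ('12345', ARRAY[10.123456, 3.1212134, 12.11223344]);

    -- Individual elements are still reachable by (1-based) index:
    SELECT readings[1] FROM test_results WHERE serial = '12345';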
dblk22vball Posted October 9, 2009

Thank you all for your replies.

As far as my data structure goes, here is a sample of what I am currently saving in a text document. I am also saving this to a TDMS file as well (so to answer the write-twice comment: yes I am, and plan to). This is from my text doc, tab delimited. Currently it has the following data: Timestamp, Pass/Fail, Reason (for failure), Employee #, and then all of the sensor data (I only copied part of it, otherwise the post would get huge for no reason):

    Cold PSI  Mixed PSI  Hot PSI  Flow Rate  Cold Temp  Mixed Temp  Hot Temp

    10/8/2009 3:13:16 PM  Pass  Pass  105  45. 2.6 44. 3.9 52. 104 145 156 155 154 0.1 51. 104 154 44. 2.7 44. 4.0 48. 107 159 44. 2.6 44. 4.1 44. 104 167 49. 0.3 6.0 0.8 42. 96. 161 44. 2.7 44. 3.9 41. 107 161 5.3 0.6 49. 0.4 40. 110 167 45. 2.8 45. 3.6 40. 103 161

    10/9/2009 8:12:53 AM  Timed Out  Failed Cold Seat Return  105  45. 2.8 45. 4.1 64. 106 145 161 159 155 0.1 57. 107 153 45. 2.6 45. 4.1 47. 107 159 45. 2.6 45. 4.1 41. 106 167 47. 0.0 6.9 0.8 39. 76. 155 47. 1.2 25. 3.0 37. 103 152

As you can see, the first UUT passed whereas the second one did not, so the second test does not have as much data (the fixture stops the test if the current test fails).

And I have been doing this in the free Express version of SQL Server.
Ton Plomp Posted October 11, 2009

Another tip: when you are using MySQL (or any other server) on Linux, Webmin is a great tool to manage the server. For MySQL you can create backups and restore them, and view data or table definitions.

Ton