Everything posted by Gratch

  1. I have an application that allows a user to perform a large number of repetitive tests as part of an experiment. Typically 1,000 to 5,000 tests are performed at a time, though recently a colleague performed 40,000 tests in a single experiment. Multiple sets of tests may be performed on one specimen.

     Currently an ASCII summary data file is created for each experiment, and the user has the option to save the data for each individual test, again in ASCII. A typical test may contain anywhere between 1,000 and 5,000+ data points for two channels, depending on the test options selected.

     The two options currently at the top of my list are a database (probably MySQL) or HDF5. HDF5 I have no experience of, other than the PDF file I read this afternoon. MySQL I have some experience of, but I have concerns over the design of the database structure. A DB structure would probably have one table for calibration info and one table for test info, and then I'm undecided on whether to:

     a) have one table for test data, which could potentially end up with hundreds of thousands of data entries,
     b) create a new table for each experiment, or
     c) create a new DB for each specimen, with new test data tables for each experiment.

     My initial guess for an HDF5 structure would be something akin to (b), roughly along the lines of the sketch below, but that is a guess, as at the moment I know very little about it.

     Am I on the right track with any of these?

     Matt
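
     A minimal sketch of what that kind of HDF5 layout might look like, assuming Python with the h5py and NumPy packages; the file, group and dataset names (and the 5,000 x 2 random array standing in for the two recorded channels) are made-up placeholders, not anything from the existing application:

     import h5py
     import numpy as np

     # One hypothetical HDF5 file per specimen: each experiment becomes a group,
     # and each individual test becomes a dataset inside that group.
     with h5py.File("specimen_001.h5", "w") as f:
         exp = f.create_group("experiment_001")           # one group per experiment
         exp.attrs["operator"] = "Matt"                   # experiment-level metadata stored as attributes
         for test_no in range(3):                         # a real run would loop 1,000-40,000 times
             data = np.random.rand(5000, 2)               # placeholder for the two recorded channels
             dset = exp.create_dataset("test_%05d" % test_no,
                                       data=data,
                                       compression="gzip")  # chunked, compressed storage per test
             dset.attrs["points"] = data.shape[0]         # per-test metadata stored as attributes

     Reading one test back is then just h5py.File("specimen_001.h5")["experiment_001/test_00001"][...], without touching the other tests in the file.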