Data structures and how to save them to disk

germ

I have a motor controller that can control 8 different stages. Each motor's step size can be controlled with a parameter called step amplitude. The actual step size in units of length differs for each stage and for each direction. A calibration process exists to calibrate the step size for a given step amplitude. Each axis may be calibrated for different step amplitudes, but those amplitude values and their number may be different for each stage. The 8 stages are grouped into 4 groups of 2 channels.

What is an appropriate data structure to represent the calibration data in a Global VI and how can I save this to disk in a human-readable form?

At first, I assumed that all the axes would be calibrated for the same set of amplitudes. In this case, I can use arrays because all the calibration tables are the same size. I used a 4D array: the first two indices are group and channel, the third is the calibration amplitude index, and the fourth selects either the amplitude itself, the positive calibrated step size, or the negative one. One problem with this is that LV has no built-in way to save a 4D array to a spreadsheet-like format in a text file, and I could not find an easy solution. Is there one?
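For illustration, the "save a 4D array as text" problem is mostly one of layout: the first two indices can be collapsed so each (group, channel) slice becomes its own 2D block. LabVIEW is graphical, so here is a minimal Python stand-in; the array contents are made up:

```python
# Hypothetical stand-in for the 4D calibration array: indices are
# [group][channel][amplitude index][amplitude, +step, -step].
cal = [[[[float(g * 100 + c * 10 + a + v) for v in range(3)]
         for a in range(3)]
        for c in range(2)]
       for g in range(4)]

# Collapse the first two indices: write each (group, channel) slice as
# one tab-separated 2D block, with a blank line between blocks.
with open("cal.tsv", "w") as f:
    for group in cal:
        for table in group:
            for row in table:
                f.write("\t".join("%g" % x for x in row) + "\n")
            f.write("\n")
```

The resulting file holds eight spreadsheet-style blocks, one per stage, which a person can read and edit.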

So now I am thinking either of the following two possibilities:

1) Combine channel and group into a single index (0 to 7). Each stage has its own 2D array of variable size, with three columns (step amplitude, positive and negative step size) and a variable number of rows. Because the arrays vary in size, I cannot use an "array of arrays"; I need a cluster. So I would have a cluster containing eight 2D arrays, and I would write a VI to save those arrays one after the other using the built-in tools for converting 2D arrays to spreadsheet strings.

2) Same as above, but fix the number of amplitude steps at a value larger than I would ever need (say, 10). This way I have eight 2D arrays of the same size and can use an array instead of a cluster. The advantage is that I am dealing only with arrays, so it's easier to write loops for storage and retrieval, and easier to display and handle them in the Globals window.
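In textual form, the difference between the two options is ragged versus padded storage. A minimal Python sketch (all calibration values are made up):

```python
# Option 1: ragged -- one 2D table per stage (rows of
# [amplitude, +step, -step]); tables may have different row counts.
ragged = (
    [[[0.5, 10.0, 9.8], [1.0, 21.0, 20.5]],                      # stage 0
     [[0.5, 11.0, 10.7], [1.0, 22.0, 21.6], [2.0, 45.0, 44.0]]]  # stage 1
    + [[[0.5, 10.0, 10.0]] for _ in range(6)]                    # stages 2-7
)

# Option 2: padded -- fix the row count at a maximum (say 10) and pad
# unused rows with NaN so every stage's table has the same size.
MAX_ROWS = 10
PAD = float("nan")
padded = [table + [[PAD] * 3 for _ in range(MAX_ROWS - len(table))]
          for table in ragged]
```

Option 2 trades some wasted space (and the need for a "this row is padding" convention like NaN) for uniform indexing.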

Thoughts? TIA.

OK - so this may or may not answer your question, but if all you really need is to serialize a data structure and write it to disk in a semi-human-readable form, you can use Flatten To XML and Write to XML File, then Read From XML File and Unflatten From XML to reload the data structure. This doesn't preserve a "spreadsheet" style look and feel. It's not clear to me whether you need a file that can be loaded into a spreadsheet, edited, and reloaded, or just something a human could read to confirm that the values are correct. You could also use tools that serialize to .ini-style files, like the ones from OpenG or MooreGoodIdeas (http://www.mooregood...iteAnything.htm)
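The flatten/unflatten round trip described above can be sketched with the Python standard library (JSON here rather than LabVIEW's XML schema; the structure is a hypothetical example):

```python
import json

# Flatten the whole structure to text, write it to disk in a
# semi-human-readable form, then read it back losslessly.
cal = {"group0": {"ch0": [[0.5, 10.0, 9.8], [1.0, 21.0, 20.5]]}}

with open("cal.json", "w") as f:
    json.dump(cal, f, indent=2)

with open("cal.json") as f:
    restored = json.load(f)
```

As noted, this is readable enough to verify values by eye, but it is not a spreadsheet-friendly layout.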

As far as your data structure, if I read correctly, you could use an array of clusters that contain arrays - so index 0 - 7 would be an array of clusters, each cluster will contain a 2D array of variable size (and other stuff, if needed).
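That "array of clusters, each containing an array" layout maps naturally onto a list of records. A Python sketch with hypothetical field names and values:

```python
# One record ("cluster") per stage, each holding its own variable-size
# calibration table plus whatever other per-stage fields are needed.
stages = [
    {"name": "stage%d" % i,
     "table": [[0.5, 10.0, 9.8] for _ in range(i % 3 + 1)]}
    for i in range(8)
]

# Indexing out one stage and working on its table is then easy:
sizes = [len(s["table"]) for s in stages]
```

Because each stage owns its table, the tables are free to have different row counts.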

If you really need to export as a spreadsheet, edit, and import back into the data structure, it might be easier to use a spreadsheet and an automation interface (like Excel via ActiveX; although I think if you dig around here someone contributed some tools for OpenOffice).

Mark

Mark,

Thanks a lot for your reply.

OK - so this may or may not answer your question, but if all you really need is to serialize a data structure and write it to disk in a semi-human-readable form, you can use Flatten To XML and Write to XML File, then Read From XML File and Unflatten From XML to reload the data structure. This doesn't preserve a "spreadsheet" style look and feel. It's not clear to me whether you need a file that can be loaded into a spreadsheet, edited, and reloaded, or just something a human could read to confirm that the values are correct.

I saw the XML VIs. I think this would work, but it is overkill: I only want the user to be able to take a peek at the configuration file and perhaps edit a couple of numbers by hand on occasion. A (small) 2D array is actually nicely human-readable in a text file.

You could also use tools that serialize to .ini-style files, like the ones from OpenG or MooreGoodIdeas (http://www.mooregood...iteAnything.htm)

Thanks for this pointer. I will take a close look at those VIs.

As far as your data structure, if I read correctly, you could use an array of clusters that contain arrays - so index 0 - 7 would be an array of clusters, each cluster will contain a 2D array of variable size (and other stuff, if needed).

This is useful as well. IIUC, by using an array of clusters, each containing a 2D array (instead of a cluster of 2D arrays), I gain an easy way to index out and process the individual 2D arrays.

What is the array size that would be guaranteed to house all of the data in its largest form? If you know the upper bounds of this, and it's not huge, that's almost certainly your best bet.

How versatile you want this to be in the future will determine how much work you want to put into the concept. Any method of flattening in this case is probably going to be roughly equal, in that it's pretty easy to do and will produce somewhat readable text, but I suspect that reading it back in and getting a consistent, correct result will be dicey once the requirements change. It sometimes comes down to personal preference, but in my case I've found that creating a library of VIs for converting complex LV data structures to and from a well-formed CSV file saves more time in the end.
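The "dedicated conversion library" idea might look like this in Python: one matched pair of to/from routines per structure, with a self-describing header row so a mismatched or outdated file fails loudly (column names are hypothetical):

```python
import csv
import io

# One matched pair of converters per structure: the format lives in
# exactly one place, so a format change only touches these two routines.
HEADER = ["amplitude", "pos_step", "neg_step"]

def table_to_csv(table):
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(HEADER)          # self-describing header row
    writer.writerows(table)
    return buf.getvalue()

def csv_to_table(text):
    rows = list(csv.reader(io.StringIO(text)))
    if rows[0] != HEADER:            # wrong or outdated file: fail loudly
        raise ValueError("unexpected header: %r" % rows[0])
    return [[float(x) for x in row] for row in rows[1:]]

round_trip = csv_to_table(table_to_csv([[0.5, 10.0, 9.8]]))
```

Keeping the reader and writer side by side is what makes the round trip trustworthy when requirements change.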

This should actually be linked into the recently posted question about interview questions one could use to probe an applicant ;). There's not really a singular right answer.

PS I'm wondering at a philosophical level about this statement: [One problem with this is that there is no provision in LV to save a 4D array to a spreadsheet-like format] Do other languages offer something in this regard? Off the top of my head, Python seems like the most likely candidate. It's interesting because it strikes directly into the land of architectural choices, assumptions, sacrifices, and possibly wrestling matches when there's bbq and alcohol involved.

Moldyspaghetti reminds me of the mess you can get into if you have a lot of files that are just flattened data structures and the data structures change between when they were saved and when they are loaded. You get wrapped up in a pile of versions of the data structure and converters to read the old as the new. I'll just leave it as a warning for now.

Ben

This may be getting too far from the original question, but I seem to be on a roll for derailing threads :rolleyes:

If you use the LabVIEW Schema XML Flatten and Unflatten VIs with LabVIEW classes, older versions of a flattened instance can be read into a newer class instance (I'm sure I'm not using the best terminology here). So if the data you want to serialize to file is stored as the data members of a class and the class definition changes, Unflatten From XML still works, reading the old data into the new definition. When the unflatten reads the old XML into the new data structure (the class private data), it ignores any extra data members in the XML (which appear if you deleted or renamed a control in the class data definition) and places the class default data into any controls that exist in the new data definition but aren't found in the XML. By contrast, if you try reading a cluster from an XML file created from an earlier definition, it simply fails.
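The tolerant-unflatten behavior described above (ignore extra members, default the missing ones) can be sketched in a few lines of Python; the key names here are hypothetical:

```python
import json

# Current definition with its defaults; stored files may be older
# (missing keys) or carry members that no longer exist.
DEFAULTS = {"amplitude": 1.0, "pos_step": 0.0, "neg_step": 0.0}

def load_settings(text):
    stored = json.loads(text)
    # keep only the keys we know; fill anything missing with its default
    return {key: stored.get(key, default)
            for key, default in DEFAULTS.items()}

old_file = '{"amplitude": 2.5, "obsolete_flag": true}'  # older file version
settings = load_settings(old_file)
```

Exactly as warned below, the obsolete member vanishes and the missing ones silently pick up defaults, which is handy or dangerous depending on whether you expect it.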

This can be both very handy and very dangerous if you're not aware of this behavior. I point this out here because it's one way to avoid creating the mess Ben warns about. And is another reason one might want to use classes instead of type-defs - wait - that was another thread :frusty:

These replies remind me of something as well.

I absolutely loathe having the loading of settings from a file silently fail to update every relevant data structure. It's quite difficult to provide a full working revision system for settings files that can intelligently choose the proper settings, let alone detect a version number. My approach is usually to have the system complain as loudly and obnoxiously as possible if it fails to load any parameter at all. If the customer is loading an old type of settings file, just moan like a cat that wants to come inside when it starts to rain. This might sound picky, but one can spend an hour or more tracking down unexpected behavior caused by a parameter that wasn't loaded and fell back to a default that isn't what you expected ... and it's that much more brutal when you have assumed the value was correct, since it is in fact a valid entry. There's nothing quite like wasting your time trying to figure out a bug in your software that doesn't in fact exist, but only looks like one because the proper settings weren't loaded.

Less pedantic this time, I like to check all assertion booleans coming out of the INI file readers (or equivalent) and give a nice juicy complaint if any fail to load. And, even though I know this is bad architecture form, this is the one time that I throw error dialogs from a low-level VI because I hate the alternative so much.
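A sketch of that "complain loudly" approach in Python, using an INI-style file: every expected key is checked up front, and all missing ones are reported in a single error (section and key names are hypothetical):

```python
import configparser

# Every key the application expects, by section.
REQUIRED = {"motion": ["step_amplitude", "pos_step", "neg_step"]}

def load_settings(text):
    cfg = configparser.ConfigParser()
    cfg.read_string(text)
    missing = [sec + "/" + key
               for sec, keys in REQUIRED.items()
               for key in keys
               if not cfg.has_section(sec) or not cfg.has_option(sec, key)]
    if missing:  # complain loudly, naming every absent parameter at once
        raise ValueError("settings file is missing: " + ", ".join(missing))
    return cfg

try:
    load_settings("[motion]\nstep_amplitude = 1.0\n")  # two keys absent
except ValueError as err:
    message = str(err)
```

Collecting all missing keys before raising means the user fixes the file once instead of replaying a fail-fix-fail loop.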

(Sorry to the original poster for the digression, but it's an interesting discussion ;) )

I understand exactly where you're coming from, and 99% of the time I agree - I want the app to start kicking and screaming if it can't load the proper configuration. But I have seen a couple of cases where my customers needed to build large, custom configuration files. They then use the app for six months and realize they want to add some new functionality that logically gets configured in that file (which now contains six months of user-defined data they will be royally p****d to recreate). In these few cases the auto-adapting behavior can be a good thing, as long as you, the developer, know to expect it. But with great power comes great responsibility :shifty:

I was able to implement what I wanted using Mark's suggestion: An array of clusters, each cluster containing a 2D array. Because the arrays involved are small, this generates a nice, readable TSV file, which can be edited by a human if needed.

Because the arrays in the clusters are of variable size, I use a delimiter line (e.g. "[Ch1A1]") to mark the end of each array. Writing the settings file is straightforward using the "Convert 2D array to spreadsheet text" and "Write text to file" functions. For reading, I wrote a small VI that parses the lines back into the array of clusters by looking for those delimiters.
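For reference, that delimiter scheme can be sketched in Python (the marker names follow the "[Ch1A1]" example above; the numbers are made up):

```python
# Write each stage's table as tab-separated rows, terminated by its
# delimiter line; read the file back by splitting on those delimiters.
def write_tables(path, tables):
    with open(path, "w") as f:
        for marker, table in tables.items():
            for row in table:
                f.write("\t".join("%g" % x for x in row) + "\n")
            f.write(marker + "\n")

def read_tables(path):
    tables, current = {}, []
    with open(path) as f:
        for line in f:
            line = line.rstrip("\n")
            if line.startswith("["):       # delimiter line closes a table
                tables[line] = current
                current = []
            elif line:
                current.append([float(x) for x in line.split("\t")])
    return tables

write_tables("settings.txt", {"[Ch1A1]": [[0.5, 10.0, 9.8]],
                              "[Ch1A2]": [[1.0, 21.0, 20.5],
                                          [2.0, 45.0, 44.0]]})
back = read_tables("settings.txt")
```

Putting the marker after each block (rather than before) means the reader never needs to know the table sizes in advance.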

As others pointed out, this works only if your settings/ini file is small. And any change to the global variables will necessitate a change to the VIs for saving and reading the settings file. For large, complex files, XML is probably a better solution.
