Showing results for tags 'binary'.
Edit: found this in the context help: "Arrays and strings in hierarchical data types such as clusters always include size information." http://forums.ni.com/t5/LabVIEW/Write-to-Binary-File-Cluster-Size/td-p/3199224

I have a cluster of data I am writing to a file (all different types of numerics, and some U8 arrays). I write the cluster to the binary file with "prepend array or string size" set to false. It seems, however, that some additional data is included anyway (probably so LabVIEW can unflatten the data back into a cluster).

I have verified this by unbundling every element, type casting each one, getting the string lengths of the individual elements, and summing them all: the result is the expected number of bytes. But if I flatten the whole cluster to a string and take its length, it is 48 bytes larger, and that matches the file size.

Am I correct in assuming that LabVIEW is adding some additional metadata for unflattening the binary file back into a cluster, and is there any way to get LabVIEW to not do this?
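The context-help quote above explains the extra 48 bytes: inside a flattened cluster, every array (and string) gets a 4-byte I32 element count prepended, regardless of the "prepend array or string size" setting on Write to Binary File (that setting only affects top-level arrays/strings). As a rough sketch of that layout, here is a hypothetical Python parser for such a blob; the field list, helper name, and the big-endian byte order are assumptions for illustration, not the poster's actual cluster:

```python
import struct

def parse_flattened_cluster(data, fields):
    """Parse bytes laid out like a LabVIEW-flattened cluster.

    `fields` is a list of ('scalar', struct_format) or
    ('array', element_size_in_bytes) entries. Scalars are packed
    back-to-back; each array inside the cluster is preceded by a
    4-byte signed element count (assumed big-endian here).
    """
    out, off = [], 0
    for kind, spec in fields:
        if kind == 'scalar':
            out.append(struct.unpack_from('>' + spec, data, off)[0])
            off += struct.calcsize('>' + spec)
        else:  # array of fixed-size elements, preceded by an I32 count
            n = struct.unpack_from('>i', data, off)[0]
            off += 4
            out.append(list(data[off:off + n * spec]))
            off += n * spec
    return out

# Example: cluster of (U16, U8 array of 3 elements, DBL)
blob = (struct.pack('>H', 7)
        + struct.pack('>i', 3) + bytes([1, 2, 3])
        + struct.pack('>d', 2.5))
print(parse_flattened_cluster(blob, [('scalar', 'H'), ('array', 1), ('scalar', 'd')]))
# → [7, [1, 2, 3], 2.5]
```

If the cluster contains twelve U8 arrays, twelve 4-byte counts would account for exactly 48 extra bytes. To avoid the headers entirely, one option is to write each element separately instead of writing the cluster as a whole.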
Hi, I created a binary data file that stores cluster data. It is a cluster of 3 elements:
1. Unsigned byte - 8-bit integer
2. Double - 64-bit real
3. Double - 64-bit real

This cluster of 3 elements actually represents a waveform graph w.r.t. time, which I saved as a binary file. (For the .dat file, click here: http://www.mediafire.com/?u4c1y9iho2b5qoe )

Now I want to read this binary file in LabVIEW and display it in the form of a graph. I cannot post the VI that creates the binary file because it requires the Control System and Fuzzy toolkits, but here is the parameter I used: Byte Order - Little Endian.

See the attachment for my attempt at reading the binary file. Please help me with reading the .dat file I linked above. If there is anything else I need to tell you, let me know. Thanks. read.vi
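In LabVIEW, the usual approach is to wire a cluster constant of the same three types (U8, DBL, DBL) to the "data type" input of Read from Binary File, set the byte order to little-endian, and read a count of -1 to get an array of clusters. As a language-neutral sketch of what that file layout implies, here is a hypothetical Python reader; the record format (tightly packed U8 + two DBLs, no size headers) is an assumption about how the file was written, not something confirmed from the .dat file itself:

```python
import struct
import io

# One record: U8, DBL, DBL, little-endian, no padding (17 bytes)
REC = struct.Struct('<Bdd')

def read_records(stream):
    """Yield (u8, double, double) tuples from a binary stream of
    tightly packed little-endian records. If the writer prepended
    array sizes or other headers, the offsets here would be wrong."""
    while True:
        chunk = stream.read(REC.size)
        if len(chunk) < REC.size:
            break
        yield REC.unpack(chunk)

# Usage with an in-memory stand-in for the .dat file
buf = io.BytesIO(REC.pack(1, 0.0, 3.5) + REC.pack(2, 0.1, 3.6))
print(list(read_records(buf)))
# → [(1, 0.0, 3.5), (2, 0.1, 3.6)]
```

Dumping the first few records this way is also a quick check of whether the doubles come out as sensible values; garbage numbers usually mean the byte order or the record layout guessed above does not match the file.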