wallage Posted April 20, 2010
I have trouble working with the huge amount of measured data in my application. I have made a small example of what I am doing to illustrate the problem. For a little background, this is what I'm trying to do: I want to measure 21 analog channels with a sample frequency of 10 kHz and 1k samples/channel over 10k to 25k different situations. This will result in 21 * 1k * 25k = 525M measurements at most (I know it is a lot). All this data must be saved to disk and processed later. The measured data must be of type waveform because later on I want to do FFT calculations on it, and for that I also need the sampling info. As you might already guess, I'm getting a "memory full" error. I have tried saving the measured data to a binary file, which results in a "memory full" error. Then I tried saving it to a TDMS file, which doesn't give a "memory full" error, but saving takes longer than doing the measurement over again. Can someone help me save and process this huge amount of data? Many thanks in advance, arjen
memory full.vi
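A quick back-of-the-envelope calculation shows why holding the whole run in memory fails; the numbers are taken from the post above, and Python is used here only as a neutral illustration (the original code is a LabVIEW VI):

```python
# Rough data-volume estimate for the measurement plan described above.
# "bytes_per_sample = 8" assumes LabVIEW's default DBL waveform data.
channels = 21
samples_per_channel = 1000
setpoints = 25_000          # worst case from the post
bytes_per_sample = 8        # DBL

total_samples = channels * samples_per_channel * setpoints
total_bytes = total_samples * bytes_per_sample

print(f"{total_samples:,} samples")           # 525,000,000 samples
print(f"{total_bytes / 2**30:.1f} GiB raw")   # ~3.9 GiB -- more than a 32-bit
                                              # process can ever hold at once
```

That is why every working approach in this thread boils down to streaming to disk instead of accumulating in RAM.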
Tim_S Posted April 20, 2010
"I want to measure 21 analog channels with a sample freq. of 10 kHz and 1k samples/channel over 10k to 25k different situations. [...]" I'm not understanding this. You have 21 channels. They are sampled at a 10 kHz maximum rate. What are the 10k to 25k situations? You're going to run out of memory with the example code. I'm not clear on what you're doing that would generate so many long waveforms in one file.
kristos_b Posted April 20, 2010
"I have trouble working with the huge amount of measured data in my application. [...] Can someone help me save and process this huge amount of data?" You could save every measurement waveform to a separate file. To avoid having thousands of files in one directory you could use the OpenG ZIP library and archive them on the fly. That works for me, but I never had more than a thousand measurements. I have no experience with TDMS, so I don't know much about the errors you are getting. Rgds, K
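The idea kristos_b describes, sketched in Python for illustration (the original suggestion is LabVIEW plus the OpenG ZIP library): append each measurement's raw samples to a ZIP archive as it is acquired, so only one shot is ever in memory. File names and sizes here are made up for the sketch.

```python
import struct
import zipfile

def save_measurement(archive, setpoint, samples):
    """Pack one measurement as little-endian doubles and add it to the archive."""
    payload = struct.pack(f"<{len(samples)}d", *samples)
    archive.writestr(f"setpoint_{setpoint:05d}.bin", payload)

with zipfile.ZipFile("measurements.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for setpoint in range(3):                                  # 10k-25k in reality
        samples = [0.001 * setpoint * i for i in range(1000)]  # fake waveform
        save_measurement(zf, setpoint, samples)
# Only one 1000-sample measurement is held in memory at a time.
```

One archive entry per setpoint also sidesteps the "thousands of files in one directory" problem Tim_S raises below.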
wallage Posted April 20, 2010 (Author)
"What are the 10k to 25k situations?" Hi Tim_S, I have a LED driver with a 16-bit intensity setting (65535 possible dimming steps). I want to do light measurements over at least 10k out of the 65k setpoints.
Tim_S Posted April 20, 2010
kristos_b has the right of it... you'll need to break your data up into smaller files and avoid loading them all into memory. Windows slows down and will choke once a certain number of files are in a directory (sorry, I don't know the number off the top of my head, and I expect it's different with Win7 and 64-bit). ZIP files and subdirectories are ways to avoid hitting this limit. I don't know your design spec, but saving all that data raises a red flag for me; you may want to go back and evaluate what you really need to save and what your requirements are. Tim
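The subdirectory approach Tim mentions can be as simple as bucketing files by setpoint number. A Python sketch (the 1000-files-per-folder split and all names are arbitrary choices for illustration):

```python
import tempfile
from pathlib import Path

def shard_path(root, setpoint, per_dir=1000):
    """Group measurement files into numbered subdirectories of `per_dir` files each."""
    bucket = setpoint // per_dir
    d = root / f"bucket_{bucket:03d}"
    d.mkdir(parents=True, exist_ok=True)
    return d / f"setpoint_{setpoint:05d}.bin"

root = Path(tempfile.mkdtemp())
p = shard_path(root, 12345)
print(p.relative_to(root))   # e.g. bucket_012/setpoint_12345.bin
```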
JohnRH Posted April 20, 2010
Depending on your time/budget constraints it may be sufficient to max out your computer's memory (4 GB) and eliminate unnecessary programs from running. (How much memory are you running right now?) Either that, or upgrade to a 64-bit machine with 8 GB or more of memory (you would have to use a 64-bit version of LabVIEW). However, there really should be a way to overcome your problems without throwing hardware at them.
wallage Posted April 21, 2010 (Author)
"Depending on your time/budget constraints it may be sufficient to max out your computer's memory (4 GB) [...]" I have 2 GB of physical memory plus 2 GB of swap.
FrankB Posted April 21, 2010
Hi wallage, "... This will result in 21 * 1k * 25k = 525M measurements (I know it is a lot) at max. ..." If you use a waveform with data type DBL you will end up with more than 4 GB of raw data. You can use SGL to cut the needed disk space at least in half. "... I have tried saving the measured data to a binary file which results in a 'memory full' error. ..." In your example you collect all the data in memory before writing it to the binary file at once. Solution: open the binary file, write each measurement - 21 waveforms with their 1000 samples - directly to the binary file, and close the file after finishing all data acquisitions, similar to your TDMS write example. On my machine, which is not the newest one, less than 70 seconds are necessary to store the 1.6 GB binary file (~23 MB/s). I haven't tested the performance of the TDMS stuff for a while - my test program for TDMS is broken at the moment and I am too lazy/busy atm to fix it :-) I prefer the binary write for these simple things anyway. Regards, Frank
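Frank's streaming pattern, sketched in Python for illustration (the original is LabVIEW's binary file VIs): open the file once, write each 21-channel shot as it is measured, never accumulate the whole run. The loop counts here are shrunk for the sketch.

```python
import struct

CHANNELS, SAMPLES = 21, 1000

with open("run.bin", "wb") as f:
    for setpoint in range(5):                    # 10k-25k setpoints in the real run
        for ch in range(CHANNELS):
            shot = [0.0] * SAMPLES               # stand-in for one acquired waveform
            f.write(struct.pack(f"<{SAMPLES}d", *shot))
# The file grows to setpoints * 21 * 1000 * 8 bytes, but RAM use stays flat:
# only one 1000-sample shot exists in memory at any moment.
```

Because every record has a fixed size, any single setpoint or channel can later be read back with a simple seek, which matters for the per-setpoint FFT processing discussed below.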
wallage Posted April 21, 2010 Author Report Share Posted April 21, 2010 (edited) Hi all, Many thanks for all your input so far. I have simplified the "write binary" case a little bit and still I get a memory full error. I don't understand it because in the current situation the VI only uses 160MByte. 2D array of DBL's (64bit) size 10x1000 = 640Kbit This multiplied by 2000 for the 3D array after the for loop makes: 1280Mbit or 160Mbyte. I can't imagine that this will already generate a memory full error. Could there be another reason? memory full.vi Edited April 21, 2010 by wallage Quote Link to comment
FrankB Posted April 21, 2010
"... I have simplified the 'write binary' case a little bit and still I get a memory full error. ..." This happens on my machine too. Of course you have more memory available than just 160 MB, and I don't know the exact reason why LV claims so early that the memory is full. AFAIK the amount of memory you can use in your VI depends on your hardware (physical memory and so on), but I'm not sure how it is calculated. But I still wonder why you want to collect this huge amount of data in memory. Your single measurement shots are small enough to have no negative effect on memory usage. Simply stream the data to a file after each light measurement of a single setpoint. Or is there any need to collect all the measurements as raw values in memory? Regards, Frank
wallage Posted April 21, 2010 (Author)
"But I still wonder why you want to collect this huge amount of data in memory. [...]" Hi Frank, after having acquired all measurements I need to make a 3D plot with: x-axis = dim setpoint, y-axis = frequency, z-axis = FFT value. To make this plot I need all the data (right?). I can calculate the FFT plot for every dim setpoint separately, but in the end I still have all the data in memory, because I need it for my 3D plot.
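Strictly speaking, the 3D plot only needs the FFT magnitudes, not the raw samples, and those are much smaller. A quick size comparison in Python (the 501 bins are the one-sided spectrum of a 1000-sample real signal; storing magnitudes as SGL, 4 bytes, is an assumption in line with FrankB's earlier suggestion):

```python
# Size of what the 3D plot actually needs vs. the raw data behind it,
# for one channel at the worst-case 25k setpoints.
setpoints = 25_000
fft_bins = 501                        # 1000 // 2 + 1 one-sided bins
raw = setpoints * 1000 * 8            # raw DBL samples
plot = setpoints * fft_bins * 4       # SGL magnitudes only

print(f"raw:  {raw  / 2**20:.0f} MiB")   # 191 MiB
print(f"plot: {plot / 2**20:.0f} MiB")   # 48 MiB
```

So computing each setpoint's FFT as it is acquired (or read back from disk) and discarding the raw samples leaves only the plot-sized array in memory.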
Grampa_of_Oliva_n_Eden Posted April 21, 2010
"... to make this plot I need all the data (right?) ..." LV wants contiguous memory to store arrays, so if you don't have a single contiguous block big enough: "memory full". I have avoided this using a number of different techniques that depend on the "data path", but recently I tried out an idea from Aristos Queue to use multiple queues to store the data. I asked the developer I was assisting to create a new queue for each record read from file and to process each separately. A couple of days later he mentioned that it worked great. As to the 3D graph... how many points are you planning to plot? Keep in mind the screen has a limit, and after you map 1000 points to a single pixel you aren't going to see any more than the screen can display. Ben
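Ben's point about screen resolution means the data can be decimated before plotting. A minimal peak-preserving sketch in pure Python (the function name and the min/max scheme are just one common choice; production code would typically use an array library):

```python
def decimate_min_max(samples, max_points):
    """Reduce `samples` to at most `max_points` values, keeping each chunk's
    min and max so narrow spikes stay visible on screen."""
    if len(samples) <= max_points:
        return list(samples)
    chunk = -(-2 * len(samples) // max_points)   # ceil division; 2 outputs per chunk
    out = []
    for i in range(0, len(samples), chunk):
        block = samples[i:i + chunk]
        out.extend((min(block), max(block)))
    return out

data = [((i * 7919) % 1000) / 1000 for i in range(100_000)]  # pseudo-random signal
reduced = decimate_min_max(data, 2000)
print(len(reduced))   # 2000 -- down from 100,000, extremes preserved
```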
FrankB Posted April 21, 2010 Report Share Posted April 21, 2010 (edited) ... a 3d plot with: x-axis = dim setpoint y-axis = frequency z=axis = FFTvalue to make this plot I need all the data (right?). ... I see. Just another thought: You mentioned, that you are getting data from 21 different channels. Is it neccessary to have the data off all channels (and measurement) within a single 3d plot? Or is it ok to have only the FFTs of one channel in one 3d plot and then have the ability of scrolling through the different channels. Maybe have an additional 3d plot for a second channel for data comparison? In that case you can reduce the needed memory by factor 21 or at least 10.5. And scrolling through the channels should be not too boring. The FFTs are calculated real fast. 20k FFT with each having 1024 data points takes less than 3 sec on my machine. Consider to use 2^n data points to use the FFT algorithm instead of DFT. That should be faster if I am correct. (Edit: ok, you need some extra time for loading the raw data for the selected channels. Maybe you can organize the data files per channel to get faster or easier access to the channel data.) Regards, Frank Edited April 21, 2010 by FrankB Quote Link to comment
wallage Posted April 21, 2010 (Author)
"Is it necessary to have the data of all channels (and measurements) within a single 3D plot? [...]" I have looked some more at the data that I want to process, and I found that I only need the raw data of one analog channel for the FFT: the one from the light intensity sensor. The other channels measure current and voltage, and I can average the 1000 samples per channel, resulting in a dramatic decrease in data. I still have to work everything out, but I think I'll be able to work around the memory issue this way. Thanks for all the help so far.
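The reduction described above, sketched in Python: keep the full waveform only for the light channel and collapse each of the other 20 channels to its mean, so each setpoint contributes 1000 + 20 values instead of 21,000 (the channel ordering is an assumption for the sketch).

```python
def reduce_shot(channels):
    """channels: list of 21 sample lists; channel 0 is assumed to be the light sensor."""
    light = channels[0]                                 # keep raw samples for the FFT
    means = [sum(ch) / len(ch) for ch in channels[1:]]  # one value per V/I channel
    return light, means

# Fake shot: channel `ch` filled with the constant value ch.
shot = [[float(ch)] * 1000 for ch in range(21)]
light, means = reduce_shot(shot)
print(len(light), len(means))   # 1000 20
```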