TDMS to CSV memory issues



I am having an issue with memory when reading a TDMS file and converting it to an array to write to a spreadsheet file. If I watch Task Manager while I run the VI, the first run allocates a large amount of memory, and only a small fraction of it is freed after the VI finishes. Even closing the VI and the project doesn't free the memory; I have to exit LabVIEW entirely before it is released.

Basically, what happens is I open the VI and see the memory increase in Task Manager (duh). Then I run the VI and memory usage climbs greatly (to around 400,000 KB), but when it finishes only about 10 KB is freed. On every subsequent run, only about 10 KB is allocated and then freed. So the memory stops increasing, but LabVIEW still holds on to a big chunk. The reason I worry is that I have many of these subVIs running one after another, and if each one hangs onto 300,000 KB even after it's done running, I will eventually run into memory issues. I wonder if part of it could be that I'm in the development environment, so I may build an exe to check. None of my subVIs have their front panels open, so I don't believe a big array indicator should be holding data and taking up all that memory. Any thoughts or suggestions are appreciated.
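
For concreteness, here is a rough textual sketch of that pattern, rendered in Python with the npTDMS library standing in for the LabVIEW TDMS primitives (the file name, the group/channel layout, and equal channel lengths are all assumptions for illustration):

```python
# Naive pattern: read the entire TDMS file into memory, build one big
# array, and do a single spreadsheet-style write. This forces the whole
# file's worth of data to be resident at once.
import csv
from nptdms import TdmsFile  # third-party: pip install npTDMS

tdms_file = TdmsFile.read("log.tdms")   # whole file loaded here ("log.tdms" is hypothetical)

columns = []
for group in tdms_file.groups():
    for channel in group.channels():
        columns.append(channel[:])      # one full array per channel

with open("log.csv", "w", newline="") as f:
    csv.writer(f).writerows(zip(*columns))   # one big write; rows = transposed channels
```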

Edit: Is my attachment there? My internet is going super slow, so I'm not sure...

Edited by for(imstuck)
Link to comment

Nope, no attachment. If you search NI's forum for TDMS and memory issues you'll find a lot of posts. I'd suggest going through some of those if you haven't already. Other than that, I'd say get your issue as reproducible as possible and send it off to NI to see if they can shed any light on it.

Link to comment

Remember that LabVIEW does its own memory management under the hood. In general, it is lazy about freeing large chunks of memory it has already allocated from the system. Unless the subVIs run simultaneously, I would expect the memory block to be reused amongst them, unless a significant amount of data is being passed out, which would justify keeping the allocation. In that case, it just means you don't have enough memory for what you're trying to do (or that there may be a more efficient way to do it). There is a "Request Deallocation" node you can try, but I don't put much stock in it. Said another way, I trust LabVIEW to handle its own memory.

Based on your phrasing, though, this seems like premature optimization: write a test case you think might cause out-of-memory issues and see whether it actually does.

Link to comment

Well, I ran into some other memory management issues in the meantime: I tried to read all of the sections I cared about into a 2D array and write that array at once, and it failed because there wasn't enough memory to build a 2D array that large. So I'm going to have to write in chunks, but that's a different topic.

I was going to try adding my attachment again, but it's larger than the 10 MB I have available because of the TDMS file. I'll try to recreate a smaller one later so I can put it up. You may be correct that it will just hold this memory and reuse it. I am going to try to get everything working and see whether it's actually an issue (I have to fix what I mentioned in the first paragraph first). It still worries me that closing the VI doesn't free the memory. If you make a VI that just allocates a 100-million-element array of doubles and then close the VI, that memory is freed when the VI is closed. In my VI using TDMS, there are still 400,000 KB that are not freed until LabVIEW itself is exited. I have worked with an AE a bit on this, who suggested using Request Deallocation, which I found doesn't work in this situation.

Link to comment

The problem with comparing your two memory scenarios is that we really have no idea how LabVIEW handles each one. It's easy to theorize about what it should be doing and convince ourselves of what it's probably doing, but that's not always what actually happens. Usually it makes more sense to just test it and see.

One consideration is that the memory might be held by whatever library is used for TDMS. I assume you're flushing/closing your file correctly; the reason the memory persists after closing the specific VI would be that it's not held by a LabVIEW data structure, but by one inside the TDMS library.

As for your attachment, I've had very good luck compressing TDMS files in the past, but I don't remember whether this board allows files with a .zip extension. It might be more useful to see your test case anyway.

Link to comment

I may try posting it to the darkside if it will fit and post a link. There's probably some Rube Goldberg code in there and some things that could be improved, but I will go back and optimize if need be, so feel free to offer suggestions. I should also add that the "other issues" I mentioned were in different VIs, not the one I have posted. This one shows the issue with the memory not being freed.

Edit: I couldn't get it to fit, but here is a Dropbox link. Sorry if this method isn't "kosher," but I was tired of battling it!

http://dl.dropbox.co...SV%20Folder.zip

Edited by for(imstuck)
Link to comment

hoovah is right: LabVIEW does lazy memory deallocation, and that is not really a bad thing. Every round trip to the OS to deallocate memory that will often need to be reallocated a little later is a rather costly operation. The drawback is that once LabVIEW hangs on to memory, it pretty much keeps it for as long as it can, but it usually (barring any bugs) reuses that memory quite efficiently when necessary. This can be bad for other applications if you have a memory-grabbing LabVIEW process resident that you don't want to quit: even though it isn't currently munching on huge data, it doesn't leave the other applications enough memory. But it also has a positive side besides performance: once LabVIEW has managed to get the memory, another application cannot grab it out from under you and make a second run of the memory-hungry LabVIEW operation suddenly complain about insufficient memory.

TDMS is a nice tool, but you have to be aware that it can add up to a lot of data, and reading that data in one big glob can very easily overrun even a modestly configured system's resources.
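
As a rough illustration (not LabVIEW-specific), you can watch this effect from the outside with Python and psutil standing in for Task Manager; how much RSS actually drops after a free depends entirely on the runtime's allocator:

```python
# Watch process memory from the outside, the way Task Manager does.
# Python here is only a stand-in for the LabVIEW process; whether RSS
# drops after the delete depends on the allocator's behavior.
import psutil  # third-party: pip install psutil

proc = psutil.Process()  # the current process
rss_mb = lambda: proc.memory_info().rss / 1024 / 1024

print(f"before:    {rss_mb():6.0f} MB")
data = [float(i) for i in range(10_000_000)]   # allocate a few hundred MB
print(f"allocated: {rss_mb():6.0f} MB")
del data                                       # "free" it
print(f"after del: {rss_mb():6.0f} MB")        # may stay high: the allocator
                                               # can keep the pages for reuse,
                                               # much as LabVIEW does
```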

Link to comment

Thanks for your help, guys. It seems to be working OK. I changed my code so I read from TDMS and write to a CSV file in chunks, rather than reading the whole TDMS file, building a large array, converting it to a spreadsheet string, and doing one file write. Not only does this seem to help the memory management, but I think you were right about LabVIEW being lazy but smart enough to reuse the same buffer: as I exited each subVI there was some deallocation, and after the first subVI ran, although the allocated memory hung around, it didn't seem to increase. Unfortunately, I chased what seems to be a non-existent problem (i.e., "It's easy to theorize about what it should be doing and convince ourselves of what it's probably doing, but that's not always what actually happens. Usually it makes more sense to just test it and see."). Anyway, good to know for future reference!
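
For anyone who finds this thread later, here is a sketch of that chunked read/convert/write pattern, again in Python with npTDMS's streaming reader standing in for the LabVIEW TDMS functions (the file, group, and channel names are illustrative, and real code would need to match the actual channel layout):

```python
# Chunked pattern: stream the TDMS file and append to the CSV as you go,
# so only one chunk's worth of data is in memory at a time.
# Group/channel names ("Group", "Time", "Voltage") are hypothetical.
import csv
from nptdms import TdmsFile  # third-party: pip install npTDMS

with TdmsFile.open("log.tdms") as tdms_file, \
        open("log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for chunk in tdms_file.data_chunks():      # one raw-data chunk at a time
        t = chunk["Group"]["Time"][:]
        v = chunk["Group"]["Voltage"][:]
        writer.writerows(zip(t, v))            # append this chunk's rows
```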

Link to comment
