
LabVIEW Memory Full Error



I've written a large program to control an operation and gather data. This program could eventually need to run for months at a time. The control portion of the program has run for months without errors, but the first test to gather data lasted about four hours before LabVIEW threw a Memory Full error and crashed the program. Now I'm trying to figure out what caused the memory full problem. I really don't think the logging code is the issue, but I was hoping to run it past some experts to make sure I wasn't making a stupid mistake.

I'm writing to a text file with an .xlsx extension, mostly for easy access and readability for several people. I know the file is going to be large. During most of the operation I'm taking data about every 5 minutes, but during the ramp-down cycle I'm gathering data every second for anywhere from 5 to 30 minutes.

 

I start by initializing a VI with the file path and initial data:

[screenshot: Init.png]

Then I add data to the file during the program:

[screenshot: Log Data.png]

Finally I close out the file (If I ever get here!!!)

[screenshot: Close.png]

 

Does anyone see any obvious stupid mistakes, or any reason this might cause a memory full error?

Thanks for looking...

Bob Harmon


2 hours ago, Michael Aivaliotis said:

Stupid question. Why don't you open a file handle once then just keep the file handle open while writing, then close it at the end of the program? When you write to an open file handle, the file pointer stays at the last write point.

Yeah, this is what I would do too. I'm assuming the goal is to make sure that if the program aborts the file isn't corrupt, but you can do that with flush file. If you're taking data every second the continuous open/seek/write/close/open... is going to take a toll.
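To make the two patterns concrete, here is a minimal Python sketch (Python stands in for the LabVIEW file primitives; the function names are illustrative, not any real API). The first function is the open-once / flush / close-at-the-end pattern being recommended; the second is the open-per-write pattern being discouraged:

```python
import time

def log_open_once(path, samples, interval_s=0.0):
    """Recommended pattern: open the file once, write each sample,
    flush so pending data reaches the OS, and close only at the end."""
    with open(path, "a", encoding="utf-8") as f:
        for sample in samples:
            f.write(f"{sample}\n")
            f.flush()  # analogous to LabVIEW's Flush File: data is safe even on abort
            time.sleep(interval_s)

def log_open_per_write(path, samples, interval_s=0.0):
    """Pattern being discouraged: open/seek/write/close on every sample.
    Produces the same file, but pays file-open overhead per data point."""
    for sample in samples:
        with open(path, "a", encoding="utf-8") as f:
            f.write(f"{sample}\n")
        time.sleep(interval_s)
```

Both produce identical files; the difference is overhead per sample, which matters once you log every second.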

9 hours ago, rharmon@sandia.gov said:

The control portion of the program has run for months without errors, but the first test to gather data lasted about four hours before LabVIEW threw a Memory Full error and crashed the program.

What's the exact error you get? The modal out-of-memory popup dialog, or an error code? How big is the file at the time of the crash? Bigger than 2 GB? Are you using 32-bit LabVIEW on a 32-bit machine? Bigger than 4 GB? Are you using 32-bit LabVIEW on a 64-bit machine? And so on. In theory, opening a file shouldn't need to read the whole file into memory, but 🤷‍♂️

The easiest way to isolate this, though, is to generate a representative chunk of fake data and write it to a file in a loop relatively quickly (think every 150 ms rather than every second). If it's the file code that's the problem, you should see the error within an hour.
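A sketch of that isolation test in Python (the callables are placeholders for your own logging code and fake-data generator, not LabVIEW APIs): run the logging path at 150 ms instead of 1 s so a leak that took four hours to surface shows up in well under an hour.

```python
import time

def stress_test_logger(write_fn, make_fake_record, iterations=24000, period_s=0.15):
    """Drive the logging code far faster than production.
    24000 iterations at 150 ms is ~1 hour, covering more writes
    than the real test performs in days."""
    for i in range(iterations):
        write_fn(make_fake_record(i))  # exercise only the file-writing path
        time.sleep(period_s)
```

If this loop alone reproduces the out-of-memory error, the leak is in the file code; if not, look elsewhere in the application.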

2 hours ago, rharmon@sandia.gov said:

I'm always looking for better ways to crack a nut... How would you have written it differently?

I would just say it's kind of bad form to sequence things like that: it's much harder to read (as this thread shows) and much easier to make a mistake (for example, if you didn't have that clear-error in there, or if you added another case but didn't wire the path through [and on that topic, linked tunnels are your friend]).

Since we're talking about style, I'd personally do this:

  • Take most of the bulk of case i=1 and make a subVI called "create file name" and a subVI called "create log header".
  • I definitely would not build the path manually as you are doing with the Concatenate Strings function -- I'd use the Build Path VI.
  • I'd wrap the folder check and the create-file function into a reusable subVI -- I've never personally found a use for the default behavior where I ask the OS "make me a/file/in/a/folder.txt" and it responds "sorry bro, that folder doesn't exist", so I always bundle the two together.
  • The timestamp formatting section can be improved as well.

If you do these things, your create-file case becomes just a few nodes and takes up only a small amount of space on the diagram. Plus, you now have a "create file" function you can reuse on future projects, and with some tweaking a "create standard file name" function you can use in the future as well.

9 hours ago, rharmon@sandia.gov said:

I'm writing to a text file with an .xlsx extension, mostly for easy access and readability

Excel may be smart enough to handle this, but you should probably set the extension to CSV, TSV, or TXT -- xlsx is in fact a zip file with a specific internal format, so writing plain ASCII text to it should not work. If you want to make a proper xlsx file (i.e., an Excel file) you can use either the Report Generation Toolkit (kind of slow; it uses ActiveX to connect to an Excel instance, so Excel has to be installed on your DAQ computer) or an add-on toolkit like XLR8: http://sine.ni.com/nips/cds/view/p/lang/en/nid/212056

If you have LabVIEW 2018 and are at least familiar with Python, you can use the Python node plus a small helper method to write to an Excel file using xlsxwriter or openpyxl.
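Following the simpler advice above (plain-text extension instead of .xlsx), here is a minimal sketch using only Python's standard-library csv module; Excel opens .csv files directly, and no extra toolkit is needed. The function name is illustrative:

```python
import csv

def append_csv_row(path, row):
    """Append one row of values to a CSV file that Excel opens directly.
    newline="" lets the csv module control line endings itself, which
    avoids blank rows when the file is opened on Windows/Excel."""
    with open(path, "a", newline="", encoding="utf-8") as f:
        csv.writer(f).writerow(row)
```

Each call appends one properly quoted, comma-separated line, so the file stays readable by both Excel and humans.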

Edited by smithd
9 hours ago, smithd said:
Yeah, this is what I would do too. I'm assuming the goal is to make sure that if the program aborts the file isn't corrupt, but you can do that with flush file. If you're taking data every second the continuous open/seek/write/close/open... is going to take a toll. 

I normally follow the open once, keep the reference and flush while writing, close at the end pattern as a rule too, but there is one argument for the piecemeal way: the file is not locked for writing by the OS in between writes. You never know what an end user might want to do, like editing the file externally during the test.

Edited by ensegre

Do you have the Desktop Execution Trace utility?  The first thing I would do if I had an out-of-memory would be to use this to look for "leaked" references.  I once had a failure after 4 weeks of running that was caused by an unclosed .NET reference in a once-a-second loop.


Are you sure it comes from that VI?

As I understand it, this VI is called with a different function on every call, and the one that will be called most often is "Log Data".

So I strongly suggest creating a unit test that calls your VI many times with that function, to validate whether the leak comes from there. But I am pretty sure it does not...

What is the rest of your project doing? Another question: what is the size of the generated file?

Benoit


I want to thank everyone for taking the time and effort to help me here. It's going to take me a few to digest all the comments, but I'll try to answer some suggestions now.

1. Normally I would hold the reference and just keep the file open while writing, then close the reference when done. That said, in this case I was worried about the size of the file and wasn't sure whether keeping the file open might itself cause an error. I should have tested this theory offline.

2. Question, what does the flush file do? How does this help me?

3. This is a 32-bit machine, and as usual I never even considered the restrictions. I'll look into this. I think it was the normal modal LabVIEW error popup, but it's gone now.

4. The Desktop Execution Trace is a great idea... I'll do that.

5. Not sure the leak comes from this VI, but it was the one I was most worried about in this portion of the code. Almost all of the code in this portion has been used and run for months without errors. But my problem could come from anywhere... I hope the Desktop Execution Trace utility helps me here.

 

Again... I truly appreciate the input. I'm sure it will improve my LabVIEW coding... Thank you all.

10 hours ago, rharmon@sandia.gov said:

2. Question, what does the flush file do? How does this help me?

It forces the OS to finish any pending writes. Closing the file does the same thing. So if the concern that led you to continuously open and close the file was ensuring your data was safely on disk rather than pending, flush would give the same result. It sounds like that was not your concern, so flush has no benefit to you.
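For reference, the same idea in Python terms (a sketch of the general flush semantics, not LabVIEW's implementation): there are two layers of buffering, and "safe on disk" means pushing through both.

```python
import os

def durable_write(f, text):
    """Write, then make sure the bytes actually reach the disk:
    flush() pushes the program's buffer to the OS, and fsync()
    pushes the OS cache to the storage device. Closing the file
    implies a flush, which is why close-per-write also works."""
    f.write(text)
    f.flush()
    os.fsync(f.fileno())
```

With this, the file handle can stay open for the whole run while every sample is still crash-safe once the call returns.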

