
Determining free memory



Hi

I have a large binary file (an array of clusters) that is too big to load at once. "Read from Binary File" generates an error (code 2) and a "Not enough memory to complete this operation" message box pops up. OK, so I want to load as much as possible (with a reasonable margin). How can I determine the largest array I can allocate right now (at this moment, not in general)? What I have already tried:

  1. Memory manager functions to determine free memory (AZFreemem, AZMemStats). They always return 0 (I found out they are Mac-specific functions).
  2. Allocating a big memory block with the memory manager, decreasing its size each time the allocation fails (and freeing it again on success). The largest allocatable block found this way was always much larger than the array I was actually able to initialize (attempting to initialize too large an array pops up the message box, but gives no programmatic feedback).
  3. Sequentially decreasing the array size wired to "Read from Binary File" and parsing the error cluster. The message box pops up on every trial (unavoidable), and sometimes no error is generated but the loaded array is empty.

So what is the best method to programmatically determine the largest possible array?

LabVIEW 8.2
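For reference, the probing approach from item 2 can be sketched outside LabVIEW in Python (a hypothetical helper, not LabVIEW code; note that on operating systems that overcommit memory, an allocation can "succeed" without the pages being usable, which mirrors the mismatch described above):

```python
def probe_largest_allocation(start_bytes, min_bytes=1 << 20):
    """Halve the requested size until one contiguous allocation succeeds.

    Caveat: on systems that overcommit memory, a successful allocation
    does not guarantee the block can actually be filled, so the probe
    can report a larger size than is really available.
    """
    size = start_bytes
    while size >= min_bytes:
        try:
            block = bytearray(size)  # attempt one contiguous allocation
            del block                # free it again on success
            return size
        except MemoryError:
            size //= 2               # shrink and retry
    return 0
```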


Welcome to my world...

I love LabVIEW dearly and will defend it as a great all-around programming language, except in one area -- memory management. And that's a killer for those of us who have to manipulate/display large data files.

My (rather vague) understanding of the problem is that LV generally wants to grab *contiguous* memory. So while you may have plenty of free memory, if it's fragmented, LV can't necessarily use it. (I'm sure someone more knowledgeable will jump in here if I'm wrong...)

My workaround for this is to read the available physical memory, subtract about 750 MB for what I label "LV inefficiencies", and divide the remainder by a "memory loss" factor. Trial and error has left me with a memory-loss value of 4.

(Available memory - program usage) / memory loss = #bytes of hopefully contiguous memory out there somewhere

This is a kludge; it isn't really determining what I need to know (available contiguous memory), and I hate it. But it works, most of the time. If anyone has a better way to do this, I would love to hear it.
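That rule of thumb is plain arithmetic; as a sketch (the 750 MB offset and the divisor of 4 are the trial-and-error values quoted above, not universal constants):

```python
def safe_array_bytes(available_bytes,
                     program_usage=750 * 1024**2,  # "LV inefficiencies" estimate
                     memory_loss=4):               # empirical fudge factor
    """(available memory - program usage) / memory loss, floored at 0."""
    return max(0, available_bytes - program_usage) // memory_loss
```

With 2 GB of available physical memory this suggests budgeting roughly 325 MB for the array.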


And how do you read "available memory" and "program usage"?

Program usage I get from eyeballing how much memory just running the program takes up in Task Manager. Then I throw in a couple hundred MB, because LV has a bad habit of grabbing memory and not letting it go (even if you use the "anxious deallocation" function). This should all be a fairly constant value, depending on your program; for my program it's about 750 MB. Like I said, this is Bad Programming, and not how I would like to be doing it if there's a better solution out there.

I have a set of Windows API llbs that I've had so long that I don't remember where they came from. Available Physical Memory is in winsys.llb\System Information.vi

Windows API.zip
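If you'd rather not depend on an old llb, the same number is exposed by the Win32 call GlobalMemoryStatusEx (its ullAvailPhys field). A minimal ctypes sketch, which simply returns None when not running on Windows:

```python
import ctypes
import sys

class MEMORYSTATUSEX(ctypes.Structure):
    """Mirrors the Win32 MEMORYSTATUSEX structure."""
    _fields_ = [
        ("dwLength", ctypes.c_uint32),
        ("dwMemoryLoad", ctypes.c_uint32),
        ("ullTotalPhys", ctypes.c_uint64),
        ("ullAvailPhys", ctypes.c_uint64),
        ("ullTotalPageFile", ctypes.c_uint64),
        ("ullAvailPageFile", ctypes.c_uint64),
        ("ullTotalVirtual", ctypes.c_uint64),
        ("ullAvailVirtual", ctypes.c_uint64),
        ("ullAvailExtendedVirtual", ctypes.c_uint64),
    ]

def available_physical_memory():
    """Return available physical memory in bytes, or None off Windows."""
    if sys.platform != "win32":
        return None
    status = MEMORYSTATUSEX()
    status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
    ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))
    return status.ullAvailPhys
```

In LabVIEW the equivalent would be a Call Library Function Node pointing at kernel32.dll.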


My (rather vague) understanding of the problem is that LV generally wants to grab *contiguous* memory. So while you may have plenty of free memory, if it's fragmented, LV can't necessarily use it. (I'm sure someone more knowledgeable will jump in here if I'm wrong...)

Cat is correct. Arrays require contiguous blocks of memory.

Other languages have constructs called lists that let you hold large data sets in non-contiguous memory, but LabVIEW has no such construct built natively into the language. These lists, however, are not arrays, and do not allow efficient random access to your data.

Do you need to load large amounts at once? Standard practice is to read the file in small chunks and iterate through it, running whatever state calculations you need as you go. Or do you need random access to the data? In that case, properly defining your data structure allows random access to any element on disk with a little bit of math; alternatively, you might be able to "map" your file out ahead of time to allow random access to variable-length data.
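The "little bit of math" for fixed-size records is just offset = index × record size. A Python sketch of both access patterns, using a hypothetical record layout (one double plus one int32; your cluster layout will differ):

```python
import struct

RECORD_FMT = "<di"                      # hypothetical record: float64 + int32
RECORD_SIZE = struct.calcsize(RECORD_FMT)

def read_record(path, index):
    """Random access: seek straight to record `index` and read only it."""
    with open(path, "rb") as f:
        f.seek(index * RECORD_SIZE)
        return struct.unpack(RECORD_FMT, f.read(RECORD_SIZE))

def iter_chunks(path, records_per_chunk=1024):
    """Streaming access: yield the file in small fixed-size chunks."""
    with open(path, "rb") as f:
        while True:
            data = f.read(records_per_chunk * RECORD_SIZE)
            if not data:
                break
            yield data
```

In LabVIEW the same idea maps to "Set File Position" followed by "Read from Binary File" with a small count, so the whole array never has to fit in one contiguous block.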

