
May I be excused? My brain is full (out of memory)


torekp


Sometimes I like to push my luck and read in a really big file full of data, resulting in "Not enough memory to complete this operation" and "LabVIEW: Memory is full. Blahblah.vi was stopped ... yada yada". If I look at Windows Task Manager on the Performance tab, memory is taxed but not really overloaded; I guess LabVIEW or Windows or somebody is being conservative about it. Not that there's anything wrong with that.

My program runs on different computers which may have different amounts of memory.

So, how do I tell in advance, how much memory I can suck up with LV before this would happen? Let's just stipulate that other processes aren't using much memory. Anyway, I can always consult that thread about CPU usage, to get ideas on how much page file bytes are being used; but that's not what I want to know right now. I know approximately where the top of my head is, it's the location of the ceiling that escapes me.

Oh, and if you haven't seen the Far Side cartoon referenced in the title - or seen it lately - here you go.


QUOTE (torekp @ Oct 10 2008, 09:09 AM)

The first time I read that quote from Arthur Conan Doyle about memory being like an attic, it was in "VMS Internals and Data Structures version 4" (hmmm... chapter 14? Memory management).

Predicting memory requirements is rough. What is easier is to characterize the code's requirements by testing it with different file sizes. But you already know this, I'm sure.

I did notice something when working with LV 8.6 over the last couple of days. A clone of a template VI running top-level can return that error (memory full) while the rest of the VIs keep running (provided they aren't looking for memory at the same time). So... you could use a template to "test" whether you can read the file. If the template survives, it's safe to read the file.

I am very interested in what others have to say about predicting memory usage.

Ben


QUOTE (torekp @ Oct 10 2008, 08:09 AM)

Sometimes I like to push my luck and read in a really big file full of data, resulting in "Not enough memory to complete this operation" and "LabVIEW: Memory is full. Blahblah.vi was stopped ... yada yada". If I look at Windows Task Manager on the Performance tab, memory is taxed but not really overloaded; I guess LabVIEW or Windows or somebody is being conservative about it. Not that there's anything wrong with that.

My program runs on different computers which may have different amounts of memory.

So, how do I tell in advance, how much memory I can suck up with LV before this would happen? Let's just stipulate that other processes aren't using much memory. Anyway, I can always consult that thread about CPU usage, to get ideas on how much page file bytes are being used; but that's not what I want to know right now. I know approximately where the top of my head is, it's the location of the ceiling that escapes me.

If you're reading file contents into one giant array or string, the issue you have to deal with is not how much memory is left, or how much physical memory your specific computer has. The one important factor is how much contiguous memory LabVIEW's process has out of its 2GB allotment. Like most other programming environments, an array in LabVIEW can't be split up into different pieces in memory. So it's very possible that although LabVIEW is only using 500MB of its total 2GB, there isn't any one place in that memory that's more than, say, 350MB wide. If so, LabVIEW can't allocate a 350MB array, even though it has plenty of total memory left.

The lesson here is that it is not very useful to know the total amount of memory left, and very difficult to determine how much of that remaining memory is in any one place. Instead, it's much better to architect your application to not use one big array, but multiple separate smaller arrays.

Try reading your file in 100,000-element chunks instead of all at once. You might then be able to store all the separate arrays in one queue for easy, efficient access later on. Or you can statically have multiple shift registers on a while loop for the various parts. It will be much, much less likely that you'll ever get out-of-memory messages.
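Since LabVIEW code is graphical, here is the chunking idea sketched in Python instead: each chunk is its own independent allocation (the list of chunks plays the role of the queue of arrays), so no single allocation ever has to be huge. The 100,000-element chunk size and 8-byte element size are just the numbers from the suggestion above, not anything mandated.

```python
CHUNK_ELEMENTS = 100_000   # elements per chunk, per the suggestion above
ELEMENT_SIZE = 8           # assuming 8-byte (double-precision) elements

def read_in_chunks(path):
    """Read a binary file as a list of separate chunks instead of one
    monolithic buffer, so no single contiguous allocation is required."""
    chunks = []            # stands in for the queue of arrays
    chunk_bytes = CHUNK_ELEMENTS * ELEMENT_SIZE
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk_bytes)
            if not block:
                break
            chunks.append(block)   # each chunk is its own allocation
    return chunks
```

In LabVIEW terms, each iteration of the read loop would enqueue one array; downstream code dequeues chunks one at a time rather than indexing into a single giant array.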

Other programming languages solve this problem with data structures such as linked lists. This data structure stores data elements separately, not in one big chunk. Each element contains the relevant data and a pointer to the next data element, so you can always traverse the structure to get any desired piece of data. And since the data's not contiguous, you almost never worry about running out of memory. The downside, of course, is that it takes a lot longer to get to a specific element within the data structure (so-called random access). With an array it's easy, because you know where the array starts and how big each element is, making it trivial to "jump" to the desired element instantly.
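To make the trade-off concrete, here is a minimal singly linked list in Python. The class and method names are just illustrative; the point is that nodes are separate allocations (no contiguity needed), at the cost of O(n) traversal to reach element n, versus an array's O(1) indexing.

```python
class Node:
    """One list element: a payload plus a pointer to the next node.
    Nodes need not be anywhere near each other in memory."""
    def __init__(self, value):
        self.value = value
        self.next = None

class LinkedList:
    def __init__(self):
        self.head = None
        self.tail = None

    def append(self, value):
        """O(1): link a new node after the current tail."""
        node = Node(value)
        if self.tail is None:
            self.head = node
        else:
            self.tail.next = node
        self.tail = node

    def get(self, index):
        """O(n) 'random' access: we must walk from the head,
        following pointers, to reach the requested element."""
        node = self.head
        for _ in range(index):
            node = node.next
        return node.value
```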


Disclaimer: This is not my area of expertise. If others know better, they may very well contradict what I post here.

How much memory you can use is different from how much memory you can allocate in a single block.

The limits for LabVIEW should be the same as for any other program that runs on your operating system.

Limits for total program memory:

If your OS supports virtual memory (and most do), a program can use about as much memory as it has space on the hard disk to use for virtual memory.

Limits for a single block of memory:

How much you can reserve depends upon whether you have a 32-bit or 64-bit operating system. On a 32-bit system, the largest block you can request is capped at 2^31 bytes, or roughly 2 gig: you cannot address more than that with a signed 32-bit number, which is generally how memory addresses are represented. There may be reasons why the number is capped lower than that, depending upon how much of the OS can be swapped to virtual memory. If the OS has determined that some part of itself must always be resident in memory, then your largest block is 2 gig minus that locked-up portion.

On a 64-bit system, the address space is much larger. Much much larger. And you can request commensurately larger blocks. At the time of this writing, there is no 64-bit version of LabVIEW, but it has been highly requested by customers and NI is aware of the desire for such a version.
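The "roughly 2 gig" figure falls directly out of the width of a signed 32-bit address, as a quick arithmetic check shows:

```python
# The largest signed 32-bit value is 2**31 - 1, so a 32-bit process
# can address at most 2**31 bytes of any single block.
MAX_SIGNED_32 = 2**31 - 1
GIB = 1024**3                                  # one gibibyte in bytes

largest_block_gib = (MAX_SIGNED_32 + 1) / GIB  # 2**31 bytes expressed in GiB
```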

QUOTE (ragglefrock @ Oct 10 2008, 08:55 AM)

Just to be clear: the array (single block of memory) is the one that has the "random access" feature, and the linked list is the one that has the "serial access" feature. Linked lists can be built in LV using LV classes. You can find an example program here:

http://forums.lavag.org/post-a7270-LinkedList.zip

Detailed explanation of how it works can be found here:

http://expressionflow.com/2008/01/07/recur...ures-type-safe/


QUOTE (neB @ Oct 10 2008, 09:22 AM)

I am very interested in what others have to say about predicting memory usage.

Matlab has a function that will tell you the size of the largest 10 contiguous blocks of free memory. You cannot have a single variable that is larger than your largest contiguous memory block. I assume it must be getting this memory from some Windows DLL call (although I don't know which one). If you could duplicate these system calls in LV, you might be able to get what you need.
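I don't know which call MATLAB uses either (on Windows it could plausibly walk the address space with something like VirtualQuery). A crude, portable approximation is to binary-search the largest size that malloc will currently grant. This is only a sketch: the function names below are mine, you need `ctypes.cdll.msvcrt` instead of `CDLL(None)` on Windows, and on OSes that overcommit memory (e.g. Linux) malloc can succeed optimistically, so the answer is an upper bound there.

```python
import ctypes

# Bind malloc/free from the C runtime of the current process.
libc = ctypes.CDLL(None)   # on Windows, use ctypes.cdll.msvcrt instead
libc.malloc.restype = ctypes.c_void_p
libc.malloc.argtypes = [ctypes.c_size_t]
libc.free.argtypes = [ctypes.c_void_p]

def can_allocate(nbytes):
    """True if one contiguous block of nbytes can be reserved right now."""
    p = libc.malloc(nbytes)
    if p:
        libc.free(p)
        return True
    return False

def largest_block(hi=2**31):
    """Binary-search for the largest contiguous block malloc will grant,
    searching between 0 and hi bytes."""
    lo = 0
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if can_allocate(mid):
            lo = mid
        else:
            hi = mid - 1
    return lo
```

Note the answer is only valid at the instant it is measured; another thread or process can fragment or consume the space a moment later.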


MATLAB also has a pack command:

PACK Consolidate workspace memory.
PACK performs memory garbage collection. Extended MATLAB sessions may cause memory to become fragmented, preventing large variables from being stored. PACK is a command that saves all variables on disk, clears the memory, and then reloads the variables.

QUOTE (Gary Rubin @ Oct 10 2008, 09:07 AM)

Matlab has a function that will tell you the size of the largest 10 contiguous blocks of free memory. You cannot have a single variable that is larger than your largest contiguous memory block. I assume it must be getting this memory from some Windows DLL call (although I don't know which one). If you could duplicate these system calls in LV, you might be able to get what you need.

QUOTE (Tomi Maila @ Oct 10 2008, 01:12 PM)

Do you mean that you've silently implemented type recursion to LabVIEW that the ExpressionFlow article is about?

No. But in my opinion, your explanation is useful. See, your pictures are of something you'd like to see, but along the way, your text -- by negation -- generally explains how things work today.

  • 1 year later...
QUOTE (neB @ Oct 10 2008, 09:22 AM)

I am very interested in what others have to say about predicting memory usage.

QUOTE (Gary Rubin @ Oct 10 2008, 09:07 AM)

Matlab has a function that will tell you the size of the largest 10 contiguous blocks of free memory. You cannot have a single variable that is larger than your largest contiguous memory block. I assume it must be getting this memory from some Windows DLL call (although I don't know which one). If you could duplicate these system calls in LV, you might be able to get what you need.

Has anyone found a solution to _predicting_ 'memory is full' errors?

I know all about reducing memory usage and dividing up memory blocks, but I'd still like to know how to avoid being unceremoniously dumped from my application when it hits the limit. If I'm about to acquire a large dataset at the user's request and there isn't a large enough memory block available, I'd like to inform the user and recover.
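One pragmatic pattern (in the spirit of Ben's "sacrificial clone" idea above) is a pre-flight check: before committing to the acquisition, attempt the allocation the dataset would need, and if it fails, report to the user and carry on instead of dying. Here is a Python sketch of the shape of it; the function names are hypothetical, and in LabVIEW the equivalent would be a small VI that tries to build the buffer and traps the memory-full error.

```python
def try_reserve(nbytes):
    """Attempt to reserve the contiguous buffer a pending acquisition
    would need. Returns the buffer on success, or None so the caller
    can warn the user and recover instead of being dumped with
    'memory is full'."""
    try:
        return bytearray(nbytes)   # one contiguous allocation
    except (MemoryError, OverflowError):
        return None

def start_acquisition(nbytes):
    """Pre-flight check before a large acquisition at the user's request."""
    buf = try_reserve(nbytes)
    if buf is None:
        return ("Not enough contiguous memory for this dataset; "
                "reduce the acquisition size and try again.")
    # ... hand `buf` to the acquisition code here ...
    return "ok"
```

The check is not airtight (memory can be consumed between the check and the real use, and the safest variant keeps the reserved buffer and acquires into it), but it converts a fatal error into a recoverable one.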

