In an application of mine I ran into an issue where a queue ran full, so I want to start a little discussion about which data structure scales best when the goal is to use RAM to store temporary data (a "buffer"). This is for both Desktop and Real-Time targets (PXI, cRIO etc.).
My particular application is a producer-consumer type design, where the consumer can fall behind for long stretches but will always catch up eventually - the resource that supports this is of course enough RAM, which in this case is a valid requirement for the system. The issue presents itself in many forms of course, so this might be of interest to a lot of you here.
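Outside of LabVIEW the pattern looks like this - a minimal Python sketch (the item counts and sleep time are made up for illustration) of a producer-consumer pair where the consumer lags behind but, given enough time and enough RAM for the backlog, always drains the buffer:

```python
import queue
import threading
import time

def producer(buf, n_items):
    # Producer bursts data into the buffer much faster than
    # the consumer takes it out.
    for i in range(n_items):
        buf.put(i)          # unbounded queue: limited only by RAM
    buf.put(None)           # sentinel: tell the consumer we're done

def consumer(buf, results):
    # Consumer is slow, but it always catches up eventually.
    while True:
        item = buf.get()
        if item is None:
            break
        results.append(item)
        time.sleep(0.0001)  # simulate slow processing

buf = queue.Queue()         # no maxsize -> grows until allocation fails
results = []
p = threading.Thread(target=producer, args=(buf, 100))
c = threading.Thread(target=consumer, args=(buf, 100 * [0] and results))
p.start(); c.start()
p.join(); c.join()
print(len(results))         # all 100 items eventually consumed
```

The catch, as described below, is the "limited only by RAM" part: nothing in this design warns you before the allocation fails.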
My application unexpectedly gave me a memory allocation error (error code 2 on the error out terminal of the Enqueue prim, followed by a pop-up dialog stating the same) after filling about 150 MB of data into the queue (which took only a couple of seconds). This system was 32-bit Windows XP running LabVIEW 2010 SP1. Other systems could be 64-bit, LV 2009-2011, or Real-Time, both VxWorks and ETS-Pharlap.
Is 150 MB really the limit of a single queue, or is it due to some other system limit - free contiguous memory, for instance? The same applies to arrays, where contiguous memory plays an important role in you running out of memory long before your 2 or 3 GB of address space is used up.
And, at least as important: how do I know beforehand that my queue is about to throw this error? An error from error out on the Enqueue prim I can handle gracefully, but the OS pop-up throws a wrench in the gears - it's modal, it can stall most other threads, and it has to be acknowledged by the user. On Real-Time it's even worse.
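One way to keep the failure in your own hands - assuming you can live with a bounded buffer - is to size the queue explicitly and treat "queue full" as an ordinary error you handle yourself, before the allocator ever gets the chance to pop up a dialog. A Python sketch of the idea (the maxsize and the drop policy are arbitrary choices for illustration, not a recommendation):

```python
import queue

# Bound the buffer explicitly instead of letting it grow until
# the OS refuses the allocation.
buf = queue.Queue(maxsize=1000)

dropped = 0
for i in range(1500):
    try:
        buf.put_nowait(i)       # analogous to Enqueue with timeout 0
    except queue.Full:
        dropped += 1            # our policy here: count and drop
        # alternatives: block with a timeout, flush oldest data,
        # spill to disk, or signal the producer to throttle

print(buf.qsize(), dropped)     # 1000 500
```

The same thing in LabVIEW would be a queue obtained with a finite max size, with the Enqueue timeout case handled explicitly - it trades the "always catches up" property for a failure mode you can actually react to in code.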
Edited by Steen Schmidt, 19 November 2011 - 10:01 AM.