
Error -2147024882, NI System Configuration: Out of memory.


Recommended Posts

Hello there,

Well, this is strange. I have a program that runs fine in dev mode and has always run fine as a built executable as well. Now I'm getting this strange error which I've not seen before. (The location of the error is valid, one of my VIs.) I don't suppose anyone else has come across it? I'm in LV2011.

A long shot but it'd be great if anyone can help,

Thanks,

Martin

(screenshot attached: post-13935-0-78756200-1314804619.png)

I take that back, I can generate the error in dev mode. Should make life easier. I'll post back once I figure out the issue.

Link to comment
  • 2 years later...
Is it possible you're going into an infinite (or very very large) loop? Maybe a divide by zero going into the N terminal of a for loop?

...or you're trying to get more data into that task than the buffer can hold?

Hello Crelf,

 

I get the same error with Excel report generation. I deal with a large amount of data (2 columns with 6 lacs of values in each). I append 64 such columns in 6 sheets.

However, to work efficiently I don't write all the data at the same time; rather, I append just 2 columns at a time.

 

However, during the 3rd sheet it starts giving this error.

 

Can you please tell me how I should deal with this?

 

Your help will be truly appreciated. 

Link to comment

 

I'm afraid you are taxing the Excel engine too much. So you say you are trying to create an Excel workbook which has 6 worksheets, each with 64 columns of 6 million samples? Do the math: 6 * 64 * 6,000,000 = 2.3 billion samples, with each sample requiring on average more than 8 bytes. (Excel really needs quite a bit more than that, as it also has to store management and formatting information about the workbook, worksheets, columns and even cells.)

 

It doesn't matter if you do it one sample, one column or one worksheet at a time. The Excel engine will have to at least load references to the data each time you append new data to it. With your numbers it is clear that you are creating an Excel workbook that will never fit in any current computer system. That is aside from the fact that Excel in Office 2003 had a serious limitation that did not allow more than 64k rows and 256 columns. This was increased to 1,048,576 rows by 16,384 columns in Excel 2007 and has stayed there since.

 

Here you can see the current limits for an Excel worksheet: http://office.microsoft.com/en-us/excel-help/excel-specifications-and-limits-HA103980614.aspx

 

You might have to rethink your strategy for how you structure your data report. What you are currently trying to do is not practical at all in view of later data processing, or even just reviewing your data. Even if you could push all this data into your Excel workbook, you would be unable to open it on anything but the most powerful 64-bit server machines.
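
To make the back-of-the-envelope estimate above concrete, here is a quick sketch using the figures as read in this post (i.e. 6 million samples per column), counting only the raw 8-byte doubles, before any Excel overhead:

```python
# Rough size of the raw samples alone, per the numbers quoted above
# (6 worksheets x 64 columns x 6,000,000 rows of 8-byte doubles).
# Excel's per-cell, per-column and per-workbook overhead comes on top of this.
sheets, cols, rows, bytes_per_double = 6, 64, 6_000_000, 8
samples = sheets * cols * rows                 # 2,304,000,000 samples (~2.3 billion)
raw_gb = samples * bytes_per_double / 1e9      # ~18.4 GB before any overhead
print(f"{samples:,} samples, ~{raw_gb:.1f} GB of raw doubles")
```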

  • Like 2
Link to comment

Thank you very much for your concern, 

 

I actually said each column has 6 lacs (600,000) data points, which makes 0.23 billion samples in total, and I believe Excel can accommodate 600,000 rows.

 

I don't know exactly, but is this possible?

Link to comment

LavaG is kidding me right now...

Again:

I would recommend a database; however, one of my colleagues implemented something similar in the past (exporting huge datasets from a database).
You should carefully check the RAM utilization of your system (do you keep all the data in both Excel and LabVIEW?).

I tested on my machine (LV2011SP1) with 4 GB RAM and Excel 2013 using the ActiveX interface, writing 64 columns with 600,000 random DBL numbers in a single sheet, one row at a time. It works like a charm; however, Excel now uses 650 MB of RAM.

EDIT: Just checked what would happen if I build all the data in LabVIEW first and then export to Excel: LabVIEW now takes ~1.6 GB of memory and throws an error: LabVIEW memory is full.

The issue is due to the 32-bit environment, where a single process can only handle up to 2 GB of virtual address space, see: http://msdn.microsoft.com/en-us/library/windows/desktop/aa366912(v=vs.85).aspx

 

The issue is also caused by my way of memory management (reallocating memory each iteration!). Just checked again: I tried to copy all the data at once, therefore trying to copy a contiguous block of 64 x 600,000 array elements (which might not be possible if there is no free block large enough to contain this amount of data)... Anyway, the issue is clearly not Excel, but the implementation in LabVIEW (in my case).
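
For anyone wanting to reproduce this experiment outside LabVIEW, here is a rough sketch of the same idea driving Excel's COM/ActiveX interface from Python with pywin32. The row/column counts mirror the test above; the chunk size and output path are arbitrary choices for illustration, not anything from the thread:

```python
# Sketch: write a 600,000 x 64 sheet of doubles via Excel COM automation,
# one chunk of rows per Range.Value assignment instead of one cell at a time,
# and without growing a single huge array every iteration.
import random
import win32com.client as com

ROWS, COLS, CHUNK = 600_000, 64, 10_000   # 10,000 rows per COM call (arbitrary)

excel = com.Dispatch("Excel.Application")
excel.Visible = False
excel.DisplayAlerts = False
wb = excel.Workbooks.Add()
ws = wb.Worksheets(1)

try:
    for start in range(0, ROWS, CHUNK):
        n = min(CHUNK, ROWS - start)
        # Build one chunk in memory (a tuple of row tuples) per iteration.
        chunk = tuple(
            tuple(random.random() for _ in range(COLS)) for _ in range(n)
        )
        # One Range.Value assignment per chunk keeps the number of
        # cross-process COM calls (and data copies) small.
        ws.Range(ws.Cells(start + 1, 1), ws.Cells(start + n, COLS)).Value = chunk
    wb.SaveAs(r"C:\temp\big_sheet.xlsx")   # hypothetical output path
finally:
    wb.Close(SaveChanges=False)
    excel.Quit()
```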

Edited by LogMAN
Link to comment

 

According to this site a Lakh is 100,000 and a Lac is 10 times more. Maybe there are different Lacs in different areas of India. I personally find it complicated enough to distinguish between an American billion and a European billion; I don't think I'm going to memorize the Indian large-number names that easily, especially if it turns out that they are not the same all over India. :lol:

 

For the rest, LogMAN more or less gave you the details about memory management. Since your report generation interfaces to the Excel engine through ActiveX, it is really executing inside the LabVIEW process and has to share its memory with LabVIEW. As LogMAN showed, one worksheet with your 64 * 600,000 values uses up 650 MB of RAM; 3 worksheets will already require ~1.8 GB of RAM just for the Excel workbook. That leaves nothing for LabVIEW itself on a 32-bit platform, and it is still very inefficient and problematic even on 64-bit LabVIEW.

  • Like 1
Link to comment


It works like a charm for me too up to two Excel sheets. But as soon as the third comes into the picture it starts giving an error. The code uses only about 720 MB. There is no problem if I bring the number of data points per column down to 0.45M and raise the number of sheets.

 

The problem is having 0.6M data points in a column and working with 6 sheets.

Link to comment
Yep, we use million, milliard, billion, billiard, trillion, trilliard, and so on. It seemed very logical and universal to me until I went to the States! :D

Thousand, million, billion, trillion, quadrillion, quintillion. Each comma is a new unit, one for every 10^3. It always seemed logical to me, but I've always lived in the US.

 

More info on Wikipedia.

Link to comment

 

Good point!

I'll try that again tomorrow with multiple sheets...

 

Does it work better if you save the workbook after each sheet?

Link to comment

OK, here are the latest results:

This time I let the application create 6 sheets with 20 columns and 600,000 rows each (writing 2 rows at a time). Notice that the values are of type DBL (same as before). Memory for Excel builds up to 1.3 GB, but everything still works perfectly for me.
 
My advice is to check whether your RAM is sufficient, and to review the implementation of the Excel ActiveX interface. Be sure to close any open references!
Without more information about your actual system (RAM) and at least a screenshot of your implementation, it is very difficult to give any more advice. If you could share a sample project that fails on your system, I could test it on mine and check for problems in the Excel interface.

 

Here is a screenshot of what I did (WriteExcelSample.png); sorry, but I can't share the implementation of the Excel interface.
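
As a minimal illustration of the "close any open references" advice, here is the same kind of COM client sketched outside LabVIEW in Python/pywin32 (file path and cell value are made up); the point is only that every Excel object obtained is released even if the write fails part-way:

```python
# Sketch: always release Excel automation references, even on error.
# Leaked references keep the Excel process (and its memory) alive.
import win32com.client as com

excel = com.Dispatch("Excel.Application")
excel.Visible = False
wb = excel.Workbooks.Add()
try:
    ws = wb.Worksheets(1)
    ws.Cells(1, 1).Value = 42.0            # ...actual report writing goes here
    wb.SaveAs(r"C:\temp\report.xlsx")      # hypothetical output path
finally:
    wb.Close(SaveChanges=False)            # like closing the workbook reference in LabVIEW
    excel.Quit()                           # like closing the application reference
    ws = wb = excel = None                 # drop the COM proxies so they can be released
```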

Link to comment

Yeah, with 20 columns it does work fine; however, I need to make 64 columns in each sheet. It does not take much of my RAM (700 to 900 MB). My system specification is 8 GB RAM and Windows 7 64-bit OS.

 

Now, if I create 6 workbooks it runs like a charm, but the problem is 6 sheets in one workbook.

Link to comment

Unless you use LabVIEW for Windows 64-bit, you can have hundreds of GB of memory and your application is still limited to 2 GB of memory maximum. This is inherent to the 32-bit architecture. Due to the various components loaded by LabVIEW, the limit of usable memory is usually more like 1.3 GB. The 2 GB is the maximum usable memory, and into that must fit your application, any modules it may use, and some temporary management information for LabVIEW and Windows.
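
As a quick sanity check outside LabVIEW, a one-line sketch in Python tells you whether a process runs as 32-bit or 64-bit, and therefore whether the ~2 GB address-space ceiling applies (in LabVIEW the equivalent question is simply the bitness of the LabVIEW and Excel installs):

```python
# A 32-bit process prints 32 here and is capped at a few GB of address space
# no matter how much physical RAM the machine has; a 64-bit process prints 64.
import struct
print(struct.calcsize("P") * 8, "bit process")
```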

 

The Excel plugin being ActiveX doesn't really help here, as ActiveX in itself is quite a heavyweight technology. But don't worry, .NET wouldn't be better. :D

  • Like 1
Link to comment

 

My bad, I already forgot it had to be 64 columns per sheet... I've tested again and hit the same limitation in the 32-bit environment (with Excel using up to ~1.7 GB of memory according to the Task Manager). Your system is more than sufficient for that operation (unless, as rolfk correctly pointed out, you use the 32-bit environment).

 


 

What rolfk says is absolutely right. There is however one thing: Excel runs in a separate process and owns a separate 2 GB virtual address space while in 32-bit mode. So you can create a workbook up to that size, as long as there is no memory leak in the LabVIEW application. In my case LabVIEW takes up to 500 MB of memory, while Excel takes 1.7 GB at the same time. If the references are not closed properly, you'll hit the limit much earlier, because LabVIEW itself runs out of memory. In my case the ActiveX node throws an exception without anything else running out of memory (well, Excel is out of memory, but the application would not fail if I handled the error).

 

I have never tried to connect Excel over .NET... Worth a shot?

 

jatinpatel1489: as already said a couple of posts ago, make use of a database to keep that amount of data.

If the data represents a large amount of measurement data, TDMS files are the way to go (I've tested with a single group + channel, with a file size of ~7.5 GB containing 1 billion values -> milliard for the European friends :shifty: ). You can import portions of the data into Excel (or use a system like NI DIAdem... I've never used it though).

If it must be Excel, there are other ways, like linking Excel to the database (using a database connection). Last but not least, you could switch to 64-bit LabVIEW + 64-bit Excel in order to unleash the full power of your system.
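
As a rough illustration of the database route (SQLite used purely as a stand-in; the table and column names are made up for this sketch), the bulk data is appended in chunks and only the slice you actually want to inspect is pulled back out, e.g. for export to Excel or CSV:

```python
# Sketch: keep the bulk measurement data in a database and query out slices,
# instead of pushing everything into one Excel workbook.
import random
import sqlite3

conn = sqlite3.connect("measurements.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS samples (channel INTEGER, idx INTEGER, value REAL)"
)

# Append channel by channel; nothing requires the whole data set in memory at once.
for channel in range(64):
    chunk = [(channel, i, random.random()) for i in range(600_000)]
    conn.executemany("INSERT INTO samples VALUES (?, ?, ?)", chunk)
    conn.commit()

# Later: pull only the portion you want to look at (e.g. to export to Excel/CSV).
rows = conn.execute(
    "SELECT idx, value FROM samples WHERE channel = ? AND idx < ?", (0, 1000)
).fetchall()
conn.close()
```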

  • Like 1
Link to comment
I have never tried to connect Excel over .NET... Worth a shot?

 

Well, I'm not sure it provides more options than the ActiveX interface. And there are quite a lot of Excel-related hierarchies, such as Microsoft.Office.Interop.Excel, Microsoft.Office.Tools.Excel, etc., but only the first one seems to include publicly accessible constructors for the various interesting objects.

 

Personally I think trying to interface to Excel through .NET is mostly an exercise with little merit other than learning about .NET interfacing.

Link to comment
  • 2 weeks later...

Thank you very much for making this clear. :)

 

Still, I am wondering: what if I make 6 different workbooks and then combine them all into one? Like, I open the first workbook and load the data of the second workbook into the first one. Then close the second workbook, open the third and append its data to the first, and so on.

Is this possible using LabVIEW?

It is certain that I cannot read an entire workbook at once; rather, copying one sheet into another would be feasible.

 

To make clear what I am intending, kindly refer to this page:

http://smallbusiness.chron.com/merge-excel-worksheets-workbook-54664.html

From Step 1 to Step 7.

Edited by jatinpatel1489
Link to comment

Your link explains a way to merge multiple sheets into a single workbook. You'll end up with a single workbook containing all the data, same as before. The fact that you copy the pieces differently does not change the fact that all the data will be loaded into memory each time Excel loads the workbook (this is just how Excel works). Since the amount of data is the same, the problem will remain.

You'll always experience the same limitations no matter how you build your workbook.

Edited by LogMAN
Link to comment

 

In other words, to add to what LogMAN said: if you have a bag that is too small to hold 6 apples each in their own little baggy, then even if you try to put in 1 apple at a time without the baggy, you still won't be able to fit all 6 apples!

  • Like 1
Link to comment
