Posts posted by jatinpatel1489
-
OK, here are the latest results:
This time I let the application create 6 sheets with 20 columns and 600,000 rows each (writing 2 rows at a time). Note that the values are of type DBL (same as before). Excel's memory usage built up to 1.3 GB, but everything still works perfectly for me.
My advice is to check whether your RAM is sufficient and to review your implementation of the Excel ActiveX interface. Be sure to close all open references!
Without more information about your actual system (RAM) and at least a screenshot of your implementation, it is very difficult to give more advice. If you could share a sample project that fails on your system, I could test it on mine and check for problems in the Excel interface.
Here is a screenshot of what I did (sorry, but I can't share the implementation of the Excel interface).
Yes, with 20 columns it works fine; however, I need 64 columns in each sheet. It does not take much RAM (700 to 900 MB). My system has 8 GB of RAM and Windows 7 64-bit.
If I create 6 separate workbooks it runs like a charm, but the problem is 6 sheets in one workbook.
-
I tested on my machine (LV2011SP1) with 4 GB RAM and Excel 2013 using the ActiveX interface, writing 64 columns of 600,000 random DBL numbers in a single sheet, one row at a time. It works like a charm; however, Excel now uses 650 MB of RAM.
It works like a charm for me too, up to two Excel sheets. But as soon as the third one comes into the picture, it starts giving an error. The code uses only 720 MB. There is no problem if I bring the number of values per column down to 0.45M and raise the number of sheets.
The problem is having 0.6M values per column and working with 6 sheets.
-
Now I know what a "lac" (lakh, i.e. 100,000) is. No. You need a proper database.
Thank you.
-
I'm afraid you are taxing the Excel engine too much. You say you are trying to create an Excel workbook with 6 worksheets, each with 64 columns of 6 million samples? Do the math: 6 * 64 * 6,000,000 = 2.3 billion samples, with each sample requiring on average more than 8 bytes. (Excel really needs quite a bit more than that, as it also has to store management and formatting information about the workbook, worksheets, columns and even individual cells.)
It doesn't matter whether you do it one sample, one column or one worksheet at a time. The Excel engine will have to at least load references to the data each time you append new data. With your numbers it is clear that you are creating a worksheet that will never fit in any current computer system. Aside from that, Excel up to Office 2003 had a serious limitation: no more than 65,536 rows and 256 columns per worksheet. This was increased to 1,048,576 rows by 16,384 columns in Excel 2007 and has stayed there since.
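The estimate above (which assumes 6 million values per column) can be checked with a quick back-of-the-envelope calculation; the 8 bytes per sample is only the raw payload, a lower bound before Excel's per-cell overhead:

```python
# Rough memory estimate for the workbook as described above:
# 6 sheets x 64 columns x 6,000,000 rows, one DBL (8-byte double) per cell.
sheets, columns, rows = 6, 64, 6_000_000
samples = sheets * columns * rows          # total cell count
raw_bytes = samples * 8                    # 8 bytes per IEEE-754 double

print(f"{samples:,} samples")              # 2,304,000,000 (~2.3 billion)
print(f"{raw_bytes / 1024**3:.1f} GiB")    # 17.2 GiB of raw doubles alone
```

So even before Excel's bookkeeping, the raw numbers alone are far beyond what any 32-bit process can address.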
Here you can see the current limits for an Excel worksheet: http://office.microsoft.com/en-us/excel-help/excel-specifications-and-limits-HA103980614.aspx
You might have to rethink your strategy for structuring your data report. What you are currently trying to do is not practical at all in view of later data processing, or even just reviewing your data. Even if you could push all this data into your Excel workbook, you would be unable to open it on almost anything but the most powerful 64-bit server machines.
Thank you very much for your concern.
I actually said each column has 6 lakh (600,000) values, which makes a total of 0.23 billion samples, and I believe Excel can accommodate 600,000 rows.
I don't know exactly, but is this possible?
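With the corrected figure of 600,000 rows per column, the same arithmetic gives a much smaller, but still substantial, total:

```python
# Corrected estimate: 6 sheets x 64 columns x 600,000 rows of DBL values.
sheets, columns, rows = 6, 64, 600_000
samples = sheets * columns * rows
raw_bytes = samples * 8                    # 8 bytes per double, no overhead

print(f"{samples:,} samples")              # 230,400,000 (~0.23 billion)
print(f"{raw_bytes / 1024**2:.0f} MiB")    # 1758 MiB of raw doubles
```

Roughly 1.8 GiB of raw doubles, before Excel's per-cell overhead, already approaches the ~2 GiB address-space limit of a default 32-bit Excel process, which would be consistent with an out-of-memory error appearing partway through the workbook (around the third sheet).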
-
Is it possible you're going into an infinite (or very very large) loop? Maybe a divide by zero going into the N terminal of a for loop?
...or are you trying to get more data into that task than the buffer can hold?
Hello Crelf,
I get the same error with Excel report generation. I am dealing with a large amount of data (2 columns with 6 lakh values each), and I append 64 such columns in each of 6 sheets.
However, to work efficiently I don't write all the data at the same time; rather, I append just 2 columns at a time.
Still, during the 3rd sheet it starts giving this error.
Can you please tell me how I should deal with this?
Your help would be truly appreciated.
-
I am trying to bind an indicator using a DSTP server that gets data from a Yudian AI706M controller.
I want to get this data through the Yudian OPC server. When I do this in LabVIEW 7.1 on Windows XP, I get the Yudian OPC server as an option in the browse menu available in the indicator's properties, and it works.
But with LabVIEW 2012 and Windows 7 I am not getting this option.
I would appreciate your help on this.
Thank you.
Error -2147024882, NI System Configuration: Out of memory.
in Application Builder, Installers and code distribution
Posted · Edited by jatinpatel1489
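The error code in the thread title can be decoded as a Windows HRESULT; negative decimal LabVIEW error codes in this range are signed 32-bit values, as a quick check shows:

```python
# Decode the error code from the thread title (-2147024882).
# Negative decimal codes like this are signed 32-bit HRESULTs.
code = -2147024882
hresult = code & 0xFFFFFFFF   # reinterpret as unsigned 32-bit

print(hex(hresult))           # 0x8007000e -> E_OUTOFMEMORY
```

0x8007000E is the standard Windows E_OUTOFMEMORY HRESULT, which matches the "Out of memory" text NI reports.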
Thank you very much for making this clear.
Still, I am wondering: what if I make 6 different workbooks and then combine them all into one? For example, I open the first workbook and load the data of the second workbook into it, then close the second workbook, open the third and append its data into the first, and so on.
Is this possible using LabVIEW?
It is certain that I cannot read an entire workbook at once; copying one sheet at a time would be more feasible.
To make clear what I am intending, kindly refer to this page:
http://smallbusiness.chron.com/merge-excel-worksheets-workbook-54664.html
From step 1 to step 7.
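The merge strategy described above (keep one target workbook open, pull in one source workbook at a time, never hold everything in memory at once) can be sketched generically. This sketch models a workbook as a plain dict rather than using the real Excel ActiveX interface, so the loader function and names are illustrative assumptions, not actual LabVIEW or Excel API calls:

```python
# Illustrative sketch of the sheet-by-sheet merge strategy.
# A "workbook" here is just {sheet_name: list_of_rows}; in practice each
# open/copy/close step would go through the Excel ActiveX interface.

def load_workbook_stub(name):
    # Hypothetical loader standing in for "open workbook from disk".
    return {f"{name}_sheet1": [[1.0] * 4 for _ in range(3)]}

def merge_workbooks(target, source_names):
    for name in source_names:
        source = load_workbook_stub(name)   # open ONE source at a time
        for sheet, rows in source.items():
            target[sheet] = rows            # copy its sheets into the target
        del source                          # close/release before the next one
    return target

merged = merge_workbooks({}, ["wb2", "wb3"])
print(sorted(merged))   # ['wb2_sheet1', 'wb3_sheet1']
```

The point of the `del source` step is that only one source workbook's data is ever held alongside the target, which mirrors the open/append/close sequence proposed in the post.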