Posts posted by jatinpatel1489

  1. My bad, I had already forgotten it had to be 64 columns per sheet... I've tested again and hit the same limitation in the 32-bit environment (with Excel using up to ~1.7GB of memory according to the Task Manager). Your system is more than sufficient for that operation (unless, as rolfk correctly pointed out, you use the 32-bit environment).

     

     

    This is absolutely right. There is, however, one thing: Excel runs in a separate process and has its own 2GB of virtual address space while in 32-bit mode. So you can only create a workbook of that size if there is no memory leak in the LabVIEW application. In my case LabVIEW takes up to 500MB of memory while Excel takes 1.7GB at the same time. If the references are not closed properly, you'll hit the limit much earlier, because LabVIEW itself runs out of memory. In my case the ActiveX node throws an exception without anything running out of memory (well, Excel is out of memory, but the application would not fail if I handled the error).
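    The 32-bit vs. 64-bit point above is easy to get wrong, so here is a minimal sketch (in Python rather than LabVIEW, purely for illustration) of how a process can report which address space it lives in:

    ```python
    import struct

    # Pointer size tells you the process's address space:
    # 4-byte pointers -> 32-bit process (roughly 2 GB of usable user
    # address space on Windows by default), 8-byte pointers -> 64-bit.
    pointer_bytes = struct.calcsize("P")
    bitness = pointer_bytes * 8
    print(f"Running as a {bitness}-bit process")
    ```

    Note that a 32-bit Excel automated from a 64-bit LabVIEW (or vice versa) is still its own process, so each side hits its own limit independently.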

     

    I have never tried to connect to Excel over .NET... Worth a shot?

     

    jatinpatel1489: as already said a couple of posts ago: make use of a database to keep that amount of data.

    If the data represents a large amount of measurement data, TDMS files are the way to go (I've tested with a single group + channel, with a file size of ~7.5GB containing 1 billion values -> milliard for the European friends :shifty: ). You can import portions of the data into Excel (or use a system like NI DIAdem... I've never used it though).

    If it must be Excel, there are other ways, like linking Excel to the database (using a database connection). Last but not least, you could switch to 64-bit LabVIEW + 64-bit Excel in order to unleash the full power of your system.
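    Since LabVIEW block diagrams can't be shown as text here, a hedged sketch in Python with the standard-library sqlite3 module illustrates the database idea: write the measurements in chunks, commit per chunk so memory stays flat, and read back only the slice you actually want to show in Excel. The table layout, chunk sizes and values are illustrative, not from the original posts.

    ```python
    import os
    import sqlite3
    import tempfile

    def write_chunks(db_path, sheets=6, rows_per_chunk=1000):
        # Insert one "sheet" worth of data per transaction so only a
        # single chunk is ever held in memory at once.
        conn = sqlite3.connect(db_path)
        conn.execute(
            "CREATE TABLE IF NOT EXISTS samples"
            " (sheet INTEGER, row INTEGER, value REAL)")
        for sheet in range(sheets):
            rows = [(sheet, r, float(sheet * rows_per_chunk + r))
                    for r in range(rows_per_chunk)]
            conn.executemany("INSERT INTO samples VALUES (?, ?, ?)", rows)
            conn.commit()
        conn.close()

    def read_portion(db_path, sheet, limit=5):
        # Pull back only a slice, e.g. to paste a portion into Excel.
        conn = sqlite3.connect(db_path)
        rows = conn.execute(
            "SELECT row, value FROM samples WHERE sheet = ? "
            "ORDER BY row LIMIT ?", (sheet, limit)).fetchall()
        conn.close()
        return rows

    db = os.path.join(tempfile.mkdtemp(), "samples.db")
    write_chunks(db)
    print(read_portion(db, sheet=2, limit=3))
    # -> [(0, 2000.0), (1, 2001.0), (2, 2002.0)]
    ```

    In LabVIEW the same pattern maps to the Database Connectivity Toolkit: one INSERT per chunk inside the loop, and a parameterized SELECT when a portion is needed.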

    Thank you very much for making this clear. :)

     

    Still, I am wondering: what if I make 6 different workbooks and then combine them all into one? Like, I open the first workbook and load the data of the second workbook into the first one. Then I close the second workbook, open the third and append its data into the first, and so on.

    Is this possible using LabVIEW?

    It is certain that I cannot read the entire workbook at once; rather, copying one sheet into another would be feasible.

     

    To make clear what I am intending, kindly refer to this page:

    http://smallbusiness.chron.com/merge-excel-worksheets-workbook-54664.html

    From step 1 to step 7.
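    The "open one source workbook at a time, append it into a master, then close it" idea can be sketched outside LabVIEW too. This is an illustrative Python sketch assuming the third-party openpyxl package; in LabVIEW the same loop maps to ActiveX calls (Workbooks.Open, Worksheet.Copy, Workbook.Close), which keeps only one source workbook in memory at a time.

    ```python
    from openpyxl import Workbook, load_workbook

    def merge_workbooks(source_paths, target_path):
        # Build one master workbook from several single-sheet sources,
        # loading and releasing one source at a time.
        master = Workbook()
        master.remove(master.active)          # drop the default empty sheet
        for i, path in enumerate(source_paths, start=1):
            src = load_workbook(path, read_only=True)
            dst = master.create_sheet(f"Sheet{i}")
            for row in src.active.iter_rows(values_only=True):
                dst.append(list(row))
            src.close()                       # release before opening the next
        master.save(target_path)
    ```

    Whether this sidesteps the memory limit depends on where the limit is hit: the master workbook still ends up holding all six sheets, so a 32-bit Excel would face the same total size at save time.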

  2. OK, here are the latest results:

    This time I let the application create 6 sheets with 20 columns and 600,000 rows each (writing 2 rows at a time). Notice that the values are of type DBL (same as before). Memory for Excel built up to 1.3 GB, but everything still works perfectly for me.

     

    My advice for you is to check if your RAM is sufficient, and review the implementation of the Excel ActiveX Interface. Be sure to close any open references!

    Without more information about your actual system (RAM) and at least a screenshot of your implementation, it is very difficult to give any more advice. If you could share a sample project that fails on your system, I could test it on mine and check for problems in the Excel Interface.

     

    Here is a screenshot of what I did (sorry, but I can't share the implementation of the Excel Interface)

    [Attachment: WriteExcelSample.png]

    Yeah, with 20 columns it does work fine; however, I need 64 columns in each sheet. It does not take much of my RAM (700 to 900 MB). My system specification is 8 GB RAM and Windows 7 64-bit OS.

     

    Now, if I create 6 workbooks it runs like a charm, but the problem is 6 sheets in one workbook.

  3. I tested on my machine (LV2011SP1) with 4GB RAM and Excel 2013 using the ActiveX Interface, writing 64 columns with 600,000 random DBL numbers in a single sheet, one row at a time. It works like a charm; however, Excel now uses 650MB of RAM.

    It works like a charm for me too, up to two Excel sheets. But as soon as the third comes into the picture, it starts giving an error. The code uses only 720MB. It does not have any problem if I bring the number of values per column down to 0.45M and raise the number of sheets.

     

    The problem is having 0.6M values in a column while working with 6 sheets.

  4. I'm afraid you tax the Excel engine too much. So you say you try to create an Excel workbook which has 6 worksheets with 64 columns, each with 6 million samples? Do the math: 6 * 64 * 6,000,000 = 2.3 billion samples, with each sample requiring on average more than 8 bytes. (Excel really needs quite a bit more than that, as it also has to store management and formatting information about the workbook, worksheets, columns and even cells.)

     

    It doesn't matter if you do it one sample or one column or one worksheet at a time. The Excel engine will have to at least load references to the data each time you append new data to it. With your numbers it is clear that you would create an Excel worksheet that will never fit in any current computer system. Aside from that, Excel in Office 2003 had a serious limitation that did not allow more than 64k rows and 256 columns. This was increased to 1,048,576 rows by 16,384 columns in Excel 2007 and has stayed there since.

     

    Here you can see the current limits for an Excel worksheet: http://office.microsoft.com/en-us/excel-help/excel-specifications-and-limits-HA103980614.aspx

     

    You might have to rethink your strategy about how you structure your data report. What you currently try to do is not practical at all in view of later data processing, or even just review of your data. Even if you could push all this data into your Excel workbook, you would be unable to open it on almost anything but the most powerful 64-bit server machines.

    Thank you very much for your concern, 

     

    I actually said each column has 6 lakhs (600,000) values, which makes 0.23 billion samples in total, and I believe Excel can accommodate 600,000 rows.

     

    I don't know exactly, but is this possible?
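    The disagreement above is purely arithmetic and worth checking: rolfk read "6 million samples per column", while the intended figure is 600,000 rows. A quick pure-Python check, assuming 8 bytes per raw DBL value and decimal GB, and ignoring Excel's per-cell overhead:

    ```python
    BYTES_PER_DBL = 8

    def raw_size_gb(sheets, columns, rows):
        # Raw data volume only; Excel stores extra per-cell metadata on top.
        return sheets * columns * rows * BYTES_PER_DBL / 1e9

    # rolfk's reading: 6 million rows -> ~2.3 billion samples
    samples_big = 6 * 64 * 6_000_000
    print(samples_big, round(raw_size_gb(6, 64, 6_000_000), 1))
    # -> 2304000000 18.4

    # intended case: 600,000 rows -> ~230 million samples (~0.23 billion)
    samples_small = 6 * 64 * 600_000
    print(samples_small, round(raw_size_gb(6, 64, 600_000), 2))
    # -> 230400000 1.84
    ```

    So the corrected workload is ~1.8 GB of raw doubles, already brushing the 2 GB address space of a 32-bit Excel, which is consistent with the failures on the third sheet reported in this thread.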

  5. Is it possible you're going into an infinite (or very very large) loop? Maybe a divide by zero going into the N terminal of a for loop?

    ...or you're trying to get more data into that task than the buffer can hold?

    Hello Crelf,

     

    I do get the same error with Excel report generation. I deal with a large amount of data (2 columns with 6 lakhs of values in each). I append 64 such columns in 6 sheets.

    However, to work efficiently, I don't write all the data at the same time; rather, I append just 2 columns at a time.

     

    Though during the 3rd sheet it starts giving this error.

     

    Can you please tell me how I should deal with this?

     

    Your help will be truly appreciated. 

  6. I am trying to bind an Indicator using a DSTP server that gets data from a Yudian AI706M controller.
    I want to get this data through the Yudian OPC server.

    When I do this in LabVIEW 7.1 on Windows XP, I get the option of the Yudian OPC server in the browse menu available from the Indicator's properties, and it works,

    but with LabVIEW 2012 and Windows 7 I am not getting this option.

     

    I would appreciate your help on this.

     

    Thank You
