
FFT analysis of large data sets in LabVIEW


hhtnwpu


Do you need an FFT that large? I ask because the figures you mention are very large for a single FFT. For a long time record we normally break the data into smaller chunks and FFT each chunk, so you can see how things change over time across the data set.
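A minimal sketch of that chunked approach, in Python/NumPy rather than LabVIEW (the block size, sample rate, and test signal here are illustrative assumptions, not values from the thread):

```python
import numpy as np

# Instead of one huge FFT, split the record into fixed-size blocks and
# FFT each block, yielding one spectrum per time slice.
fs = 1000.0              # assumed sample rate, Hz
n_total = 1_000_000      # assumed total samples in the record
block = 65_536           # assumed samples per FFT block

t = np.arange(n_total) / fs
signal = np.sin(2 * np.pi * 50.0 * t)   # hypothetical 50 Hz tone

n_blocks = n_total // block             # any remainder is discarded here
spectra = np.empty((n_blocks, block // 2 + 1))
for i in range(n_blocks):
    chunk = signal[i * block:(i + 1) * block]
    spectra[i] = np.abs(np.fft.rfft(chunk))   # magnitude spectrum per block

# spectra is now a (blocks x frequency-bins) array you can watch over time
print(spectra.shape)
```

Each row of `spectra` is one time slice, so a feature drifting in frequency shows up as the peak moving between rows; the per-block memory footprint stays at the block size rather than the full record.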

I would avoid the Express VI: at these data sizes you need to avoid the data conversions the Express VI will cause. Between the other two I'm not sure of the relative advantages. If you are doing sound-and-vibration-type analysis I would use that one, as the results should feed easily into the other functions. To avoid the toolkit licensing, though, you could use the built-in function.

There is another option, but it is yet another toolkit: it has high-performance functions that perform the FFT on the GPU or with multicore optimisation, which can improve performance if that becomes necessary (it can also perform the FFT on SGL data as opposed to DBL).


I'm also curious about this. Arrays that large can pose problems: a 10 M-element DBL array is 80 MB, and the resulting complex data type is 160 MB. Can you actually expect to reliably allocate contiguous chunks of memory larger than that, even in a 64-bit environment? I've done my share of work with large data sets and learned through experience never to pass arrays like that around. LabVIEW does not fail gracefully when it cannot allocate memory...
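A quick back-of-the-envelope check of those figures (plain Python arithmetic, not measured in LabVIEW; the element sizes are the 8-byte/4-byte IEEE-754 widths that DBL and SGL use, and a complex value is two reals):

```python
# Buffer sizes for a hypothetical 10 M-sample record, in decimal megabytes.
n = 10_000_000

dbl_input   = n * 8    # real DBL input
dbl_complex = n * 16   # complex-DBL FFT output
sgl_input   = n * 4    # real SGL input, if the FFT accepts SGL
sgl_complex = n * 8    # complex-SGL FFT output

print(dbl_input / 1e6, dbl_complex / 1e6)   # DBL: 80 MB in, 160 MB out
print(sgl_input / 1e6, sgl_complex / 1e6)   # SGL: 40 MB in, 80 MB out
```

Note that input and output buffers may need to be live at the same time, so the transient contiguous-allocation demand can exceed either figure alone; the SGL path mentioned above halves every one of these buffers.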

