
Reds

Members
  • Posts: 48
  • Joined
  • Last visited
  • Days Won: 3

Reds last won the day on August 5, 2022.

Reds had the most liked content!

Profile Information
  • Gender: Not Telling

LabVIEW Information
  • Version: LabVIEW 2021
  • Since: 1999

Reds's Achievements

Explorer (4/14)

  • Recent badges: Conversation Starter (Rare), First Post (Rare), Collaborator (Rare), Reacting Well (Rare), Week One Done
  • Reputation: 8

  1. Thanks Greg, appreciate it! I think I need to figure out how this MASM toolkit would be installed with our EXE on customer systems. If I can integrate the MASM installer into our LabVIEW installer package, this could work for us. I'll investigate... You also make a good point that single precision is all that's necessary.
  2. Interesting, thanks Rolf! So then... I wonder whether lvanlys.dll simply isn't written to call the multi-core version of the FFT in MKL? Intel vTune Profiler provides the hard evidence (profiler screenshots attached), and if that's not a failing grade for lvanlys.dll, I don't know what is. (For calling MKL's FFT directly, see the first sketch after this list.)
  3. But if I use Resource Monitor to look at the DLLs my LabVIEW-built EXE is calling, none of the DLLs have "MKL" in the filename. It seems to me like LabVIEW is using our old friend "lvanlys.dll" to perform FFT calculations. Can anyone confirm my suspicion? (A sketch for listing a process's loaded DLLs programmatically appears after this list.)
  4. If anyone else is interested, here's some evidence from one of my machines, indicating that NI is installing various versions of the Intel MKL. They look reasonably up-to-date. So I'm still unsure why they're not taking advantage of multi-core...
  5. "Documentation is aspirational" is a great line that I'm totally stealing. 😂 I laid down the big bucks for the top-of-the-line 18-core PXIe-8881. I feel like the engineering gods are mocking me as 17 of the 18 cores are idle when I run my FFT. I mean, what is the point of an 18-core PXI CPU if NI's default math library can only use one of them? Is there any other T&M application besides math/analysis that would actually benefit from 18 cores? Maybe I need to use MatLab to access all 18 cores? The whole thing is kind of crazy if you ask me.
  6. Thanks Rolf. I was actually looking at switching to the Intel MKL (instead of LabVIEW native) in a bid to improve multi-core performance. But if LabVIEW is already using MKL, I wonder why it doesn't seem to take advantage of multiple cores for FFTs?
  7. Does anyone know which math library LabVIEW uses to do the FFT and vector math operations? Has it been updated over the years to accommodate the latest Intel CPU extensions, or has it been static over time?
  8. Yeah, I wish that were possible. The problem is that a third-party analysis application can't understand the first 100 kB of the file, and so that software incorrectly concludes that the entire remainder of the file must be corrupt.
  9. The jumbo file is recorded with a bunch of header data starting at file offset zero. This header data is not useful, and it causes a third-party analysis application to think that the recorded data is corrupt. If I can manage to delete only the header data at the beginning of the file, then the third-party analysis application can open and analyze the file without throwing any errors.
  10. Yeah, I dug into the Microsoft docs on sparse files, and I don't think that technology is going to solve my problem after all. Cool stuff, good to know, but it doesn't seem like it's going to solve my immediate pain. I guess what's really needed is a way to modify the NTFS Master File Table (MFT) to change the starting offset of a given file, but I didn't see any Win32 APIs that can do that. I'm sure it must be possible with some bit banging, but I'd probably be getting in way over my head if I tried to modify the MFT using a method that Microsoft doesn't endorse.
  11. Yes, we are indeed talking terabytes. Reading the original file and writing a new one will take many minutes, and it will also require the storage medium to have terabytes of free space available to perform the operation; maybe even a whole separate partition would need to be set aside. "Copy only the parts you want to save" is certainly the obvious solution, but it's not a good one for really big files. Thanks for the Microsoft link to sparse files. I'll dig into that and learn more (see the sparse-file sketch after this list).
  12. For sure, the biggest hazard of LabVIEW is that it permits you to easily blur the lines between "data acquisition" and "user interface", as ShaunR points out. So dangerous. I guess a SQL database is one way to draw a hard line in the sand between these two components, but I actually prefer to do it with an API (implemented as a LabVIEW Packed Library). In my opinion, the healthiest architecture decision you can make up front is that your Graphical User Interface will only be allowed to call an API (which you define) in order to configure, read, update, and delete the acquired data. If you have *any* instrument driver or communications code "above" your API, then you've violated your architectural contract. (A sketch of such an API boundary appears after this list.)
  13. Thanks for the ideas, fellas. I'll report back on my progress. I was hoping for some Win32 API that could tweak the NTFS tables to change the starting sector of a file (but I guess that would be too easy).
  14. Let's say you have a really big binary file, so big that it won't fit into your PC's RAM. Now let's say you wanted to delete the *first* 100 kB of that file and leave the rest alone. How would you do that? Can it be done quickly? Can it be done without creating a whole new file?
  15. They also say this in the annual report: "We have empowered hundreds of thousands of loyal users of LabVIEW, a unique graphical software platform optimized for engineers, and numerous other application software tools". But I don't see how that statement could possibly be supported by current facts, unless they're including everyone who has ever used LabVIEW at any time in the last 30 years.
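Several of the posts above (2, 5, 6, 7) revolve around calling the Intel MKL FFT directly instead of going through lvanlys.dll. Below is a minimal sketch, in C, of what such a wrapper could look like when called from a LabVIEW Call Library Function Node. The DFTI functions and mkl_set_num_threads/mkl_get_max_threads are documented MKL APIs; the wrapper name fft_c32_inplace, the build line, and the choice of an in-place single-precision complex transform are my assumptions, not anything NI ships.

    /* fft_mkl.c -- hypothetical wrapper that calls Intel MKL's DFTI FFT
       directly (instead of lvanlys.dll), e.g. from a LabVIEW Call Library
       Function Node. Assumed build: cl /LD fft_mkl.c mkl_rt.lib
       (mkl_rt is MKL's single dynamic runtime, threaded by default). */
    #include "mkl.h"

    /* In-place forward FFT of n single-precision complex points
       (interleaved re,im). Returns 0 on success, an MKL status otherwise. */
    __declspec(dllexport) long fft_c32_inplace(float *data, long n)
    {
        DFTI_DESCRIPTOR_HANDLE h = NULL;
        MKL_LONG status;

        /* Let MKL's threaded layer use every available core; this is the
           multi-core behavior the posts above found missing in lvanlys.dll. */
        mkl_set_num_threads(mkl_get_max_threads());

        status = DftiCreateDescriptor(&h, DFTI_SINGLE, DFTI_COMPLEX, 1,
                                      (MKL_LONG)n);
        if (status != 0) return (long)status;

        status = DftiCommitDescriptor(h);
        if (status == 0)
            status = DftiComputeForward(h, data);   /* in-place transform */

        DftiFreeDescriptor(&h);
        return (long)status;
    }

Single precision matches the point from post 1 that single precision is all that's necessary, and for large transforms the threaded MKL runtime should spread the work across cores, which is exactly what vTune showed lvanlys.dll not doing.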
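Post 3 uses Resource Monitor to inspect which DLLs the built EXE has loaded. The same check can be scripted; OpenProcess, EnumProcessModules, and GetModuleFileNameExA are documented Win32 psapi calls, while the tool name and usage here are mine.

    /* listmods.c -- print the full path of every module loaded in a process,
       a scriptable version of the Resource Monitor check in post 3.
       Assumed build: cl listmods.c psapi.lib    Usage: listmods <pid> */
    #include <windows.h>
    #include <psapi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        if (argc < 2) { fprintf(stderr, "usage: listmods <pid>\n"); return 1; }

        HANDLE proc = OpenProcess(PROCESS_QUERY_INFORMATION | PROCESS_VM_READ,
                                  FALSE, (DWORD)atoi(argv[1]));
        if (!proc) { fprintf(stderr, "OpenProcess failed (%lu)\n", GetLastError()); return 1; }

        HMODULE mods[1024];
        DWORD needed = 0;
        if (EnumProcessModules(proc, mods, sizeof(mods), &needed)) {
            for (DWORD i = 0; i < needed / sizeof(HMODULE); i++) {
                char path[MAX_PATH];
                /* Grep this output for "mkl" or "lvanlys" to settle post 3. */
                if (GetModuleFileNameExA(proc, mods[i], path, sizeof(path)))
                    printf("%s\n", path);
            }
        }
        CloseHandle(proc);
        return 0;
    }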
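For the big-file surgery discussed in posts 8 through 14: the sparse-file technique from the Microsoft docs can deallocate the first 100 kB in place, using only the documented FSCTL_SET_SPARSE and FSCTL_SET_ZERO_DATA control codes. The file path below is hypothetical.

    /* punch_header.c -- sketch of the sparse-file approach discussed above:
       mark the file sparse, then deallocate ("zero") its first 100 kB with
       FSCTL_SET_ZERO_DATA. NTFS frees the clusters, but the file keeps its
       length and offsets. */
    #include <windows.h>
    #include <winioctl.h>
    #include <stdio.h>

    int main(void)
    {
        const char *path = "D:\\capture\\jumbo.bin";   /* hypothetical file */
        HANDLE h = CreateFileA(path, GENERIC_READ | GENERIC_WRITE, 0, NULL,
                               OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
        if (h == INVALID_HANDLE_VALUE) { fprintf(stderr, "open failed\n"); return 1; }

        DWORD bytes = 0;

        /* Step 1: make the file sparse so zeroed ranges are deallocated. */
        if (!DeviceIoControl(h, FSCTL_SET_SPARSE, NULL, 0, NULL, 0, &bytes, NULL)) {
            fprintf(stderr, "FSCTL_SET_SPARSE failed: %lu\n", GetLastError());
            CloseHandle(h); return 1;
        }

        /* Step 2: punch out the first 100 kB. */
        FILE_ZERO_DATA_INFORMATION z;
        z.FileOffset.QuadPart      = 0;
        z.BeyondFinalZero.QuadPart = 100 * 1024;
        if (!DeviceIoControl(h, FSCTL_SET_ZERO_DATA, &z, sizeof(z),
                             NULL, 0, &bytes, NULL)) {
            fprintf(stderr, "FSCTL_SET_ZERO_DATA failed: %lu\n", GetLastError());
            CloseHandle(h); return 1;
        }

        CloseHandle(h);
        return 0;
    }

Note the limitation post 10 ran into: this frees the terabyte-scale disk space without a copy, but the punched range still reads back as zeros at offset 0, so a third-party reader sees 100 kB of zeros where the header was. Actually removing leading bytes on NTFS still seems to require writing a new file, just as post 10 concluded.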
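The architectural contract in post 12 is easiest to see as a concrete interface. The sketch below is a hypothetical C header standing in for the exported boundary of a LabVIEW Packed Library; every name in it is invented for illustration. The point is the shape: the GUI may call only configure/read/update/delete, and no instrument-driver or communications code ever leaks above this line.

    /* acq_api.h -- hypothetical sketch of the contract in post 12: the GUI
       layer may call ONLY these functions; all instrument-driver and
       communications code lives below this boundary, never in the UI. */
    #ifndef ACQ_API_H
    #define ACQ_API_H

    #include <stddef.h>
    #include <stdint.h>

    typedef struct acq_session acq_session;  /* opaque: UI never sees internals */

    /* Configure: set up channels, sample rate, etc. */
    int acq_configure(acq_session **out, const char *config);

    /* Read: fetch acquired samples for one channel. */
    int acq_read(acq_session *s, uint32_t channel,
                 float *buffer, size_t max_samples, size_t *got);

    /* Update: amend metadata on an existing record. */
    int acq_update(acq_session *s, const char *key, const char *value);

    /* Delete: discard a stored acquisition record. */
    int acq_delete(acq_session *s, uint64_t record_id);

    void acq_close(acq_session *s);

    #endif /* ACQ_API_H */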