Fixing the Waveform Graph


Stobber


A tongue-in-cheek name for the thread, but frankly it seems to cause more problems than it solves.

I have to display several waveforms of 1.5-3 million points each in several graphs simultaneously. I can't change this requirement. More specifically, I need to write that data, in chunks, to a System Waveform Graph in an application. It has to be a graph because of other restrictions (signals in the data will be edited in-place while other signals remain constant). A quick test VI shows that the Waveform Graph and Chart (in all their variants) choke with about 250,000 samples in the buffer. I found the LV Help article that explains the design of a max-min decimator, and I successfully wrote a single-waveform implementation that seems to work efficiently. (Code attached; the subVI is the decimator.)
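For anyone following along outside LabVIEW, the max-min idea from that Help article sketches out roughly like this in Python (the function name, bucket count, and NumPy details are mine for illustration, not taken from the attached VI):

```python
import numpy as np

def minmax_decimate(y, max_points=2000):
    """Max-min decimation: split the signal into buckets and keep each
    bucket's min and max, so peaks survive the downsampling."""
    if len(y) <= max_points:
        return np.asarray(y, dtype=float)
    n_buckets = max_points // 2
    # Trim so the array divides evenly into buckets.
    usable = (len(y) // n_buckets) * n_buckets
    buckets = np.asarray(y[:usable], dtype=float).reshape(n_buckets, -1)
    lo = buckets.min(axis=1)
    hi = buckets.max(axis=1)
    # Interleave min/max so each bucket draws as a vertical stroke.
    out = np.empty(2 * n_buckets)
    out[0::2] = lo
    out[1::2] = hi
    return out
```

The point is that a 3-million-point waveform collapses to a couple of thousand points while keeping every visible extreme, which is why the graph stops choking.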

My problem now is getting the rest of the graph to play nicely when I use the decimator to adjust the waveform being displayed. If I monitor the "Scale Range Change" event and fire a draw update notification when the x-scale changes, everything seems to work smoothly if I type in a new max value on the x scale by hand. But the graph doesn't play nicely when any of the following happens:

  • If I type a new min value on the x scale by hand, the graph scrollbar position goes out of control, and my waveform isn't in the viewable plot area.
    • If I try to drag the scrollbar back to my data, several "Scrollbar Range Change" events are fired, and the display gets totally messed up.
  • If I use any of the graph palette tools (the zooming and selection tools on the middle palette), the display gets messed up.

I wrote the decimator exactly as described in the LV Help, but I can't find any documentation on getting the graph to play nicely with a manually decimated waveform. I'm certain at least some of the people who post here have had to deal with the issue of displaying too-large data sets...how did you get the graph to play nicely in your application?

Code attached, in case you'd like to see exactly what I'm doing wrong. :)

Side question: How do you get the X scrollbar to size properly when the graph's pane resizes? Mine slides all over the pane. :(

Waveform Graph Decimation POC (LV11).zip


Suggestion: Hide the graph's inbuilt scrollbar and use a separate System Horizontal Scrollbar. Monitor the events from the Graph and the Scrollbar, and on any change redisplay the data. In fact, I would only give the graph the visible data: that which is in view, decimated if too many points are in view (no more than a couple thousand points). By keeping the number of points written to the Graph low you’ll get very fast updates, and the user won’t notice that the Scrollbar isn't an inherent part of the Graph.
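The scrollbar-to-window mapping behind this is simple enough to sketch outside LabVIEW; a minimal Python version (function and parameter names are mine, not from any API) of what each "Value Change" event would compute:

```python
def visible_window(scroll_pos, scroll_max, total_samples, samples_per_view):
    """Map an external scrollbar position (0..scroll_max) onto the index
    range of the full dataset that should currently be displayed."""
    span = total_samples - samples_per_view
    first = int(round(scroll_pos / scroll_max * span))
    first = max(0, min(first, span))  # clamp to valid range
    return first, first + samples_per_view
```

On each scrollbar event you slice that window out of the full buffer, decimate it, and write only those couple of thousand points to the graph.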

This is an application of a technique described by mje for similar large-data issues with Listboxes.

— James

As an aside, I would use my new favorite tool, SQLite (here, or here), to actually hold and serve up the data. I believe one could even delegate the decimation to SQLite via an appropriate “GROUP BY” clause in the data “SELECT” statement. I’ve used mje’s technique and SQLite in an error and data logger that can handle large log databases very quickly (and very cleanly — complicated code like your decimation function becomes single-line SQL statements). The User cannot tell that the multicolumn listbox isn’t actually listing 30,000 log entries, even as they drag the scrollbar up and down.
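To make the "GROUP BY" idea concrete, here is a small sketch using Python's sqlite3 module (the table name, column names, and bucket size are invented for the demo): bucketing by integer division of the sample index lets SQLite compute each bucket's min and max in a single query.

```python
import sqlite3

# In-memory demo; a real logger would open a file-backed database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE samples (idx INTEGER PRIMARY KEY, y REAL)")
db.executemany("INSERT INTO samples VALUES (?, ?)",
               ((i, float(i % 100)) for i in range(100_000)))

# Integer division of the index assigns each row to a bucket; SQLite's
# MIN/MAX aggregates then do the whole max-min decimation in one query.
bucket = 1000
rows = db.execute(
    "SELECT idx / ? AS b, MIN(y), MAX(y) FROM samples GROUP BY b ORDER BY b",
    (bucket,)).fetchall()
```

Changing the bucket size as the user zooms is just a different parameter to the same query, which is what makes the single-line-SQL claim plausible.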


I’ve used mje’s technique and SQLite in an error and data logger that can handle large log databases very quickly (and very cleanly — complicated code like your decimation function becomes single-line SQL statements).

Wow, that would be incredible! I need to set aside time before my next project to learn SQLite so I can use it where appropriate. "Fixing" LV UI features with native code is a massive pain in the butt and generally a waste of my clients' money.

Yup. The SQLite API For LabVIEW comes with an example of decimating a waveform in real-time with zooming.

It carries a noncommercial license. How do I get a commercial license? None of the work I do is noncommercial.


There are actually three LabVIEW interfaces to SQLite that you should look into: mine, Shaun’s, and one on the LabVIEW Tools Network by SAPHIR. I’m hoping to get mine into OpenG in the next few months.

Good grief, why are there three?! Is there no functional overlap among them?


Actually, there are more like five in total. SAPHIR's is fully commercial, mine is free for non-commercial use, and the others are free but generally have limited features and platform support (LabVIEW x64, for example). But all that is really for another thread (even though it is YOUR thread....lol).

Edited by ShaunR

Good grief, why are there three?! Is there no functional overlap among them?

You only need to use one; they all wrap the C interface. SQLite itself is public domain, and such a valuable addition to LabVIEW that I feel there really needs to be a package that is free and unrestricted.

Shaun provides a high-level LabVIEW API for interacting with the database (with VIs such as “Select” and “Create Table”), which may be of interest (though personally I think it’s better to work with straight SQL).

SAPHIR’s… well, it’s a company and you pay for it so perhaps you get better tech support.

— James

BTW, you don’t need a database to do the other parts of my original suggestion. It just makes it easier.


Back to the original problem.

Fix #1 is to hide the graph's scrollbar.

Keep in mind you are only giving the graph data for the visible range of the X scale, which makes the X scrollbar built into the graph useless. That scrollbar is designed to let the user scroll through data the graph holds that doesn't fit in the visible area, and you will never be in that situation. If you want to let the user scroll through all the available data, you will need to implement your own scrolling.

Fix #2 is to turn off "Ignore Time Stamp" on the graph.

When this option is on, the graph's X scale starts at 0 no matter what the time stamp on your data is. That means the user will not be able to tell where they are in the data. It also means that when your scale is showing 5:00 to 10:00, your data is being drawn at 0:00 to 5:00.

With those changes your VI works pretty well, except that you probably need to produce slightly more data: currently there seems to be a zero value produced in the visible area just before the scale maximum.

I also noticed that your decimation doesn't work well when the minimum is negative: the scales move, but the data is produced as if the minimum were zero. Your decimation also produces zero values when showing the area beyond the end of the data. If you can't make it simply produce fewer elements in these cases, producing NaNs is an option; the graph will not draw anything when it encounters NaN.
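The NaN-padding trick is easy to show in a textual sketch (Python here for illustration; the helper name and signature are invented, not part of the posted VI):

```python
import numpy as np

def window_with_nan(y, first, count):
    """Return `count` samples starting at `first`; positions past the end
    of the data are filled with NaN, which the graph won't draw."""
    out = np.full(count, np.nan)
    if first < len(y):
        avail = min(count, len(y) - first)
        out[:avail] = y[first:first + avail]
    return out
```

The displayed window always has the same length, so the X scale stays consistent, but nothing is plotted past the real end of the data.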

As with the scrollbar, the graph palette option to fit to all data doesn't work. You might want to implement your own button to allow the user to zoom back out to all data.

Another approach to some of these problems would be to add another plot to your graph that contains just the first and last points of your waveform, with the correct delta to put them in the right spots. Set this plot to have no line and a simple point style. This way the graph will still consider itself to have data across the entire range, but you only produce detailed data for the visible area. This doesn't solve the "ignore time stamp" problem, but it allows the scrollbar and fit-to-all-data to work.


Not directly related to your graphing woes, but NI does have a decimation library that I've used before to good effect:

http://www.ni.com/white-paper/3625/en

giga_labview.llb at the bottom is the relevant bit.

It will change the dt dynamically depending on the decimation factor applied which may help alleviate some of your trouble here.

Mike


Not directly related to your graphing woes, but NI does have a decimation library that I've used before to good effect:

http://www.ni.com/white-paper/3625/en

giga_labview.llb at the bottom is the relevant bit.

It will change the dt dynamically depending on the decimation factor applied which may help alleviate some of your trouble here.

Mike

I saw that when researching this feature, but the whitepaper says it's implemented as an FGV. I need a multi-instanceable circular buffer and a stateless decimator so they can be used on several displays simultaneously. I wrote one using a DVR and got it working nicely today; I plan to clean it up and share it when I get some breathing room.


Yeah, I wound up taking the guts of their core decimation algorithm and making it compatible with my own circular buffer implementation.

One of the neater things about it is that you can give it a start time / zoom factor when you request a decimated array for display. I may have wound up making some edits in there, though; I think the default behavior was in percentages rather than absolute numbers.

