
Grouping N-D data in LabVIEW


torekp

Recommended Posts

I found one resource on clustering in LabVIEW, using k-means:

http://forums.ni.com/ni/board/message?boar...uireLogin=False

and several non-Labview free software packages:

http://glaros.dtc.umn.edu/gkhome/cluto/cluto/overview

http://www.prudsys.com/Produkte/Algorithmen/Xelopes/

http://www.shih.be/dataMiner/screenShots.html

... and I'm just wondering if anyone has any suggestions. My data has over 1k dimensions and probably 10k samples (observed ordered N-tuples) and I want to cluster into some fixed number of groups, less than 10.

For a simple 2D example of clustering, here's some made-up data clustered into 3 groups:

[Attached image post-4616-1224179598.jpg: made-up 2D data clustered into 3 groups]
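For reference, here is a rough sketch of the k-means procedure I'm after, written in Python/NumPy only because G code can't be pasted as text (the function and variable names are just illustrative, not from any particular toolkit):

import numpy as np

def kmeans(data, k, iters=100, seed=0):
    """Plain k-means: data is (n_samples, n_dims), k is the number of clusters."""
    rng = np.random.default_rng(seed)
    centers = data[rng.choice(len(data), size=k, replace=False)]
    for _ in range(iters):
        # squared Euclidean distance from every sample to every center
        d2 = ((data ** 2).sum(1)[:, None]
              - 2.0 * data @ centers.T
              + (centers ** 2).sum(1)[None, :])
        labels = d2.argmin(axis=1)
        # move each center to the mean of the samples assigned to it
        new_centers = np.array([data[labels == j].mean(axis=0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

With 10k samples by 1k dimensions that is only a 10k x 10 distance table per pass, so the main cost is the matrix multiply.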


QUOTE (torekp @ Oct 20 2008, 02:39 PM)

Thanks Anders,

That sounds like a really smart idea to save lots of computing time. Unfortunately, it involves more programming time :unsure: - I guess I'll see how bad this k-means computation is first.

If your matrix is complete, it can be done with SVD plus four or five other VIs.
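Roughly, in text form, the recipe amounts to the following (Python/NumPy used here only to spell out the math; the LabVIEW VIs would do the equivalent steps):

import numpy as np

def pca_via_svd(X, n_components):
    """X is (samples, variables); returns scores, loadings and component variances."""
    Xc = X - X.mean(axis=0)                        # center each column
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]
    loadings = Vt[:n_components].T                 # columns are the eigenvectors
    variances = s ** 2 / (len(X) - 1)              # eigenvalues of the covariance matrix
    return scores, loadings, variances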


QUOTE (Anders Björk @ Oct 20 2008, 03:16 PM)

Can you explain that more fully?

Here is my attempt to follow someone else's recipe for PCA (pp. 52-53). I made up some data, and the resulting factor weights (eigenvectors) and eigenvalues SEEM reasonable, but what do I know. (Not much.)

[Attached image post-4616-1224679945.png: block diagram of the PCA attempt via the covariance matrix and its eigenvalues]
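In case it helps anyone read the diagram, the steps I followed boil down to something like this (NumPy pseudocode standing in for the VIs; the names are mine):

import numpy as np

def pca_via_eig(X):
    """Center the data, form the covariance matrix, then take its eigen-decomposition."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)              # (variables x variables)
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigh because the covariance is symmetric
    return eigvals, eigvecs                     # columns of eigvecs are the factor weights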


And here is my attempt to follow this recipe (thanks, Los Alamos!) using SVD. I changed from normalizing my data matrix to merely centering it, but I changed my earlier VI similarly and the results agree. Hooray! Does this mean I actually did this right?

[Attached image post-4616-1224684045.png: block diagram of the SVD-based PCA attempt]

According to the website, Vector S^2 is proportional to the variances of the principal components, so I'm taking that as a measure of how important each Component is.
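A quick numerical sanity check of that claim, on made-up data (NumPy again, just to check the math, not the VIs themselves):

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
Xc = X - X.mean(axis=0)

# route 1: eigenvalues of the covariance matrix, largest first
eigvals = np.sort(np.linalg.eigvalsh(np.cov(Xc, rowvar=False)))[::-1]

# route 2: singular values of the centered data matrix
s = np.linalg.svd(Xc, compute_uv=False)

print(np.allclose(eigvals, s ** 2 / (len(X) - 1)))   # True: S^2/(n-1) gives the variances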


QUOTE (torekp @ Oct 22 2008, 02:58 PM)

Can you explain that more fully?

Here is my attempt to follow someone else's recipe for PCA (pp. 52-53). I made up some data, and the resulting factor weights (eigenvectors) and eigenvalues SEEM reasonable, but what do I know. (Not much.)

[Attached image post-4616-1224679945.png: block diagram of the PCA attempt via the covariance matrix and its eigenvalues]

Adding the vectors to a matrix is commonly done to enable sorting of the eigenvectors in order from largest to smallest eigenvalue, which is standard in PCA.
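In text form the idea is just this (NumPy only to illustrate; the toy covariance matrix is made up):

import numpy as np

rng = np.random.default_rng(0)
cov = np.cov(rng.normal(size=(100, 4)), rowvar=False)    # toy covariance matrix

eigvals, eigvecs = np.linalg.eigh(cov)        # columns of eigvecs are the eigenvectors
order = np.argsort(eigvals)[::-1]             # largest eigenvalue first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]     # keep vectors paired with values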

QUOTE (torekp @ Oct 22 2008, 03:59 PM)

(thanks, Los Alamos!) using SVD. I changed from normalizing my data matrix to merely centering it, but I changed my earlier VI similarly and the results agree. Hooray! Does this mean I actually did this right?

[Attached image post-4616-1224684045.png: block diagram of the SVD-based PCA attempt]

According to the website, Vector S^2 is proportional to the variances of the principal components, so I'm taking that as a measure of how important each Component is.

A PCA model is usually written like this: X = TP' + E

where T = S * (what you currently call scores), i.e. multiplying score vector 1 by S1, score vector 2 by S2, and so on.

Now your PCA model is factored into two matrices instead of three.
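In other words, roughly this (NumPy with my own variable names, on made-up data):

import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 6))
Xc = X - X.mean(axis=0)

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

T = U * s                 # scores: each column of U scaled by its singular value
P = Vt.T                  # loadings P (the eigenvectors)

k = 2
E = Xc - T[:, :k] @ P[:, :k].T        # residual E after keeping k components
print(np.allclose(Xc, T @ P.T))       # True: the centered X equals T P' at full rank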

Sorry for the late answer, I had a bit too much work for a while.

  • 4 weeks later...

QUOTE (GregSands @ Oct 27 2008, 11:24 PM)

Wow! I haven't tried yet, but I imagine so.

It figures. Every time I reinvent the wheel, either NI publishes the same thing a little later, or I discover it in OpenG or something.

Usually a lot more robust & versatile than my version.


QUOTE (torekp @ Nov 20 2008, 02:48 PM)

...

It figures. Every time I reinvent the wheel, either NI publishes the same thing a little later, or I discover it in OpenG or something.

Usually a lot more robust & versatile than my version.

Ditto that!

I developed an architecture that would allow arbitrary data paths between VIs so that customers could compose a system from building blocks that I provided. I spent weeks getting it all to work correctly in BridgeVIEW 2.1 (LV 5.1). Before the ink was dry on the paper, NI released LV 6.0, which featured control references and completely invalidated all of my work. :headbang: Funny, but I actually threw away the disks with the backups of that code and enjoyed doing it. :thumbup:

Sometimes you are better off being the second person with the idea.*

Ben

*Grand unification theory is probably one of those exceptions.
