
Jordan Kuehn

Members
  • Posts

    690
  • Joined

  • Last visited

  • Days Won

    21

Everything posted by Jordan Kuehn

  1. Well, I thought the unread posts would auto-update; now it looks like just the All Activity stream does that.
  2. One thing I really like is that some of the feeds (unread content) update automagically now.
  3. Without derailing too much, I've not found great use for the cRIO waveform library, as my applications typically aren't this straightforward (though I imagine it could be used for this application). Is there something I'm missing?
  4. Please respond to the various replies ^. A sample file would go a long way.
  5. Check the discussion here. The initial post mentions host->target FIFO, but I mistakenly started a discussion regarding target->host FIFO. Size of buffers is not addressed, but Ned is correct to bring that up as well as frequency of data transfer. You should be pulling data frequently enough to need the check I illustrated with the screenshot, otherwise you are likely overflowing the buffer.
  6. Try re-saving your Excel file as a .csv file.
  7. I haven't tried to make this work on RT, but definitely plan to. Thanks for spending some time diagnosing the existing problems.
  8. Correct, no intent to mock, sorry if it came off that way. It was redundant to message me specifically when your question was being addressed publicly, where others also gain the benefit of an answer.
  9. Also, I got a very similar question from the same person in a PM. I wonder how many PMs he/she sent out...
  10. Also include instructions on how to reproduce it. If they can't do that first, you won't get far.
  11. I don't think any cloud platform will solve the problem with having to manually join your metadata to your raw data. Perhaps generate a file to go with the data as you produce it. Also, none of these are LabVIEW specific and as you've seen free only gets you limited space. In short, to my knowledge there is nothing that will do much more than what you are currently using.
  12. If you are currently using Dropbox, why not simply save and edit your work in the Dropbox folder? Then you can share the specific folder containing your work with your professor. The same link should keep working as you update the contents of the folder. You can store your database in that folder as well.
  13. This last part doesn't seem to be working. The path control and any other test controls I drop down don't seem to be recognized. I've started working my way through your configuration code, but I thought I might check and see if there is a simple mistake I've made. Edit// Looks like some work is still to be done to get generic objects/controls to display as images. I've worked on that some and made it work. There are some other things I want to do as well. Perhaps I'll post my changes when I'm done, though don't count on me to take things over.
  14. The combined schema from drjdpowell's suggestions with the trigger I alluded to at the end. Works well enough right now while I'm playing around with it. I like the streamlined View and Inserts and simple interface with LV. Thanks for the help.

      BEGIN IMMEDIATE; -- wrap initial work in a single transaction
      DROP TABLE Buffer;
      DROP TABLE BufferIndex;
      DROP VIEW OrderedBuffer;
      DROP TABLE History;

      CREATE TABLE Buffer (Value);
      CREATE TABLE BufferIndex (I);

      CREATE TRIGGER BufferIncr BEFORE INSERT ON Buffer FOR EACH ROW
      BEGIN
        UPDATE BufferIndex SET I = ((SELECT I FROM BufferIndex) + 1) % 100;
      END;

      INSERT INTO BufferIndex (I) VALUES (-1); -- Initial index is -1

      CREATE VIEW OrderedBuffer AS
        SELECT Value FROM Buffer, BufferIndex WHERE Buffer.rowID > I
        UNION ALL
        SELECT Value FROM Buffer, BufferIndex WHERE Buffer.rowID <= I;

      CREATE TRIGGER NewValue INSTEAD OF INSERT ON OrderedBuffer FOR EACH ROW
      BEGIN
        INSERT OR REPLACE INTO Buffer (rowID, Value)
          VALUES ((SELECT I FROM BufferIndex), NEW.Value);
      END;

      CREATE TABLE History (Value);

      CREATE TRIGGER AvgBuffer AFTER INSERT ON Buffer WHEN (SELECT I FROM BufferIndex) = 0
      BEGIN
        INSERT INTO History (Value) VALUES ((SELECT avg(Value) FROM Buffer));
      END;

      COMMIT;
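For anyone wanting to try that schema outside LabVIEW, here is a minimal sketch using Python's stdlib sqlite3 module. The History/AvgBuffer part is omitted for brevity, and `PRAGMA recursive_triggers` is enabled defensively (an assumption, so the view's INSTEAD OF trigger reliably fires the Buffer trigger that advances the index):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Defensive: make sure the INSERT fired from inside the INSTEAD OF
# trigger also fires BufferIncr on the Buffer table.
conn.execute("PRAGMA recursive_triggers = ON")

conn.executescript("""
CREATE TABLE Buffer (Value);
CREATE TABLE BufferIndex (I);
CREATE TRIGGER BufferIncr BEFORE INSERT ON Buffer FOR EACH ROW
BEGIN
    UPDATE BufferIndex SET I = ((SELECT I FROM BufferIndex) + 1) % 100;
END;
INSERT INTO BufferIndex (I) VALUES (-1);  -- initial index
CREATE VIEW OrderedBuffer AS
    SELECT Value FROM Buffer, BufferIndex WHERE Buffer.rowID > I
    UNION ALL
    SELECT Value FROM Buffer, BufferIndex WHERE Buffer.rowID <= I;
CREATE TRIGGER NewValue INSTEAD OF INSERT ON OrderedBuffer FOR EACH ROW
BEGIN
    INSERT OR REPLACE INTO Buffer (rowID, Value)
        VALUES ((SELECT I FROM BufferIndex), NEW.Value);
END;
""")

# Writes go through the view; the triggers maintain the ring index.
for v in (10, 20, 30):
    conn.execute("INSERT INTO OrderedBuffer (Value) VALUES (?)", (v,))
conn.commit()

# Reads come back oldest-to-newest, already ordered by the view.
print([r[0] for r in conn.execute("SELECT Value FROM OrderedBuffer")])
```

The nice property is that the LabVIEW side only ever sees plain INSERTs and SELECTs against OrderedBuffer; all the ring bookkeeping lives in the database.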
  15. 1. I would download from NI directly or host the installer and download from there on the remote PC. Not a direct transfer.
      2. We have had no issues using the RAD for RT and FPGA updates. What did you discover?
      3. 4.
      5. This has proven useful when we migrated several dozen tools from LV 2010 to 2013. Host PCs were used as backups at times during the migration, and sometimes they were of differing versions between tool and PC. Of course, other things may break instead...
      Props for successfully making a change of this magnitude remotely. We update our tools remotely, but always after testing the procedure on a couple that are in the shop for maintenance. Not really an option for this application.
  16. That looks very promising! I'll give it a try. I thought that using a VIEW might be a good approach for retrieval. Thanks! I've used your toolkit as well and will probably use it for this implementation.
  17. ORDER BY looks like it would work if I also include a tick count or something with my data. That works for me on the retrieval side, and it's a nice aspect of the DB. On the insertion side, I suppose I could create an auto-increment variable in the DB that would modulo with my buffer size and provide the rowid to UPDATE. I also like the idea of setting up triggers to, say, average my buffer and store a point in another table each time it wraps around.
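The modulo-counter insertion idea above can be sketched in a few lines with Python's stdlib sqlite3. The counter here is kept client-side for simplicity, and the table/column names (`ring`, `ts`, `value`) and size are purely illustrative:

```python
import sqlite3

N = 8  # ring size (illustrative)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ring (ts, value)")

counter = 0  # client-side insert counter; its modulo picks the slot


def push(ts, value):
    """Overwrite the slot at counter % N (rowids 1..N wrap in place)."""
    global counter
    slot = counter % N + 1
    conn.execute(
        "INSERT OR REPLACE INTO ring (rowid, ts, value) VALUES (?, ?, ?)",
        (slot, ts, value),
    )
    counter += 1


for t in range(12):  # 12 samples wrap once past N = 8
    push(t, t * 10)

# ORDER BY the tick/timestamp column recovers chronological order,
# regardless of where the samples physically sit in the table.
rows = conn.execute("SELECT ts, value FROM ring ORDER BY ts").fetchall()
print(rows)  # eight newest samples: ts 4 through 11
```

The table never grows past N rows, and retrieval needs no pointer arithmetic because the tick column carries the ordering.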
  18. That's a fair point, and yes, it behaves more like a fixed-length FIFO that crawls along in memory rather than a ring buffer. The UPDATE command would require some bookkeeping of where the pointer(s) are on the LV side, but it does sound like the *correct* way to do this. I would like to be able to insert element(s) and read the entire buffer, already in order, in LV without doing that. I think a VIEW would perhaps be the way to go for the retrieval side; I'm not sure about the insertion. As for not using a DB, that's also a fair point, and I may never wind up using this for my stated purpose, but I think it could still be useful, especially when combined with additional tables that perform some historical logging and such. At the very least it's getting me to explore the SQLite toolkits more.
  19. In the spirit of all the DB discussion yesterday, which has coincided well with some tools I've been working on this week, I've got a question of my own. Essentially I just want to make a ring buffer stored in an SQLite database (sure, it could be any type of DB) to buffer some data acquisition for use as pre-trigger data as well as a local window. With that in mind I added a trigger to ShaunR's Data Logging Example and increased the decimation factor high enough to not decimate (didn't want to mess with the rest of the example). The trigger code is below for a buffer of 1000 entries:

      DELETE FROM graph WHERE rowid % 1000 = NEW.rowid % 1000 AND rowid != NEW.rowid;

      Now, this seems to work fine in my limited testing and doesn't blow up the size of the DB, but are there any issues with simply deleting preceding rows and allowing rowid to continue increasing? If there is an issue with this implementation, what solutions would you propose? I could create pointers and such and manually iterate around in LV when inserting, but I really want the DB to handle this and have it be transparent to my application. I have an understanding of writing to and querying DBs and use them a lot for test specs and results, dynamic field population, etc., but I haven't done much with buffering or storing acquired data, and I'm exploring how doing more work on the DB side can make things easy on the LV side.
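That DELETE can be sanity-checked quickly with Python's stdlib sqlite3. This sketch assumes the statement lives in an AFTER INSERT trigger (as the post describes) and shrinks the buffer from 1000 to 10 so the wraparound is visible; the table name `graph` is from the example above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE graph (value);
-- After each insert, delete the older row occupying the same modulo
-- slot, so at most 10 rows survive while rowid keeps increasing.
CREATE TRIGGER ring AFTER INSERT ON graph
BEGIN
    DELETE FROM graph
    WHERE rowid % 10 = NEW.rowid % 10 AND rowid != NEW.rowid;
END;
""")

for v in range(25):
    conn.execute("INSERT INTO graph (value) VALUES (?)", (v,))

count = conn.execute("SELECT COUNT(*) FROM graph").fetchone()[0]
lo, hi = conn.execute("SELECT MIN(rowid), MAX(rowid) FROM graph").fetchone()
print(count, lo, hi)  # row count stays at 10 while rowids march upward
```

One caveat worth noting: since the table is never fully emptied, SQLite keeps assigning max(rowid)+1, so rowids grow monotonically exactly as the post assumes (they would only wrap if rowid ever reached the 64-bit limit).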
  20. Excellent intro presentation. I attended one of the earlier AF sessions at NI Week in 2010 (I think) and came away with a good impression of it and its power, but an article like this would have saved me a lot of time picking up the basics. A good follow-on topic could be taking the pyramid structure of Actors shown in Part 3, digging into it a bit, and covering effective communication techniques between parts of the system. Some focus on messages and routing, perhaps.
  21. Yeah, this is an old thread, but I can't find what I'm looking for. Any SQLite support for the newer linux based RT targets?
  22. Definitely pursue a time-and-materials basis like ShaunR outlined. Make sure to have clear milestones with time and cost estimates (often our first milestone is purchasing all of the hardware), and then not only get a sign-off, but try to negotiate so that you get paid after each milestone. This serves two purposes: it keeps you from fronting too much time or money, and it is a very firm indicator that the customer agrees you have accomplished that milestone to their satisfaction (deliverable and actual cost). If you run into trouble down the road (total hours differ too much from the estimate, for example), you can point to that sign-off and payment as proof that you and the customer both agreed to progress on the project past that point. That being said, don't make milestones too small. Depending on the size and scope of the project, you might look at 20-80 hours of work. You don't want to spend a ton of time writing invoices and making them cut checks.
  23. Even simple FPGA code can take quite a lot of time to compile. There are various strategies for getting the most out of each compile and for minimizing compile time. You can use a Windows PC to program all the cRIOs. If you can use the Scan Engine, you can get a simple-to-moderate system working without too much pain. Adding the FPGA can add significant development time, especially if you have never programmed one before. Here is the NI development guide, which is a good resource as you get into more detailed questions: http://www.ni.com/compactriodevguide/