Cat Posted April 29, 2009 Author Report Share Posted April 29, 2009

QUOTE (neBulus @ Apr 28 2009, 08:43 AM) What was the method you used to narrow it down to the lvsound.dll? Not that I question that as the reason, but I am curious what signs other than the memory jumping got you looking at that dll?

Sadly, the method I used was trying everything else first.

First, while I was out in the field, I finally saw the error occur; that helped make a believer out of me! Not that I didn't believe something that had been reported more than once; it was just that, as it turned out due to the computer I was using, I had never seen the actual conditions under which it was occurring myself.

After that I spent a bit of time trying to figure out all the things that were different between how I run the software in the lab during development vs. how the users run it in the Real World. I set up a matrix of the 5 main differences, and rewrote/ran my software in several different configurations to test them all. I actually went into it thinking it could have been multiple things interacting, so I probably ran a few more tests than I actually needed.

Once I pinpointed the problem to running sound, I wrote an exe that would:

- Output 1 second's worth of data to sound: 2 channels, i16, via an array constant (pre-generated with the random number generator and scaled to the card's output range), at a rate of 44100.
- Repeat in a while loop until you get bored, or until you get an out-of-memory error.

I was running Task Manager and would record the memory usage of my little sound routine every half an hour. On my main test computer, it would be flat for 4 hours, and then around 4 1/4 hours start increasing at a rate of about 1 MB every few seconds. Very noticeable.

I then added a reset button to the routine. I would poke that button after the memory usage started spiking to reinitialize the sound, and it would run fine for another ~4 hours.
I then took out the old sound functions and put in the new ones. I ran that overnight and it was still chugging away the next morning.

That's pretty much it. As I said, I've tried this on multiple relatively new platforms (mostly varieties of IBM/Lenovo and Dell boxes) and gotten the same out-of-memory error. I've only tested it on one IBM T43, but from my experience over the past 5 years, I'm pretty sure it will run fine on all of them.

Cat
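Cat's harness was a LabVIEW VI, but the underlying leak-hunting pattern (replay one fixed, pre-generated buffer in a loop while periodically sampling process memory) can be sketched in Python. Everything here is illustrative, not Cat's actual code: `run_leak_check` and the no-op `play` callback stand in for the real sound-output call, and `ru_maxrss` is Unix-specific (Task Manager filled that role on Windows in the original test).

```python
import array
import random
import resource

def make_buffer(rate=44100, channels=2):
    # 1 second of pre-generated random i16 samples, interleaved
    # across channels (mirrors the array constant in Cat's exe).
    return array.array('h', (random.randint(-32768, 32767)
                             for _ in range(rate * channels)))

def peak_rss_kb():
    # Peak resident set size of this process (kilobytes on Linux).
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

def run_leak_check(play, iterations=100, sample_every=10):
    """Repeatedly 'play' the same buffer and log memory samples.

    A healthy sound layer should show a flat memory profile; a leaky
    one shows samples climbing steadily after some point.
    """
    buf = make_buffer()
    samples = []
    for i in range(iterations):
        play(buf)  # placeholder for the real sound-output call
        if i % sample_every == 0:
            samples.append(peak_rss_kb())
    return samples
```

In the real test the loop would run for hours with a reinitialize ("reset") path wired in, since the leak only appeared after roughly 4 hours of continuous output.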
ElijahKerry Posted April 29, 2009 Report Share Posted April 29, 2009

QUOTE (neBulus @ Mar 24 2009, 11:34 AM) 2) I believe it is Elijah Kerry from NI who is pushing those new tools. Since he is on the marketing side you can just ask for him when you call NI. Tell him that "Ben NI" said that "he (Elijah) can make recommendations on who to talk to to get the tool working the way it has to." Have your service request handy when you call for him since he can use that to escalate the call.

Yup, I'm the guy. I've brought your complaints to the attention of the development team, but any additional information you can provide to help us reproduce your problems would be extremely helpful. We've addressed a number of known issues for the LabVIEW 2009 release, which is available for evaluation at ni.com/beta. There were several known issues in version 1.0 related to filtering and parsing extremely large datasets, which have been improved.

Tracing a large application will quickly generate a tremendous amount of data, so we imposed the 1-million-item limit to prevent users from exhausting all the memory on their computers. However, this can be changed in the ini file, which should be located at 'C:\Program Files\National Instruments\MAX\Assistants\Trace Toolkit\TraceTool.ini'. The parameter is MaxEvents.

As with any software issue, if we can reproduce it here, we can fix it, so don't hesitate to let us know if you've encountered a specific problem.
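For illustration, raising that limit might look like the fragment below. Only the file path and the MaxEvents parameter name come from the post above; the value 5000000 and the bare key-value layout are assumptions, so check the actual file for any section headers before editing.

```ini
; C:\Program Files\National Instruments\MAX\Assistants\Trace Toolkit\TraceTool.ini
; Raise the event cap from the default 1,000,000 (example value only)
MaxEvents=5000000
```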
valhalla Posted April 29, 2009 Report Share Posted April 29, 2009

QUOTE (ElijahRock @ Apr 28 2009, 02:34 PM) Yup, I'm the guy. [full post quoted above]

Following up on what Elijah said, here are some more ideas for tracing large apps.

The filtering in 1.0 was not very efficient on large datasets, especially if you tried to filter out a lot of data. If that's your situation, the tool might appear hung while it's actually filtering the data. Filtering also doesn't reduce the amount of data being published by LabVIEW; rather, it removes the events from the view.

A better approach might be to reduce the number of events you trace: go to the "Configure Trace Session" dialog and trace only those events you think are needed. If you're looking for memory allocations, then that event along with VI execution might be what you choose. Another trick to reduce the CPU resources consumed by the Trace Tool is to turn the display refresh off while gathering data.
If you could tell us more about the crashes you experienced, that would be very useful; screenshots of any dialogs, etc., would help us narrow down the problem. Right now we don't have any known crashes, but as Elijah mentioned, several improvements have been made to the filtering of large datasets in the current beta.

-tychoc