I have an executable application running on an XP netbook, controlling a USB DAQ device from NI (I don't have the model number on hand). It is attempting to do things on the order of tens of milliseconds and, as you might expect, the timing is terrible. I understand that, but the intervals are also coming out too short, which I don't understand at all. The application sets some digital channels to one state for 200 ms, changes the state for another 300 ms, and repeats this over and over. The timing is all over the place but averages about 60% of what it is programmed to be.
Any idea how bad timing winds up shortening these intervals? I am using the Wait Until Next ms Multiple function.
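In case it helps, here is roughly what the loop is doing, written as a Python sketch since I can't attach the VI (the actual application is LabVIEW; `set_digital_channels` is a stand-in for the real NI-DAQmx digital write, and the timing measurement is just for illustration):

```python
import time

def set_digital_channels(state):
    # Stand-in for the actual NI-DAQmx digital output write
    pass

def run_cycle(cycles):
    """Hold one state for 200 ms, the other for 300 ms, and repeat,
    measuring how long each 500 ms cycle actually takes."""
    durations = []
    for _ in range(cycles):
        start = time.perf_counter()
        set_digital_channels(True)
        time.sleep(0.200)   # intended 200 ms in the first state
        set_digital_channels(False)
        time.sleep(0.300)   # intended 300 ms in the second state
        durations.append(time.perf_counter() - start)
    return durations
```

On a desktop OS each cycle should come out *at or above* 500 ms (sleeps can overshoot but not undershoot), which is why I'm confused that my measured intervals average shorter than programmed.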
Any help, advice, etc. would be greatly appreciated.