
Raymond Tsang

Members · 36 posts · Last visited: Never

  1. Hi LAVAs, I want to write a VI which "touches" a file. By "touch", I mean updating the "last modified time" of a file without modifying its content. Using System Exec.vi, I made a simple VI (screenshots attached), but it doesn't do what it's supposed to do, and it returns no error. I'm really puzzled. Any ideas? Ray
  2. Thanks for your advice. I've ordered a new card... a PCIe-6535 with more on-board memory (64k).
  3. QUOTE (jdunham @ Mar 3 2009, 07:50 AM) Thanks for your help, but it looks like it doesn't "double pulse". I'll play with it for a while, or maybe just buy a new card, e.g. a PCIe-6535, as a last resort. Ray
  4. Hi all, I'm using a PCI-6250 to pulse an LED at ~1 kHz. The pulse train looks like: 1 0 1 0 0 0 0 ... (all zeros until another pulse ~1 ms later). Each bit is 100 ns. A simple calculation tells me that I need to put 10k bits into the on-board memory every 1 ms, which is more than the 2k-bit limit on the board. (BTW, I can't find any DIO board with on-board memory > 4k.) If I don't use the on-board memory, LabVIEW complains that the data transfer is not fast enough. Since from the 4th bit onwards the pattern is just plain zeros, is there a way to work around the memory limit problem? Thanks so much! Ray
  5. QUOTE (JamesP @ Feb 19 2009, 07:05 AM) Thanks for your suggestion, James! I tried to remove "clear task" and added "create task" with "auto cleanup" disabled. However, after the VI ended, the LED pulsing stopped, and when I started it again, LabVIEW complained about the "resource being reserved" or something like that. I'll keep trying... Ray
  6. Hi LAVAs, I encountered some problems while trying to generate digital pulse trains to drive an LED via a PCI-6250 card. I want to implement it as two separate VIs: one starts pulsing the LED, the other stops it. (I tried setting "Continuous Samples" in DAQmx Timing.vi, but the pulsing stops when the VI quits.) What I can do now is generate one pulse train per run of the VI. (Screenshot attached.) Any thoughts or suggestions? Ray
  7. QUOTE (giopper @ Aug 30 2008, 01:04 PM) Thanks! I'll try it tomorrow when I'm in the lab. I never expected this post could become this long... I really appreciate the help from you guys!! Thankful Ray
  8. QUOTE (Götz Becker @ Aug 28 2008, 10:22 PM) Nice info! However, I think I already know exactly where the memory leak is: it leaks whenever I convert a number to a string, which I cannot avoid. Could it be a bug in LabVIEW 8.5, however unlikely that seems? Do you guys think the memory leak will go away when I compile it as an application? Ray
  9. QUOTE (normandinf @ Aug 23 2008, 07:44 AM) Which version of LV did you use? Can you try really running it for, say, 10-12 hours at 1 Hz? I also had no problem with "accelerated" data taking. I really appreciate your help! Ray
  10. At last, I managed to capture a screenshot of the error... It looks like a LabVIEW internal error. Any idea what this means? Ray
  11. QUOTE (normandinf @ Aug 23 2008, 06:44 AM) You don't have to be sorry! I really appreciate your help, and of course your time and effort in trying out my program! Now I have ruled out a memory leak as the cause, because I don't see any further increase in memory usage over time, using "Profile Performance and Memory". I tried running the program overnight again. The file I/O error occurred, as always, between midnight and 1 am, which is a common time when people surf the net before going to sleep. So I suspect that maybe somebody is using VNC or other software to connect to the DAQ computer. As to whether the problem is system-specific, I will run a disk check... or what else can you think of? Desperate Ray
  12. QUOTE (normandinf @ Aug 23 2008, 06:53 AM) Many thanks! Here it is. This is the unsimplified version.
  13. QUOTE (neB @ Aug 22 2008, 10:05 PM) Thanks! Fascinating! Ray
  14. QUOTE (normandinf @ Aug 22 2008, 10:33 AM) Even if I unwire the graph, the leak is still there. In fact, I found that whenever I convert the cluster into a string for writing to file in "datalogger", I get a memory leak in the "main program", as seen in "Profile Performance and Memory". I tried the following: 1. unbundle cluster, cluster to array, array to spreadsheet string; 2. unbundle cluster, format string; 3. cluster to array, array to spreadsheet string, concatenate strings; 4. number to fractional string, concatenate strings; and other permutations. All leaking... Is passing the cluster from the main program to the datalogger causing it? Ray the Leaker
  15. I think the problem I have is a memory leak. I turned off each component of the VI one by one (using a case structure wired with a FALSE), and looking at "Profile Performance and Memory" I found that the leak comes from a "Cluster to Array". Now the problem is: how do I convert a cluster to an array without using "Cluster to Array"??? :headbang: Puzzled Ray
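On the "touch" question in post 1: a minimal sketch of what "touch" does, written in Python rather than LabVIEW (the filename below is illustrative). Note that Windows' cmd.exe has no built-in `touch` command, which could explain why a System Exec.vi call silently does nothing there:

```python
import os

def touch(path):
    """Update a file's modification and access times to now,
    creating the file if it does not exist (like Unix `touch`)."""
    # Open in append mode so any existing content is left untouched.
    with open(path, "a"):
        pass
    # None means "set both access and modification time to the current time".
    os.utime(path, None)

touch("experiment_log.txt")  # hypothetical filename
```

Such a script could be run from System Exec.vi, or the timestamp could be updated natively with the File I/O palette's "Set File Information"-style functions, depending on the LabVIEW version.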
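The buffer arithmetic in post 4 can be double-checked in a few lines. This is plain Python, not DAQmx; the numbers (100 ns per sample, 1 kHz pulse rate, 2k-sample on-board buffer) are taken straight from the post:

```python
bit_time_s = 100e-9      # each digital sample lasts 100 ns
pulse_period_s = 1e-3    # one pulse every ~1 ms (1 kHz pulse rate)

# Samples needed to describe one full period of the pattern
samples_per_period = round(pulse_period_s / bit_time_s)
print(samples_per_period)  # 10000 -- the "10k bits per 1 ms" from the post

onboard_limit = 2048     # the card's 2k-sample on-board buffer
print(samples_per_period > onboard_limit)  # True: one period overflows the buffer
```

Since only the first few samples are nonzero, the usual workaround on hardware that supports it is to let the driver regenerate a single stored period from the on-board buffer instead of streaming the full waveform; whether that is available depends on the card.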
