sts123

Members
  • Posts

    28
  • Joined

  • Last visited

LabVIEW Information

  • Version
    LabVIEW 2025
  • Since
    2023

  1. I have a LabVIEW VI that opens a REST client and, inside a While Loop, sends multiple HTTP POST requests to read data from sensors. The responses are converted to numeric values, displayed on charts/indicators, and logged to an XLSX file using Set Dynamic Data Attributes and Write To Measurement File, with the loop rate controlled by a Wait (ms). For timestamps I use Get Date/Time In Seconds, wired to each Set Dynamic Data Attributes. The charts and displays appear to update correctly, but there are problems with the Excel logging. When the update rate is set higher than 1 sample/s, the saved magnitudes are wrong and are only correct at exact 1-second points (1 s, 2 s, 3 s, etc.); the values in between are incorrect. When the update rate is set to 1 sample/s, the values are initially correct, but after ~30 minutes the effective time interval starts drifting above 1 s. This looks like a buffering, timing, or logging issue rather than a sensor problem. I've attached the VI and would appreciate advice on how it should be restructured so that the values and timestamps written to Excel stay correct over long runs and at higher update rates. I am also attaching the VI where I tried to implement a buffer using a Feedback Node and a Case Structure; however, the Case Structure never executes because the buffer is 0 at the start. (A producer/consumer sketch is given after this list.) Json_getData_3.vi Json_getData_5_Buffer.vi
  2. I have this .vi that uses multiple tasks, with recordings saved into separate files for each task. Each task is meant to be recorded at a different acquisition rate. From data post-processing I realized that the tasks' data are not time-synced. Is there any quick solution to fix this issue in my .vi? RR_Trigs_1.vi
  3. So, is MATLAB or Python more efficient for converting TDMS to .txt?
  4. You were right. How can I check that there are no samples missing in between the splits? This is my exporting VI: to_Asci_3_range_2.vi
  5. It does not output the names as expected. to_Asci_3_range_2.vi
  6. My .vi works OK until the input file is about 1 GB in size. Beyond that it doesn't run and throws an error about memory (a streaming-read sketch follows this list). to_Asci_3_range.vi
  7. At the moment, when the TDMS is split, every individual file has its time stamps reset to 0 at the start. Is it possible to do it so that each file's time stamps continue without interruption from the original beginning? (A time-offset sketch follows this list.)
  8. Any thoughts on how I can add a header to the file with the same names as the channel names in the Task?
  9. I've done it like this. Not sure whether this is efficient and avoids missing samples during the split; I am using high sampling rates. RR_Spliting_2.vi
  10. This worked as intended. Many thanks.
  11. Not sure what that is. Can you please guide me?
  12. I tried using Format Into String, but then the resulting text file includes the time stamp 4 times on each line of data instead of just once.
  13. I have this .vi that converts the .TDMS to ASCII. However, at the moment the time format is "00:00.000001 5.2041 0.2388 -0.0004 0.9106" for each line. How can I write the time in seconds in scientific format so the data shows as "1e-6 5.2041 0.2388 -0.0004 0.9106"? Also, how can headers be added to the ASCII file? (A conversion sketch follows this list.) to_Asci_2.vi
  14. They use the same channels. OK, thanks. Good to know I cannot run two tasks with the same channels simultaneously.
  15. Thanks for the .vi. It seems the notifier is being passed to the new loop, since a new .tdms file is created, but the loop inside the Case Structure is not executing, so the data is not being written into that file. The count stays at 0.
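
One common restructuring for the timing and logging problem in item 1 is a producer/consumer design: one loop only acquires, a second loop only writes to disk, and a queue joins them (in LabVIEW, two While Loops with Obtain Queue / Enqueue Element / Dequeue Element). Since a VI cannot be shown as text, below is a minimal Python sketch of that structure; the sensor read is faked, and the 0.1 s period, run time, and log file name are placeholder assumptions, not values from the attached VI.

    # Producer/consumer sketch: the acquisition loop never waits on file I/O,
    # so the sample period stays stable even above 1 sample/s.
    import queue
    import random
    import threading
    import time

    sample_queue = queue.Queue()
    PERIOD_S = 0.1      # target update period (placeholder: 10 samples/s)
    RUN_FOR_S = 5.0     # how long the demo runs (placeholder)

    def read_sensor():
        """Stand-in for the HTTP POST / JSON parse done by the REST client in the real VI."""
        return random.uniform(0.0, 5.0)

    def producer():
        """Acquisition loop: sample at a fixed rate and enqueue (timestamp, value) pairs."""
        t_next = time.monotonic()
        t_stop = t_next + RUN_FOR_S
        while time.monotonic() < t_stop:
            sample_queue.put((time.time(), read_sensor()))
            t_next += PERIOD_S                               # schedule from the previous deadline,
            time.sleep(max(0.0, t_next - time.monotonic()))  # not from "now", so the period cannot drift
        sample_queue.put(None)                               # sentinel tells the consumer to stop

    def consumer():
        """Logging loop: drain the queue and write to disk; a slow write only grows the queue."""
        with open("log.tsv", "w") as f:
            f.write("timestamp\tvalue\n")
            while True:
                item = sample_queue.get()
                if item is None:
                    break
                f.write("%.3f\t%.4f\n" % item)

    threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

Applied to the attached VI, the same idea would replace the Feedback Node buffer: the logging loop can never block the acquisition loop, so the timestamp spacing holds at whatever rate the REST endpoint can sustain.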
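For the memory error in item 6, failing around 1 GB usually means the whole TDMS file is being loaded at once. Below is a sketch of a streaming conversion using the npTDMS Python package, which reads one data chunk at a time; the file names are placeholders, and it assumes all channels of the first group are written together so each chunk holds equal-length slices.

    # Streaming TDMS -> text: only one chunk is held in memory at a time.
    from nptdms import TdmsFile

    with TdmsFile.open("big.tdms") as tdms, open("big.txt", "w") as out:
        group = tdms.groups()[0]                          # metadata is available without loading data
        names = [ch.name for ch in group.channels()]
        out.write("\t".join(names) + "\n")                # header row of channel names
        for chunk in tdms.data_chunks():                  # iterate over raw data segments
            columns = [chunk[group.name][n][:] for n in names]
            for row in zip(*columns):
                out.write("\t".join("%.4f" % v for v in row) + "\n")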
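For item 7, keeping the time stamps continuous across split files only needs a running offset carried from one file to the next (a shift register in a LabVIEW loop). A tiny arithmetic sketch, with a made-up 1 kHz rate and file length:

    # Each new file starts where the previous one ended instead of at 0.
    SAMPLE_RATE = 1000.0        # Hz (placeholder)
    SAMPLES_PER_FILE = 5        # placeholder
    t_offset = 0.0              # carried across files, like a shift register

    for file_index in range(3):
        timestamps = [t_offset + i / SAMPLE_RATE for i in range(SAMPLES_PER_FILE)]
        print("file %d: first t = %e, last t = %e" % (file_index, timestamps[0], timestamps[-1]))
        t_offset = timestamps[-1] + 1.0 / SAMPLE_RATE    # next file continues from here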
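For the scientific-notation time and header questions in items 8 and 13, here is a Python sketch of the TDMS-to-ASCII conversion using the npTDMS package: it writes a header row of channel names and prints the time column in seconds with an %e format, so a sample at 1 µs comes out as 1.000000e-06. It assumes the first group's channels share one length and carry waveform timing properties (so time_track() works); the file names are placeholders.

    # TDMS -> ASCII with channel-name header and scientific-notation time column.
    from nptdms import TdmsFile

    tdms = TdmsFile.read("data.tdms")
    channels = tdms.groups()[0].channels()

    time_s = channels[0].time_track()                # time of each sample in seconds
    columns = [time_s] + [ch[:] for ch in channels]

    with open("data.txt", "w") as out:
        out.write("Time\t" + "\t".join(ch.name for ch in channels) + "\n")   # header
        for row in zip(*columns):
            out.write("%e\t" % row[0] + "\t".join("%.4f" % v for v in row[1:]) + "\n")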