Everything posted by Bruniii

  1. Do you mean 'orig' and 'json_string'? The second one, json_string, is the string that the Python API receives from LabVIEW. The LabVIEW VI in my previous screenshot is a minimal example, but imagine that the JSON string created by LabVIEW is transmitted to the Python API; the Python code is, again, a minimal example where the idea is that 'json_string' is the string received from LabVIEW through the API. When I try to convert the JSON string in Python, the string in convert['test'] is not equal to 'orig', the original string in the LabVIEW VI. Converting the same JSON string in LabVIEW, however, returns the original string.
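     A minimal sketch of where the mismatch comes from (the byte values below are made up for illustration): a LabVIEW string is a raw byte array, while json.loads in Python returns a Unicode str. If each byte above 0x7F ends up escaped as a \uXXXX carrying that byte's value, then encoding the decoded str back with latin-1 recovers the original bytes, because latin-1 maps code points 0-255 one-to-one onto bytes:

        import json

        # Hypothetical raw bytes standing in for the LabVIEW byte array
        raw = bytes([0xFD, 0xC1, 0x25, 0x56, 0x02, 0x58])

        # Decoding with latin-1 maps each byte to the code point of the same
        # value, so json.dumps escapes them as \u00fd, \u00c1, ... (one byte
        # per code point)
        json_string = json.dumps({"test": raw.decode("latin-1")})

        decoded = json.loads(json_string)["test"]   # a Unicode str, not bytes
        print(decoded.encode("latin-1") == raw)     # True: latin-1 round-trips bytes 0-255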
  2. Dear all, I have to transmit a JSON message with a string of chars to an external REST API (in Python or JavaScript). In LabVIEW I have an array of uint8, converted to a string with "Byte Array to String.vi". This string has the expected length (equal to the length of the uint8 array). When I convert a cluster with this string in the field "asd" to a JSON string, the resulting string in the field "asd" is (1) different and (2) longer. This is true with three different conversion functions (the NI Flatten To JSON, the JSONtext package, and the JKI JSON serializer). If I convert the JSON string back to a cluster in LabVIEW, the resulting string in "asd" is again equal to the original one, with the same number of chars. So everything is fine in LabVIEW. But the following Python code, for example, doesn't return the original string:

        import json
        orig = r'ýÁ%VÚõÏ}ŠX'
        json_string = r'{"test":"ýÃ%VÚõÃ}ÂÅ \u0002X\u001C\u0000d3"}'
        convert = json.loads(json_string)
        print(convert['test'])          # prints ýÃ%VÚõÃ}ÂÅ Xd3
        print(convert['test'] == orig)  # prints False

     Can anyone help me or suggest a different approach? Thank you, Marco.
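     One encoding-agnostic approach (a sketch of a swapped-in technique, not code from the thread) is to Base64-encode the raw bytes on the LabVIEW side before building the JSON, so only plain ASCII travels through the JSON layer, and to decode on the Python side:

        import base64
        import json

        # Hypothetical raw bytes standing in for the uint8 array from LabVIEW
        raw = bytes([0xFD, 0xC1, 0x25, 0x56])

        # Sender side: Base64 gives a pure-ASCII payload that any JSON library
        # and any transport will pass through untouched
        json_string = json.dumps({"test": base64.b64encode(raw).decode("ascii")})

        # Receiver side: decode the field back to the original bytes
        recovered = base64.b64decode(json.loads(json_string)["test"])
        print(recovered == raw)  # True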
  3. ...Maybe?! It works, but will it help? I guess I will have to wait another year, statistically... but thank you! It's the hardware watchdog of the industrial PC, but I have no control over the kick/reset/timeout, so there is no way to do any operation before the restart.
  4. To be clear, I'm not at all sure that Flush File will solve the corruption of the file in the main application; but NI presents Flush File as the "main recommendation" in this situation, hence I thought that adding this simple and small function would be super easy...
  5. Hi Neil, thank you. In the main application the ini file is closed immediately after the write, as you do. Still, three times in two years of running, after a restart of the PC (caused by a watchdog outside of my control) the ini file was corrupted, full of NULL chars. Googling for some help, I found the official NI support page that I linked, which describes exactly these events and suggests the use of Flush File. But, as you said, the refnum from the INI library doesn't look compatible with the input refnum of the flush function... and I'm stuck.
  6. Hi, in a larger application, a small ini file sometimes gets corrupted (a lot of NULL chars and nothing else inside), and it looks like it happens when the computer restarts while the application is running. I found this NI page: https://knowledge.ni.com/KnowledgeArticleDetails?id=kA03q000001DwgyCAC&l=it-IT where the use of "Flush File.vi" is suggested. But a very minimal and simple test like this one: returns Error 1, always. And I really cannot see what I'm supposed to do differently. It's probably something VERY stupid. Can anyone here help me? Thank you, Marco.
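     For illustration of what flushing buys at the OS level (a Python sketch of the general pattern, not the LabVIEW fix itself): the way a config write survives an abrupt restart is to flush the application buffer, fsync the OS cache, and replace the original file atomically, so a crash leaves either the old or the new file, never a half-written one:

        import os
        import tempfile

        def safe_write(path, text):
            """Write text to path so an abrupt restart cannot leave it half-written."""
            directory = os.path.dirname(os.path.abspath(path))
            fd, tmp = tempfile.mkstemp(dir=directory)
            with os.fdopen(fd, "w") as f:
                f.write(text)
                f.flush()              # push the application buffer to the OS
                os.fsync(f.fileno())   # push the OS cache to the disk
            os.replace(tmp, path)      # atomic swap on the same filesystem

        safe_write("settings.ini", "[section]\nkey=value\n")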
  7. The result of the "Self-Test" is "The self test completed successfully", but nothing changes. Even the reset is successful, but then I get the same error. Nothing strange in Device Manager, I think. You are correct, the chassis in this PC is the NI-9162, while on the other PCs the 9171 is used. I totally forgot about it... still, it worked for months without problems. May I ask how you found out? I cannot see any clue in the screenshots that I shared in this topic! Is the reset in MAX the only "reset" available? Should I remove something after uninstalling LabVIEW and before a new installation, to clean everything and start fresh, hopefully without this error?
  8. I have updated the DAQmx drivers to the same version installed on the working PCs, but nothing has changed and I get the same error even after multiple restarts of the PC:
  9. The PC with the error on the left; on the right, the same PC model with the same type of hardware (9171 + 9215) connected, in a different location, and still working. Honestly, I don't know why the 2021 Runtime is installed on the working one. My program is compiled with LabVIEW 2020. I will try to upgrade the DAQmx driver to 21.8 on the PC with the error. But everything else looks the same, to me.
  10. Hi all. After months of continuous running with zero problems, I'm now getting the error "-200557 Specified property cannot be set while the task is running. Set the property prior to starting the task, or stop the task prior to setting the property". First I got the error in my program (basically, a QMH acquiring data from a DAQ module). But I get the same error from the MAX Test Panel of the module, even after many restarts of the Windows PC and even after I uninstalled and reinstalled the DAQmx drivers and the LabVIEW Runtime. This is the error and the MAX window. The module is a DAQ NI 9215 mounted in a cDAQ-9171 USB chassis. On other PCs where the same LabVIEW program is running, with the same combination of hardware (9171 + 9215), the typical MAX device list looks like this: with the 9171 chassis listed and the 9215 listed as a board in the chassis. Unfortunately the PC and the boards are in a remote location and, at the moment, I'm not able to physically unplug the USB chassis or swap the board for a different one. Can somebody help me, suggesting something else I could try, having only remote access to the Windows PC? Thank you, Marco.
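     One thing that can be tried over a remote connection (a sketch using the nidaqmx Python package; the device name "cDAQ1" is an assumption, check the actual name in MAX): a programmatic device reset, the same operation as the reset button in MAX:

        import nidaqmx.system

        system = nidaqmx.system.System.local()
        for device in system.devices:
            print(device.name, device.product_type)  # list chassis/module names

        # "cDAQ1" is a placeholder -- use the name listed above / shown in MAX
        device = system.devices["cDAQ1"]
        device.reset_device()  # equivalent to the MAX reset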
  11. The data are created by LabVIEW 😅 then streamed to a remote server for storage. Now I was writing the VI that has to read the data back to perform some offline analysis. Yes, I have already changed the VI following ShaunR, and now it looks like this (ignore the "u32" label). Thank you. Marco.
  12. The array of floats is indeed an array of singles; after the "End-Of-Line" suggestion by dadreamer, my implementation is working fine. But yours is cleaner and very much appreciated. Thank you!
  13. 😶 That's a really nice point... because I hadn't thought about it enough, I guess. After disabling the End-Of-Line conversion, everything is fine; and I'm sure it will also be with Read from Binary File. Thank you. Marco.
  14. Dear all, I'm trying to load files with blocks of consecutive binary data; each block has the following structure: At first I tried a for loop to read the header of each block and then read the following "N samples" floats, for each iteration of the loop. But, every time, after some iterations, something goes wrong and the payload/header numbers become huge. Each file "breaks" at a different point. Below are two images: one is the first blocks (while everything is still fine), the second is the full converted file, where you can see the huge numbers. I have also tried a much simpler method; it's the section at the top of the snippet. The real problem is that the following Matlab code, running on the same file, has no problem and the converted array is as expected:

        fileID = fopen('file.bin');
        A = fread(fileID,[1010 600],'float','b');
        fclose(fileID);
        a = reshape(A(11:end,:),1,[]);
        figure, plot(a)

      Can anyone help me find the problem with the LabVIEW code? This is the binary file, if anybody wants to try: file.bin It fails after the block with index 46. Thank you, Marco.
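     For reference, a Python sketch mirroring the Matlab snippet (the block layout, 600 blocks of 1010 big-endian single-precision floats with the first 10 values of each block being the header, is read off the fread call above):

        import numpy as np

        # 600 blocks of 1010 big-endian float32 values each, as in the Matlab fread
        data = np.fromfile("file.bin", dtype=">f4").reshape(600, 1010)

        # Drop the 10 header values of every block and flatten the samples
        samples = data[:, 10:].ravel()
        print(samples.shape)  # (600000,)

     Any byte-preserving reader reproduces this; as noted in the follow-ups above, in LabVIEW the same result needs the End-Of-Line conversion disabled (or Read from Binary File).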
  15. Ah, sorry... The "analysis" code is already in a dedicated thread of the QMH framework. The time to acquire the 2D array of 10000x100 elements is 360 ms, while the total time to:
      • take one row (1D array of 100 elements)
      • take a small subset of 11 elements centered around the index of the max
      • spline the subset from 11 elements to 501
      • correlate with a reference array and search for the index of the max
      repeated 10000 times, is consistently more than the acquisition time, and the CPU usage is close to 80%. At the moment the only solution is to decimate the 2D array to 2000x100 elements or 5000x1000 elements. I have already timed all the other steps of the "analysis" and the interpolation is responsible for >90% of the total time.
  16. Thank you. I realized I made a big mistake: "Spline Interpolant.vi" and "Spline Interpolation.vi" are indeed calling a DLL primitive. I got confused with some other VIs that I'm using after the spline interpolation. Hence I think your comment is right: there is very little room to reduce the time required, at least on the same hardware. (Sorry for wasting your time.) Unfortunately I cannot use any information from the previous measurement.
  17. Hi, in my code I'm using "Spline Interpolant.vi" and "Spline Interpolation.vi" (the latter in a for loop) to interpolate an array of 11 elements into a final array of 501 elements. The time required (on my PC) is ~40 microseconds. I have to do the same operation thousands of times for every measurement and, every time, I also need to execute "Spline Interpolant.vi" because the original array changes. I have tried the other interpolation methods available in LabVIEW, but the time required is about the same or more; I don't know if there is anything else I can try to increase the speed of the interpolation. But, at the moment, the time budget between two consecutive measurements is very limited. I have looked at the code of the two VIs and, from my understanding, they are not calling a C/C++ primitive to calculate the 2nd derivative and interpolate; it looks like pure LabVIEW all the way down. I found this C++ implementation https://kluge.in-chemnitz.de/opensource/spline/ but it's a "header-only" library and I don't know how to call it from LabVIEW without a DLL file. Can anybody here help me? Can I expect a decrease in the required time? The linked library is just an example; I'm open to any other solution/library/DLL. Thank you. Marco.
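     As a rough reference point for what a compiled implementation achieves (a sketch, not LabVIEW code; SciPy's cubic spline is a different implementation than the linked header-only library), the same 11-to-501-point operation can be timed in Python:

        import numpy as np
        from scipy.interpolate import CubicSpline
        import timeit

        x = np.arange(11.0)              # 11 input points
        y = np.random.rand(11)           # placeholder data
        xs = np.linspace(0.0, 10.0, 501)

        # Build the interpolant and evaluate it at 501 points, as in the two VIs
        t = timeit.timeit(lambda: CubicSpline(x, y)(xs), number=10000) / 10000
        print(f"{t * 1e6:.1f} us per interpolation")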
  18. I missed the talk; is it possible to watch it on demand?
  19. Hello, first of all: I'm new to web servers in general (not only the LabVIEW web server)! Starting from the official examples, I have now managed to build a very simple web server with two methods and SSL communication enabled; it's only a small test that I'm using to understand the basics, then I will create the fully featured one and implement it in my main application. I'm stuck on the authentication aspect of the web server. I was able to create a group and a permission, and set a specific method to accept communication with this permission (auth with username/pw); but my understanding is that, choosing this type of authentication, only requests from Internet Explorer (the only browser that still supports Silverlight) or from a LabVIEW client will work. In the final application the requests to the LabVIEW Web Server will be performed from a client coded in Python. Now I'm trying to configure a LabVIEW Web Server with API-key-based authentication. Unfortunately, even a simple LabVIEW client cannot successfully communicate with the server. Here you can find some screenshots of the LabVIEW WS settings, the web-based Configuration and Monitoring, the block diagram of the client, and its front panel with the reply from the web server. The LabVIEW client works fine with the same GET string when the server has no authentication or the username/pw. When I try to use the API-key solution, the response to the client from the server is 403 - Forbidden. Any help? Thank you very much! Marco.
      Screenshots:
      • Web server: security of the method getDynamicData, "Require API key" selected.
      • Web server: URL mapping of the method getDynamicData.
      • Web-based Configuration and Monitoring: web services API key.
      • Block diagram of the LabVIEW client.
      • Front panel of the LabVIEW client.
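     For what it's worth, the eventual Python client will look something like the sketch below (the host, port, and URL are placeholders; note that LabVIEW's "Require API key" scheme expects a computed Authorization header, whose exact signing format is documented by NI and not reproduced here):

        import requests

        # Placeholder URL -- substitute the real host, port, and mapped method
        url = "https://localhost:8443/WebService1/getDynamicData"

        # With no authentication (or username/password auth handled elsewhere),
        # a plain GET works; with "Require API key" the request must also carry
        # the Authorization header computed per NI's API-key signing scheme.
        response = requests.get(url, verify=False)  # verify=False: self-signed test cert
        print(response.status_code, response.text)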