Rolf Kalbermatter
Everything posted by Rolf Kalbermatter

  1. So I did take a look, and yes, it was that function, but no, it doesn't only fail in 64-bit mode but also in 32-bit mode. So I'm a little lost as to why you feel it did work with a UNC path when using LabVIEW 32-bit. I'm going to do some more tests with this and try to clean up a few related things.
  2. Error 1 is the all-generic "invalid parameter" error. It could indeed be a problem in interpreting UNC paths somehow. I'll try to look into that. I haven't really run the code yet on a 64-bit system with UNC paths, but I see where the error 1 seems to come from when looking at the source code. It looks like LabVIEW has changed its stance about what a UNC path represents between 32-bit and 64-bit. I use the function FIsAPathOfType() to check that the passed-in path is an absolute path (I do not want to try to open relative paths, let alone empty or invalid ones, as I have no way to know what they should be relative to, and I find the idea of using the current directory an atrocity that has absolutely no place in a modern multithreaded application). I'm going to verify that this is the culprit, as it could also come from somewhere else, but it looks suspicious, and I know that internally UNC paths are treated as a different path type in LabVIEW; so far, however, LabVIEW considered them absolute too (which they are).
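To illustrate the distinction discussed above — this is not LabVIEW's actual FIsAPathOfType() logic, just a hedged sketch — Python's pathlib treats UNC paths as absolute, which is the behaviour one would expect:

```python
from pathlib import PureWindowsPath

def is_absolute_windows_path(p: str) -> bool:
    """True for absolute Windows paths, including UNC paths.

    PureWindowsPath considers a path absolute when it has both a drive
    (or a UNC server/share) and a root, so \\server\share\... qualifies
    while relative, empty, or drive-relative paths do not.
    """
    return PureWindowsPath(p).is_absolute()

print(is_absolute_windows_path(r"C:\temp\data.txt"))        # True
print(is_absolute_windows_path(r"\\server\share\data.txt")) # True (UNC)
print(is_absolute_windows_path(r"relative\data.txt"))       # False
print(is_absolute_windows_path(""))                         # False
```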
  3. Some people would say that that is your problem. Others that it is a bliss. 😀
  4. I'm pretty sure the .NET RTF control is not much more than a fairly simple wrapper around the actual RichTextEdit control, which is essentially a Windows Common Controls component. Pretty much all of the business logic is in the corresponding Windows DLL, and the API is exposed as macros around the Windows messages that you send to the control.
  5. There is a reason that it is still marked Beta (and likely will remain so for the foreseeable future). It is a telltale sign that even the RichTextEdit control, which is a Microsoft technology, has problems with that setting. Basically, enabling UTF-8 as a codepage feature would be a nice idea IF all Windows applications were properly prepared to work with codepages that can use more than 1 byte per character. But since the simple assumption of 1 byte == 1 character works for all English-speaking countries, many sins have been committed in this respect and nobody ever noticed. Enabling this feature tries to solve something that cannot really be solved, since there is simply too much cruft out there that will fail with it (and yes, LabVIEW also has areas where it will stumble over this). Linux is in that respect a bit better off. The Linux developers were never shy about simply abandoning something and confronting people with the facts: this is how we will do it from now on; take it or leave it, but don't complain if it doesn't work for you in the future if you do not want to follow the new standard. Most desktop distributions nowadays simply use UTF-8 as the standard locale throughout, pretty much what this setting would do under Windows. And distributions simply removed applications that could not deal with it properly.
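The 1 byte == 1 character assumption breaking under UTF-8 is easy to demonstrate; a small Python sketch (the example string is arbitrary):

```python
text = "Zürich"  # 6 characters, one of them outside ASCII

ansi = text.encode("cp1252")  # classic Western-European 8-bit codepage
utf8 = text.encode("utf-8")   # what the Beta setting feeds to the ANSI APIs

# Under cp1252 the old assumption holds: 1 byte per character.
print(len(text), len(ansi))  # 6 6
# Under UTF-8 it silently breaks: "ü" takes 2 bytes.
print(len(utf8))             # 7
```

Any code that sizes buffers or truncates strings by byte count falls over exactly here.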
  6. That grammar sounds almost as bad as what those Nigerian scammers use, who pretend to have embezzled a few million and are now eager to find someone who would be happy to take that money from them. 😀
  7. I don't know about VLAs. I never used one myself. Our company is on a Partner Software Lease contract, which has been an annual subscription-based license for as long as I can remember. In theory I don't have to care about all this as long as I'm employed at Averna, but I do care about LabVIEW and think it is a bad move for people who are not under such a company-provided license agreement with NI. It didn't help at all that the justifications NI gave for moving to a subscription-only license model almost all sound to me like marketing mumbo-jumbo that tries to turn the entire meaning of words upside down, or are outright misconstructed arguments. For normal perpetual licenses that is definitely how it works. If you make use of the NI offer to extend your expiring SSP for up to 3 years of subscription licensing for the price an SSP was in the past (about half of what a yearly subscription costs now), your perpetual license automatically converts to a subscription license. Alternatively, you could buy a new subscription license at full cost and let your existing SSP expire. In that case you keep the perpetual license from your old license, which gives you the right to install and use LabVIEW 2021, and in addition a subscription to the newest LabVIEW version for as long as you keep the subscription active. Once you let the subscription expire, you still have the perpetual license for LabVIEW 2021 but can't (easily) look at any VIs you may have created with newer LabVIEW versions under the subscription model. For VLAs a different solution may exist, but as I said, I never had to deal with VLAs myself and have absolutely no knowledge about them.
  8. Ahhh well! Yes, that was a choice I made at that point. Without a predefined length I have to loop with ever-increasing (doubling every time) buffer sizes to try to inflate the string. But each time I try with a longer buffer, the ZLIB decoder will start filling the buffer until it runs out of buffer space. Then I have to increase the space and try again. The comment is actually wrong. It ends up looping 8 times, which results in a buffer that is 256 times as large as the input. That should actually still work for data that was compressed by over 99.6%! The only thing I could think of is to increase the buffer even more aggressively than 2^(x+1), maybe 4^(x+1)? With the current 8 iterations that would allow an inflated buffer up to 65536 times as big as the input buffer. In each iteration the ZLIB stream decoder will work on more and more bytes, and then, if the buffer is too small, everything is thrown away and started over again. A real performance-intensive operation, and I also do not want to loop indefinitely, as there is always the chance that corrupted bits in the stream throw the decoder off in a way that makes it never terminate; your application would then loop until it eventually runs out of memory, which is a pretty hard crash in LabVIEW. So if you know that your data is going to be very compressible, you have to do your own calculation and specify a starting buffer size that is big enough. If you do this over the network, I would recommend prepending the uncompressed size to the stream anyhow. That really helps to not destroy the performance gain you tried to achieve with the ZLIB compression in the first place.
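The strategy described above can be sketched in Python with the zlib module. This illustrates the doubling approach with an iteration cap; it is not the actual OpenG library code, and the initial size and cap are arbitrary example values:

```python
import zlib

def inflate_growing(data: bytes, initial: int = 4096, max_tries: int = 8) -> bytes:
    """Inflate `data` without knowing the uncompressed size up front.

    Each attempt decompresses from scratch with a doubled output limit;
    the loop is capped so a corrupted stream cannot eat all memory.
    """
    limit = initial
    for _ in range(max_tries):
        d = zlib.decompressobj()
        out = d.decompress(data, limit)
        if d.eof and not d.unconsumed_tail:
            return out          # whole stream fitted within the limit
        limit *= 2              # too small: throw everything away, retry
    raise MemoryError("uncompressed data exceeds the iteration budget")

payload = b"very compressible " * 10_000     # ~180 KB of repetitive data
packed = zlib.compress(payload)
print(inflate_growing(packed) == payload)    # True
```

Note how every failed attempt redoes all the decompression work done so far, which is exactly the performance cost the post describes.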
  9. Without a more qualified statement about how you arrive at this conclusion, such as which numbers are used, there is no way I can believe this. If you look at other indicators such as participation in the various forums (NI, LavaG and LabVIEWForum.de), all I can say is that those numbers look VERY much lower than a few years back. So either all those new users that are added year over year are real experts who do not need any support of any kind, or NI has a secret support channel they can tap into that us mere mortals do not have, or something is totally off. The publicly visible exposure of LabVIEW, just like that of NI itself, has definitely diminished tremendously in the last 5 years. Maybe all those new users are inherent user licenses included with the semiconductor test setups that are sold. Buying LabVIEW on the website has recently become an almost impossible exercise, and so has getting informed quotes.
  10. There is another "little" culprit, and it's the most likely reason for this discrepancy. LabVIEW only uses 8-bit ASCII text and accordingly only posts so-called ANSI text (that's what Windows calls it when you use an 8-bit codepage encoding) to the clipboard. Notepad and Notepad++ are definitely Unicode applications. While they may enumerate clipboard data formats and only request ANSI if there is no Unicode string format in the clipboard, they almost certainly use the MultiByteToWideChar() Windows API to translate the text, and if they request Unicode anyhow, Windows will helpfully translate it for them using that same function. But this function terminates the conversion at the first occurrence of a NULL character. Most code doesn't bother to check whether the translation has consumed all the input bytes. It's also not trivial to do, as the function returns how many 16-bit code units it placed into the output buffer, but that does not have to match the number of input bytes, since some ANSI encodings can use more than one byte for some characters, and the UTF-16 encoding used in Windows can generate more than one 16-bit code unit for certain very rarely used characters. For instance, the MUSICAL SYMBOL G CLEF is outside of the 16-bit code range that UTF-16 can represent in a single code unit. So if you want to preserve input strings beyond an embedded NULL character, things get fairly hairy when using the Windows conversion function, as you would have to call it repeatedly on each individual text section that is separated by a NULL character. But trying to build your own conversion routine is an even worse idea. Nobody in their right mind wants to do encoding translations themselves. 😀
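What such a NUL-terminated conversion loses can be shown without the Windows API. Here is a Python sketch mimicking a caller that hands MultiByteToWideChar() a C string; the cp1252 codepage is just an example choice:

```python
def convert_c_string(buf: bytes, codepage: str = "cp1252") -> str:
    """Convert like C code passing the buffer as a NUL-terminated string
    (cbMultiByte = -1 in MultiByteToWideChar terms): everything after
    the first NULL byte is silently dropped."""
    return buf.split(b"\x00", 1)[0].decode(codepage)

clipboard = b"first section\x00second section"
print(convert_c_string(clipboard))   # first section  (the rest is gone)

# Preserving embedded NULs means converting each section separately,
# which is the repeated-call approach described above:
sections = [s.decode("cp1252") for s in clipboard.split(b"\x00")]
print("\x00".join(sections))         # both sections survive
```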
  11. My projects usually have one or two folders called Tests and Junk. Tests contains VIs that I create to test certain functionality. For instance, in a recent project I created a number of test VIs for various subVIs that I used in an FPGA program. These are typically not real tests in the sense of unit tests but more a test bed to easily run the VIs interactively and check functionality and improvements as well as the behaviour of the various functions. Into Junk go VIs that I sometimes create for a quick and dirty test of some function, occasionally also VIs that I might create to help with a forum post while waiting for the FPGA compiler or some tests to finish. Outside of these two folders there is usually almost never any unused VI. I make a point of regularly checking for VIs that are no longer used and simply deleting them (or sometimes moving them into Junk if I think there might be some future possibility that they are needed again); most are leftovers from earlier attempts at VIs that have since been reworked into the program, so they can safely go away. And of course everything gets regularly checked into version control, with some more or less useful commit message. 😀
  12. Hmmm, clipboard copy! That has a very good chance of something trying to be smart and reformatting the text. I would definitely drag the entire control with all the data from one VI to the other, which should avoid Windows trying to be helpful. For a control, LabVIEW puts an application-private format in the clipboard together with an image of the control. LabVIEW itself can pull the private format out of the clipboard; other applications will not understand that format and pull the image instead. If you only select the text, LabVIEW stores it as normal ASCII text in the clipboard, and Windows may try to do all kinds of things, including translating it to proper Windows text, which could replace all \r "characters" with \r\n. There is even the chance that the text goes through an ASCII to UTF-16 and back to ASCII conversion on the way through the clipboard, and that is not always a fully 100% lossless round trip, even though the results may look visually the same. Text encoding translations are a total PITA to fully understand.
  13. I can't guarantee that there is not some problem somewhere in a function, but I didn't find anything in my testing. How did you copy the deflated string? As binary data or as a string? If as a string, are you sure your transfer mechanism didn't do some text translation, such as an automatic \n to \r\n conversion? Did you use the LabVIEW Text File Read and Write functions to write your strings? A deflated stream is not a text string but a byte stream, no matter that LabVIEW lets you display it as a string. That is not a problem for LabVIEW itself, as it does not use special characters such as a terminating NULL character. But if you are not careful and use the Text File Write and Read functions in line-conversion mode, your binary stream of course gets modified, and that destroys the integrity of the binary information that the inflate algorithm expects (and checks with CRCs too).
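The corruption caused by line-ending translation is easy to reproduce with Python's zlib module (a sketch of the failure mode, not the OpenG code itself):

```python
import zlib

payload = b"some binary payload, repeated to make it compressible\n" * 64
stream = zlib.compress(payload)

# Simulate a text-mode transfer that rewrites every \n (0x0A) byte in the
# *compressed* stream as \r\n -- what a line-conversion file mode does.
mangled = stream.replace(b"\n", b"\r\n")

if mangled == stream:
    print("this particular stream happened to contain no 0x0A bytes")
else:
    try:
        zlib.decompress(mangled)
        print("stream unexpectedly survived")
    except zlib.error as exc:
        print("inflate rejected the modified stream:", exc)

print(zlib.decompress(stream) == payload)   # True -- untouched stream is fine
```

The failure is typically an "incorrect data check" or similar zlib error, because either the deflate data or its checksum no longer matches.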
  14. Thanks for the feedback. I'll have to check how it would be possible to create a package that can be added as a stream in NI-MAX.
  15. It actually made me poke at both accounts here on LavaG, and the only posts made by either of them were clearly promoting this product and nothing else. This despite being told in one of the threads that it is only very loosely related to the thread at hand, as the original post was about an open-source alternative, which this clearly isn't. And in two threads the last reactions from both were within 24 hours. Agreed, one is a follow-up post from Omer to the post by Elena, but it is still all very marketing style. As I said, I consider it borderline spammy, not actual spam, otherwise I would have reported it.
  16. Not so interesting! Both Elena and Omer have so far only posted to promote the TVI framework. Omer supposedly as an employee of the company that sells TVI, and with Elena I'm not so sure, but it could also be someone from their marketing department. 😀 While technically related, it borders on spam given the repeated marketing-style postings.
  17. LabVIEW real-time support only exists in the never officially released OpenG ZIP Library 4.1. That package is only available as a download from earlier in this discussion thread here on LavaG, and over at the NI forum I believe.
  18. All I can say is that you are operating in the LabVIEW attic, which has many rusty nails sticking out and bare uninsulated live copper wires. 😀 As Greg McKaskle once said in somewhat different words about this attic: we make reasonable efforts to protect the innocent from getting into it and getting hurt, but if someone is determined to go in there, they should not complain if they get hurt somehow.
  19. Well, the OpenG ZIP tools don't really consist of a lot of elements on the real-time side. Basically you have the shared library itself, called liblvzlib.so, which should go into /usr/local/lib/liblvzlib.so, and then you need to somehow make sure to run ldconfig so that this new shared library gets added to the ldcache file. When you install the Beta version of the ZIP Tools package, you should get a prompt at some point for administrative login credentials (or an elevation dialog if you are already logged in as administrator), which is caused by the ogsetup.exe program being launched as a PostInstall hook of the OpenG package. This ogsetup.exe program does nothing more than extract the different shared libraries into C:\Program Files (x86)\National Instruments\RT Images\OpenG ZIP Tools\4.2.0. Depending on your target, you need to copy the appropriate liblvzlib.so from either the LinuxRT_arm or LinuxRT_x64 subdirectory to /usr/local/lib/liblvzlib.so on your target. That should be all that is needed, although sometimes it can also be necessary to run ldconfig on a command line so that the new shared library gets added to the ldcache for the ELF loader. With the old installation method in NI-MAX this was taken care of by the installer, based on the corresponding *.cdf file in the OpenG ZIP Tools\4.2.0 directory. I tried to check out the NI Package Builder but can't see how one would make a package for RT targets. I also only see the 20.5 and 20.6 versions of the NI Package Builder as the latest; maybe that is why?
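For the target-selection step, a small hedged Python sketch: the subdirectory names come from the description above, while the `uname -m` strings "armv7l" and "x86_64" are my assumption for what the ARM and x64 Linux RT targets typically report:

```python
def rt_library_subdir(machine: str) -> str:
    """Map a target's `uname -m` output to the shared-library
    subdirectory found under the extracted OpenG ZIP Tools\\4.2.0
    directory (LinuxRT_arm or LinuxRT_x64)."""
    if machine.startswith("arm"):
        return "LinuxRT_arm"
    if machine in ("x86_64", "amd64"):
        return "LinuxRT_x64"
    raise ValueError(f"no liblvzlib.so build for architecture {machine!r}")

print(rt_library_subdir("armv7l"))   # LinuxRT_arm
print(rt_library_subdir("x86_64"))   # LinuxRT_x64
# Then, on the target itself (as admin/root), the post's steps amount to:
#   cp <subdir>/liblvzlib.so /usr/local/lib/liblvzlib.so && ldconfig
```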
  20. So is there a problem with installing the OpenG ZIP tools for LabVIEW 2020 and/or 2021 on a cRIO at all, or was that inquiry from Jordan about something else? I cannot place packages on the NI opkg streams, so yes, this library will always have to be sideloaded through NI-MAX in some way, unless you want to copy it over by hand into the right directory, create the necessary symlinks on a command line, and then run ldconfig. 😀
  21. I would guess that it uses the standard MgErr codes: 5 - file already open; 8 - permission error, which might be a follow-up error; 6 - file I/O error, which means that something happened during reading or writing of a file for which the programmer did not find another error to map it to.
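As a hedged sketch of that guess: the descriptions below paraphrase this post, and the symbolic names follow my recollection of LabVIEW's extcode.h, so treat them as an assumption rather than an official NI reference:

```python
# MgErr codes as interpreted in the post above; not official NI docs.
MGERR_FILE_CODES = {
    1: "mgArgErr: generic invalid-parameter error",
    5: "fIsOpen: file already open",
    6: "fIOErr: generic file I/O error (no more specific code applied)",
    8: "fNoPerm: file permission error (possibly a follow-up error)",
}

def describe_mgerr(code: int) -> str:
    """Look up a human-readable description for a known MgErr code."""
    return MGERR_FILE_CODES.get(code, f"unmapped MgErr code {code}")

print(describe_mgerr(5))   # fIsOpen: file already open
```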
  22. Well, VIPM originated from the OpenG Package Manager and OpenG Package Builder, which had several different incarnations before the project was more or less abandoned and then taken over by JKI to build VIPM from. However, VIPM is not just a somewhat niced-up OpenG Package Builder but uses a rather different concept for building packages. The OpenG Builder simply listed all the files and gave you various options to change, for each file, how it needs to be installed and where. Very flexible, but also quite complicated to keep an overview of. VIPM changed that to let you create file groups that are all handled as a whole. Much simpler to manage and configure for the user, but somewhat more limited. It does the job perfectly for 99% of the users, though. Very few people want fine-grained configuration down to the file level. The only gripe I have with VIPM is that it still doesn't natively support including both 32-bit and 64-bit versions of files and having them installed depending on the LabVIEW version into which the package is installed. As to trying to use the OpenG Package Manager (and/or OpenG Package Builder) for Linux, I'm afraid that is going to be a major project on its own. Those two were never very actively used and tested on anything but Windows, except maybe by some Mac OS X enthusiasts once LabVIEW ran on those machines. It always took some time for LabVIEW to support new non-Windows versions after they became available.
  23. Actually, I have to admit that I have no idea how this works in 2020 and/or 2021. I haven't worked with these versions together with RIO projects so far. I know they changed the package format for installing software packages from CDF to the nipkg format with either 2020 or 2021, but I have no idea how one would go about building such a package. I checked the NI Package Builder, but so far it only seems to support building packages for Windows installations. I'm not sure how the package part for the RT distribution would work.
  24. I would not know if this type of low-level control is even possible with the MPSSE controller in the chip. That is something you will have to take up with FTDI customer support, as it is really very deep down in the belly of the MPSSE.