Everything posted by ShaunR
-
The intent was to tell you the software was free (as in BSD licence) and that, if you had downloaded it, it contains a zlib binary that isn't wrapped.
-
I've never understood the "free (as in beer)" or "free (as in speech)" internet vocabulary. Beer costs money and speech has a cost. It's BSD3 and cost my time and effort, so it definitely wasn't "free". Rolf's works on other platforms so you should definitely use that, but if you wanted to play around with functions that aren't exported in Rolf's, there is a zlib distribution with a vanilla zlib binary to play with while you wait for a new OpenG release.
-
Yes. Rolf likes to wrap DLLs in his own DLL (a philosophy we disagree on). I use the vanilla zlib and minizip in Zlib Library for LabVIEW, which has all the functions exposed.
-
With zlib you just deflateInit, then call deflate over and over, feeding in chunks, and then call deflateEnd when you are finished. The size of the chunks you feed in is pretty much up to you. There is also a compress function (and the corresponding uncompress) that does it all in one shot, which you could feed each frame to. If by fixed/dynamic you are referring to the Huffman table, then there are certain "strategies" you can use (DEFAULT_STRATEGY, FILTERED, HUFFMAN_ONLY, RLE, FIXED). FIXED uses a predefined Huffman code table.
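For anyone more comfortable reading the call sequence as text, here is a minimal C sketch of that deflateInit / deflate / deflateEnd loop (modelled on zlib's own zpipe example; Z_FIXED is passed just to show where the strategy goes, and error handling is abbreviated):

```c
#include <stdio.h>
#include <string.h>
#include <zlib.h>

#define CHUNK 16384  /* the chunk size is up to the caller */

/* Compress everything from `in` to `out` with zlib's streaming API. */
int compress_stream(FILE *in, FILE *out)
{
    unsigned char inbuf[CHUNK], outbuf[CHUNK];
    z_stream strm;
    memset(&strm, 0, sizeof strm);  /* zalloc/zfree/opaque = Z_NULL */

    /* deflateInit once; Z_FIXED selects the predefined Huffman table */
    if (deflateInit2(&strm, Z_DEFAULT_COMPRESSION, Z_DEFLATED,
                     15, 8, Z_FIXED) != Z_OK)
        return -1;

    int flush;
    do {
        /* feed in a chunk */
        strm.avail_in = (uInt)fread(inbuf, 1, CHUNK, in);
        strm.next_in  = inbuf;
        flush = feof(in) ? Z_FINISH : Z_NO_FLUSH;

        do {  /* call deflate until this chunk is fully consumed */
            strm.avail_out = CHUNK;
            strm.next_out  = outbuf;
            deflate(&strm, flush);
            fwrite(outbuf, 1, CHUNK - strm.avail_out, out);
        } while (strm.avail_out == 0);
    } while (flush != Z_FINISH);

    deflateEnd(&strm);  /* deflateEnd when finished */
    return 0;
}
```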
-
How to load a base64-encoded image in LabVIEW?
ShaunR replied to Harris Hu's topic in LabVIEW General
OP is using LV2019. Nice tool, though. Shame they don't ship the C source for the DLL, but they do have it on their GitHub repository. -
How to load a base64-encoded image in LabVIEW?
ShaunR replied to Harris Hu's topic in LabVIEW General
Nope. It needs someone better than I. -
How to load a base64-encoded image in LabVIEW?
ShaunR replied to Harris Hu's topic in LabVIEW General
While we are waiting for Hooovah to give us a Huffman decoder ... most of the rest seems to be here: Cosine Transform (DCT), sample quantization, and Huffman coding and here: LabVIEW Colour Lab -
How to load a base64-encoded image in LabVIEW?
ShaunR replied to Harris Hu's topic in LabVIEW General
There is an example shipped with LabVIEW called "Image Compression with DCT". If one added the colour-space conversion and quantization, and changed the order of encoding (entropy encoding) and Huffman RLE, you'd have a JPG [En/De]coder. That'd work on all platforms. Not volunteering; just saying.
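For scale, the colour-space conversion piece is tiny. A per-pixel C sketch of the JFIF RGB-to-YCbCr step (standard BT.601 coefficients; casts assume valid 8-bit RGB input):

```c
#include <stdint.h>

/* JPEG/JFIF RGB -> YCbCr conversion for one pixel (BT.601 coefficients).
   For valid 8-bit RGB the results already land in [0, 255]. */
void rgb_to_ycbcr(uint8_t r, uint8_t g, uint8_t b,
                  uint8_t *y, uint8_t *cb, uint8_t *cr)
{
    *y  = (uint8_t)( 0.299    * r + 0.587    * g + 0.114    * b);
    *cb = (uint8_t)(-0.168736 * r - 0.331264 * g + 0.5      * b + 128);
    *cr = (uint8_t)( 0.5      * r - 0.418688 * g - 0.081312 * b + 128);
}
```
-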
How to load a base64-encoded image in LabVIEW?
ShaunR replied to Harris Hu's topic in LabVIEW General
LabVIEW can only draw a PNG from a binary string, using the PNG Data to LV Image VI (you'd need to base64-decode the string first). I think there are some hacky .NET solutions kicking around that should be able to do JPG if you are using Windows.
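As an aside, the base64-decode step itself is trivial in most languages. A C sketch using OpenSSL's EVP_DecodeBlock (assuming a clean, unwrapped base64 string and a caller-supplied buffer; error handling abbreviated):

```c
#include <string.h>
#include <openssl/evp.h>

/* Decode a NUL-terminated base64 string into `out`; returns 0 on success.
   `out` must be at least 3 * strlen(b64) / 4 bytes. */
int base64_decode(const char *b64, unsigned char *out, int *out_len)
{
    int in_len = (int)strlen(b64);
    int n = EVP_DecodeBlock(out, (const unsigned char *)b64, in_len);
    if (n < 0)
        return -1;

    /* EVP_DecodeBlock counts the '=' padding as data; subtract it */
    if (in_len >= 1 && b64[in_len - 1] == '=') n--;
    if (in_len >= 2 && b64[in_len - 2] == '=') n--;

    *out_len = n;
    return 0;
}
```
-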
It just didn't. Like I said, I only got the out-of-memory when I was trying to load large amounts of data. I suppose you could consider that a crash, but there was never any instance of LabVIEW just disappearing like it does nowadays. I only saw the "insane object" two or three times in my whole Quality Engineering career, and LabVIEW certainly didn't take down the Windows OS like some of the C programs did regularly. But I can understand you having different experiences. I've come to the conclusion, over the years, that my unorthodox workflows and refusal to be on the bleeding edge of technology shield me from a lot of the issues people raise.
-
Gotta disagree here (surprise!). Take a look at the modern 20k-VI monstrosities people are trying to maintain now so as to be in with the cool cats of POOP. I had a VI for every system device type and, if they bought another DVM, I'd modify that exe to cater for it. It was perfect modularisation at the device level, and modifying the device made no difference to any of the other exes. Now THAT was encapsulation, and the whole test system was about 100 VIs. They are called VIs because it stood for "Virtual Instruments", and that's exactly what they were; I would assemble a virtual test bench from them.

Definitely 3). LabVIEW was the first programming language I ever learned, when I was a quality engineer, so as to automate environmental and specification testing. While I (Quality Engineering) was building up our test capabilities, we would use the Design Engineering test harnesses to validate the specifications. I was tasked with replacing the Design Engineering white-box tests with our own black-box ones (the philosophy was to use dissimilar tools to the Design Engineers and validate code paths rather than functions, which their white-box testing didn't do). Ours were written in LabVIEW and theirs was written in C. I can tell you now that their test harnesses had more faults than the Pacific Ocean. I spent 80% of my time trying to get their software to work and another 10% getting it to work reliably over weekends. The last 10% was spent arguing with Engineering when I didn't get the same results as their specification.

That all changed when moving to LabVIEW. It was stable, reliable and predictable. I could knock up a prototype in a couple of hours on Friday and come back after the weekend and look at the results. By first break I could wander down to the design team and tell them it wasn't going on the production line. That prototype would then be refined, improved and added to the test suite. I forget the actual version I started with, but it was on about 30 floppy disks (maybe 2.3 or around there). If you have seen desktop gadgets in Windows 7, 10 or 11, then imagine them, but they were VIs. That was my desktop in the 1990s: DVM, power supply and graphing desktop gadgets that ran continuously, and I'd launch "tests" to sequence the device configurations and log the data.

I will maintain my view that the software industry has not improved in decades and that any and all perceived improvements are parasitic on hardware improvements. When I see what people were able to do in software with 1960s and '80s hardware, I feel humbled. When I see what they are able to do with software in the 2020s, I feel despair.

I had a global called the "BFG" (Big F#*king Global). It was great. It was when I was going through my "Data Pool" philosophy period.
-
The thing I loved about the original LabVIEW was that it was not namespaced or partitioned. You could run an executable and share variables without having to use things like memory maps. I used to have a toolbox of executables (DVM, power supplies, oscilloscopes, logging, etc.) and each test system just launched the appropriate executable[s] at the appropriate times. It was like OOP composition for an entire test system, but with executable modules. Additionally, crashes were unheard of. In the 1990s I think I had 1 insane object in 18 months, and I didn't know what a GPF was until I started looking at other languages. We could run out of memory if we weren't careful, though (remember the Bulldozer?). Progress!
-
Tell that to Microsoft. Again: tell that to Microsoft. I'm afraid the days of preaching from a higher moral ground on behalf of corporations are very much a historical artifact right now.
-
LabVIEWs response time during editing becomes so long
ShaunR replied to MikaelH's topic in LabVIEW General
I think, maybe, we are talking about different things. Exposing the myriad OpenSSL library interfaces using CLFNs is not the same thing that you are describing, although multiple individual calls can be wrapped into a single wrapper function to be called by a CLFN (create a CTX, set it to a client, add the certificate store, add the BIOs, then expose that as "InitClient" in a wrapper, say). That is different from what you are describing, and I would make a different argument. I would, maybe, agree that a wrapper dynamic library would be useful for Linux, but on Windows it's not really warranted. The issue I found with Linux was that the LabVIEW CLFN could not reliably load local libraries in the application directory in preference to global ones, and global/local CTX instances were often sticky and confused. A C wrapper should be able to overcome that, but I'm not relisting all the OpenSSL function calls in a wrapper. However, the biggest issue with the number of VIs overall isn't to wrap or not; it's build times and package-creation times. It takes VIPM 2 hours to create an ECL package, and I had to hack the underlying VIPM oglib library to get it down to that. Once the package is built, however, it's not a problem. Installation with mass compile only takes a couple of minutes, and the impact on the users' build times is minimal.
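To illustrate the kind of folding I mean, here is a hypothetical "InitClient" wrapper in C. The function name and parameters are illustrative, not from ECL or any shipped library, but every OpenSSL call is real:

```c
#include <openssl/ssl.h>

/* Hypothetical wrapper: several OpenSSL calls folded into one export so a
   single CLFN can call it. Per-connection BIO setup would follow separately. */
SSL_CTX *InitClient(const char *ca_file)
{
    SSL_CTX *ctx = SSL_CTX_new(TLS_client_method());  /* create a CTX, client side */
    if (!ctx)
        return NULL;

    /* add the certificate store: a specific CA bundle, or the system defaults */
    int ok = ca_file ? SSL_CTX_load_verify_locations(ctx, ca_file, NULL)
                     : SSL_CTX_set_default_verify_paths(ctx);
    if (!ok) {
        SSL_CTX_free(ctx);
        return NULL;
    }

    SSL_CTX_set_verify(ctx, SSL_VERIFY_PEER, NULL);   /* enforce peer verification */
    return ctx;  /* opaque handle passed back through the CLFN */
}
```
-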
LabVIEWs response time during editing becomes so long
ShaunR replied to MikaelH's topic in LabVIEW General
Yes. But not a good enough reason. -
LabVIEWs response time during editing becomes so long
ShaunR replied to MikaelH's topic in LabVIEW General
You may say that, but ECL alone is about 1,400 VIs. If each DLL export is a VI, then realising the entire export table of some DLLs can create hundreds of VIs alone. However, I think the OP's project is probably OOP. Inheritance and composition exponentially balloon the number of VIs - especially if you stick to strict OOP principles. -
LabVIEWs response time during editing becomes so long
ShaunR replied to MikaelH's topic in LabVIEW General
Ooooh. What have you been doing with the icon editor? -
LabVIEWs response time during editing becomes so long
ShaunR replied to MikaelH's topic in LabVIEW General
I still use 2009 - by far the best version. Fast, stable and quick to compile. 2011 was the worst and 2012 not much better. If they had implemented a benevolent JSON primitive instead of the strict one we got, I would have upgraded to 2013. -
Steganography Version 1.0.0 is now released. Enjoy.
-
You should never do this. Anyone can sit between you and the server and decode all your traffic. The proper way is to add the certificate (public key) to a trusted list after manually verifying and checking it. What's the point of using HTTPS if you are going to ignore the security?
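A minimal sketch of that "trusted list" idea in C with OpenSSL: verify the server's certificate out-of-band once, pin its SHA-256 fingerprint, then compare on every connection. The function name and pinned value are illustrative; the OpenSSL calls are standard:

```c
#include <string.h>
#include <openssl/ssl.h>
#include <openssl/evp.h>
#include <openssl/x509.h>

/* Returns non-zero only if the peer's leaf certificate matches the
   fingerprint you verified and pinned manually. */
int peer_matches_pin(SSL *ssl, const unsigned char *pinned,
                     unsigned int pinned_len)
{
    X509 *cert = SSL_get_peer_certificate(ssl);  /* server's leaf certificate */
    if (!cert)
        return 0;

    unsigned char md[EVP_MAX_MD_SIZE];
    unsigned int md_len = 0;
    int ok = X509_digest(cert, EVP_sha256(), md, &md_len) &&
             md_len == pinned_len &&
             memcmp(md, pinned, md_len) == 0;

    X509_free(cert);  /* SSL_get_peer_certificate adds a reference */
    return ok;
}
```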
-
You can get some of the way like that, but there are a couple of bits per byte that have to be concatenated, so at some stage you have to convert to bits. The speed of the encoding for-loops with shifts was a surprise to me, though. Yup. It was purely for relative performance and will not be included in the release proper. For this sort of thing I would rather deal with relative performance (it removes differences between the systems that they are benchmarked on). The in-built Profile Performance and Memory tool is better for identifying which individual components contribute what, so I would use that if someone found a better solution.
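For reference, the shift-and-mask loop under discussion looks something like this in C (buffer names and the one-bit-per-pixel-byte layout are illustrative):

```c
#include <stddef.h>
#include <stdint.h>

/* Embed `msg` into the least-significant bits of successive pixel bytes,
   taking message bits MSB-first. Assumes npixels >= msg_len * 8. */
void embed_lsb(uint8_t *pixels, size_t npixels,
               const uint8_t *msg, size_t msg_len)
{
    size_t bit = 0;
    for (size_t i = 0; i < npixels && bit < msg_len * 8; i++, bit++) {
        uint8_t b = (msg[bit >> 3] >> (7 - (bit & 7))) & 1;  /* next message bit */
        pixels[i] = (uint8_t)((pixels[i] & ~1u) | b);        /* replace pixel LSB */
    }
}
```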
-
I don't do Discord. I don't even do ni.com. Feedback isn't really necessary. I only knocked it up because I went down a rabbit hole and wasn't impressed with the existing LabVIEW solutions. I thought I'd throw it in here to see if someone could improve it. My solution is optimised, but there may have been a better alternative solution, or maybe someone had a nice JPEG one (LSB doesn't survive JPEG compression). You might get a mention in the readme just for responding
-
Nothing? No improvements? No bugs? Not even the superfluous length parameter or the example images in the wrong location? I'll give it one more week, then release version 1.0.0.
-
LabVIEW 2025 installation on Ubuntu
ShaunR replied to Sam Dexter's topic in LabVIEW Community Edition
I only switched to Win10 3 years ago from Win7, and that was only because I wanted encrypted SMB to my NAS. I'll think about desktop Linux when they fix their application distribution methods. I dropped my Linux LabVIEW product support for a reason: my products broke every time someone else updated their product. -
This all sounds very awkward for a home automation GUI. Why would you compile software on the target? Surely all the devices are Wi-Fi with their own REST APIs, so you only need an aggregating web server with a pretty JavaScript front end. What am I missing?
