Leaderboard
Popular Content
Showing content with the highest reputation since 09/03/2023 in all areas
-
I think there were 182. Everything was video recorded and after editing will be posted on YouTube. Expect something in about six weeks.
4 points
-
(Disclaimer: I am not an NI insider, and I have no inside knowledge of the pending Emerson acquisition) I think we're all sort of in a holding pattern waiting to see how the Emerson acquisition plays out. Emerson's outward messaging seems very positive towards LabVIEW, which I find encouraging.
4 points
-
Nearly my entire career has been spent with employers in two industries: medical device manufacturing, and military/government communication manufacturing. I am guessing here, but I strongly suspect that this has shielded me from the effects of manufacturing, and its associated testing, being moved offshore and/or to the contract manufacturer du jour. Device and system testing where everything is traceable and out-of-box failures are intolerable means more investment in what would otherwise just be an expense area in industries where yield just needs to be "good enough". Also, meddev, mil, avionics, space, etc. industries are slow to adopt change, so if it was "done in LabVIEW" fifteen years prior, it probably still is.

The only other industry that immediately comes to mind in this category would be some specialized instrumentation manufacturer (perhaps serving, e.g., chemistry, biology, or oil and gas) where the clever scientific/entrepreneurial types who brought that tech to market are still in charge of things. Those folks perhaps want to see smart testing strategies and good instrumentation testing their instrumentation before it goes out the door.

That's the closest I can come to suggesting a strategy for your next move, Phillip. But I'll be darned if I can find much of anything that smells like that being offered up in the incessant LinkedIn emails I get. Heck, they can't even settle on what the term "test engineer" means.

Dave
2 points
-
I like remixicon.com. Everything there is free for personal or commercial use. Here's what they have for a search term of "door":
2 points
-
Linux only huh? No mention of fallocate? Why do people keep posting junk from ChatGPT? At this point I consider it spam.
2 points
-
Hi All,

I'm writing (what I hope is nice) modular code, trying to modularise functions and UIs etc. across the applications we're developing to keep things nice and flexible. But I'm struggling to keep individual modules "uncoupled" - or rather, not all interdependent on each other.

Example - I have a module for communicating with an oscilloscope to capture waveforms. I create a typedef in that module for passing out those waveforms and related information to another module, which sends them on to a storage module and a waveform processing module (where I do various things to the raw waveform) and then finally out to a waveform display module...

So my question becomes "who should own the waveform data typedef". The easiest thing to do is make sure the input side of any module's API accepts the output of the API producing the data... so if it sits in the acquisition module library at the "source" of the data, then assuming I don't keep translating to different data types, all of my other modules end up dependent back to there too - e.g. my waveform processing and display modules now depend on the acquisition module. But I might want to write a post-processing and display application that needs to know how to process waveforms, and I don't want to have to include the waveform acquisition module with that, as it won't be acquiring anything...

I'm sure there's lots of options, and this post is really to try and find some I haven't thought of yet! I know I could create the typedef in its own library, or even outside of any library; then the users of that typedef are not dependent on each other (I think that's basically a form of dependency inversion?). But then that gets hard to manage and leads to the idea of a "common data types" library, which can quickly grow to hold lots of other things and more types of coupling in a way. I could translate the typedef as it passes through a chain of modules, so that each module defines the way it expects to receive the data, and then a tree-type module hierarchy limits the coupling... but that always feels somewhat inefficient - and I want one waveform data type, not 4 depending on where I am in the application.

My situation is marginally complicated as well, since we're using PPLs and some of my data types are now classes, so they need to be inside a PPL somewhere and then used from there (and it's a pain we can't build a packed class without first wrapping it in a project library...).

So yes - how do you all manage your typedefs and classes that get used across module boundaries to minimise those dependency issues!? Thanks in advance, sorry for the slightly rambly post!
1 point
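For what it's worth, here is how the "typedef in its own library" option looks when transposed to a text language - a minimal Python sketch, with all module and field names hypothetical. The shared type lives in a leaf module that depends on nothing else, so acquisition, processing, and display can each depend on it without depending on each other:

```python
# waveform_types.py -- hypothetical leaf "common types" module.
# It owns the shared waveform definition and imports nothing from
# the rest of the application, so it creates no coupling between users.
from dataclasses import dataclass, field

@dataclass
class WaveformRecord:
    t0: float                                   # trigger timestamp (s)
    dt: float                                   # sample interval (s)
    samples: list[float] = field(default_factory=list)
    channel: str = ""                           # source channel name

# acquisition.py -> "from waveform_types import WaveformRecord"
# processing.py  -> "from waveform_types import WaveformRecord"
# display.py     -> "from waveform_types import WaveformRecord"
# A post-processing-only application ships waveform_types.py and
# processing.py and never touches acquisition.py.
```

The LabVIEW equivalent would be a small .lvlib (or its own PPL) holding just the typedef/class; the trade-off, as noted above, is resisting the temptation to let that library grow into a grab-bag.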
-
Just stumbled into this discussion a day or two late. Since I'm gainfully employed and developing in LabVIEW, but now 65+, I'm going to follow any further discussion. (Also, when was the last time I saw anyone make reference to HTBasic??) I do feel your level of discomfort, though, Phillip. A lot of good points made here about why LabVIEW may be slipping in its position of being a premier development environment for automated testing, data acquisition, etc. My text-based development experience (RMB, MS-C, MS-C++... and let us never speak again of TekTMS) is largely a distant memory after using LV for so many years (apart from begrudgingly still having to do a little VBA coding for some business-mandated Excel).

My most recent relevant story comes from watching a 20-something coworker developing with Python for a data-gathering task in an R&D lab. He seemed to be struggling mightily with a third-party graphing library, and I noted (nicely) that in LV, setting up that charting would be relatively trivial. Bright guy that he is, he installed LV (enterprise license here at Abbott), and while I feared he'd be all lost/sullen/blame-the-tool, he remarkably picked it up far faster than I recall myself doing so. I diligently provided feedback and little bits of LV code to help him get started. His first project had all the hallmarks of a first-timer, but he seemed happy with the process.

The really bizarro part of this story is the postscript; he left Abbott and is working with his father on a financial modeling project. Astoundingly, he's using LabVIEW, at least for the run-up. He says it allows his father to most easily visualize the arcane calculations and add extra inputs and outputs with a minimum of effort.

Like others, I suspect that regardless of the Emerson future plans, there will be an ongoing need for LV developers even if only to maintain projects. I still can't figure out how Python developers get good integration with the various hardware interfaces needed, nor how they create good technical graphical displays, and I very much expect I'm never going to figure that out.

Dave
1 point
-
Hi

The original question, "Can I ride the LV/TS train to retirement?", needs to be divided into at least two cases, even if you tie them together as here. Not all are experts in both LV -and- TS. You might be an LV expert, yet know nothing about TS. And you could live fine with that, at least in the past. The other way round is also true: you might be a TS expert, yet know nothing about LV. I guess that is much less likely, though. And as TS supports many adapters, you could even get away with only implementing the actual do-something code snippets in HTBasic. Probably highly relevant 20 years ago.

So let us 'forget' the TS experts; they will survive for the foreseeable future. The LV experts are a much more endangered species. Just being able to write -good- LV code will not save you. What will save you is having application knowledge: RF and radar knowledge, audio knowledge, electric-car-related knowledge, rocket knowledge. You probably get the picture. Applications where simply calling into Python packages has not taken over yet, for one reason or the other. If you have that knowledge, you might be allowed to implement it in LabVIEW.

What else could save the LV expert? Well, it could be that the initial glory of Python has gotten some stains. Using packages of unknown quality is an increasing worry.

I hope it helps in directing where to head towards. As a little bonus, try looking up well-deserved Knight of NI Rolfk's new icon. For ages it used to be a now-ancient photo of himself. Now it is something else 🤔

Regards
1 point
-
Well. If I were your manager then I'd ask you to find a contractor that can do it in a week and task you with managing them.

I feel this is a different point. I am a Systems Engineer, and a programming language is a tool I use to engineer a system. That is different from what I was saying, that languages are pretty much all the same. The latter is a poke at the industry lacking diversity in its approach to programming, and that I (and others) only see a difference in syntax and not really in execution. You might argue that Ladder programming is a different beast to C/C++ (which it is), but both of those are 50 years old. LabVIEW has more in common with Ladder programming than any of the more modern languages, which is why many people struggle when they move from a text-based language.

There are 32 types of hammers, but they are all still hammers. That's how I see programming languages.
1 point
-
The point I'm trying to convey here is that people often become closely associated with their roles in their respective fields, not necessarily as a programmer in the language that dominates in that field. For instance, if I've been a front-end web developer my whole career, how does that experience help me when trying to land a job as a data scientist? Saying that both positions use text-based programming languages doesn't cut it.

If you ask me, the biggest problem with LabVIEW isn't LabVIEW itself but rather the fact that the field it excels in, such as electronics testing and measurement, isn't as thriving as many other software-centric fields that exist today. Our company has real-time test systems built with NI/LabVIEW. And we still develop new ones! Such systems aren't very common around here, with or without LabVIEW, compared to the plethora of companies making apps and other software solutions - which wasn't the case 10 or 20 years ago. I am more worried about our company scrapping electronics testing altogether than moving away from LabVIEW. 😅
1 point
-
They pretty much are, at the end of the day. When you program in Windows, you are programming the OS (Win32 API or .NET). It doesn't really matter what language you use, but some are better than others for certain things. It's similar with Linux, which has its own ecosystem based on the distributed packages. Where LabVIEW differs is in the drivers for hardware, and that is where the added value comes from. The only other platform that has a similar hardware ecosystem is probably something like Arduino.
1 point
-
When I said at least one, I meant whichever one I feel would be more beneficial to my own career ambitions, be it Python, Rust, etc. I didn't (mean to) imply that they were generic and interchangeable (I know better than that). For me, though, my introduction to programming and my experience have primarily been in LabVIEW. Changing my mental process to go from a graphical to a text-based language feels like it would be more difficult for me than switching between text-based languages. As I said, though, I realize that there are differences ranging from simple syntax to the more drastic.
1 point
-
I always find this kind of question difficult because LabVIEW is just my preferred language. If an employer said we are switching to something else, I'd just shrug my shoulders, load up the new IDE, and ask for the project budget number to book to.

I think the issue at present isn't really whether there is demand for the next 10 years. It's what Emerson will do at the end of this year. You could find that LabVIEW and TestStand are discontinued this year, let alone in 10.
1 point
-
Can somebody please try and help me program this process in the simplest way you can (kinda like with event structures)? I need it before the 18th of September.
Avance 3 (1).vi
1 point
-
I use the "Advanced PNG Export" feature of https://pictogrammers.com/library/mdi/. This allows some customisation of icon (size, colour, transparency, border) before downloading.1 point
-
Icons are the resource sink of many development teams. 😀 Everybody has their own ideas about what a good icon should be - their preferred style, color, flavor, emotional ring, and what else. And LabVIEW has actually had a fairly extensive icon library in the icon editor for many moons. You can't find your preferred icon in there? Of course not; even if it had a few tens of thousands you would be bound not to find it, because:
1) Nothing in there suits your current mood
2) You can't find it among the whole forest of trees
1 point
-
My usual sources:
- https://fonts.google.com/icons
- https://pictogrammers.com/library/mdi/
1 point
-
Technically related question: Insert bytes into middle of a file (in windows filesystem) without reading entire file (using File Allocation Table)? (Or closer, but not that informative). The takeaway is that it is theoretically possible, but so low-level and hacky that it's easy to mess something up and render the whole system inoperable. If this doesn't stop you, then you may try contacting Joakim Schicht, as he has made a bunch of NTFS tools, incl. PowerMft for low-level modifications, and maybe he will give you some tips about how to proceed (or give it up and switch to traditional ways/workarounds).
1 point
-
I finally got the 32-bit shared object built on Linux. After 3 hours of fiddling I simply re-installed gcc-multilib and it just worked. 😡 I have pushed the changes to GitHub. After testing, this will be v3.0.0, because I expect the static linking to be a breaking change in some instances. The next step is to compile a large list of test cases to confirm that the new operators are behaving as expected under as many corner cases as we can think of.
1 point
-
A 100 kB view will not help you truncate from the front. You can use it to copy chunks like mcduff suggested, but the issue with what the OP is asking is getting the OS to recognise a different start of the file. Truncating from the end is easy (just tell the file system the length has changed). From the front it is not, unless you have specific file-system operations. On Windows you would have to use sparse files to achieve the same as fallocate.
1 point
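For reference, this is roughly what the Linux-only route looks like - a minimal ctypes sketch of fallocate(2) with FALLOC_FL_COLLAPSE_RANGE, which deallocates the leading extents so the file genuinely starts later. It only works on filesystems that support it (e.g. ext4, XFS), the offset/length must be multiples of the filesystem block size, and the helper name is mine:

```python
import ctypes
import os

FALLOC_FL_COLLAPSE_RANGE = 0x08  # from <linux/falloc.h>

libc = ctypes.CDLL("libc.so.6", use_errno=True)
libc.fallocate.argtypes = (ctypes.c_int, ctypes.c_int,
                           ctypes.c_longlong, ctypes.c_longlong)

def truncate_front(path, nbytes):
    """Remove the first nbytes of the file in place (no data copy).

    nbytes must be a multiple of the filesystem block size, and the
    filesystem must support FALLOC_FL_COLLAPSE_RANGE (ext4, XFS).
    """
    fd = os.open(path, os.O_RDWR)
    try:
        if libc.fallocate(fd, FALLOC_FL_COLLAPSE_RANGE, 0, nbytes) != 0:
            err = ctypes.get_errno()
            raise OSError(err, os.strerror(err))
    finally:
        os.close(fd)
```

On Windows there is no direct equivalent; marking the file sparse and zeroing the leading range releases the disk space, but the logical start of the file does not move.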
-
Yes. Looks like your fix was added to release v2.3.5 of muparser. I will rebuild the dlls and prepare another release for testing.
1 point
-
It is actually much faster on my machine. Here are a few results (Windows 11, LabVIEW 2020 SP1, 32-bit):
@Łukasz fast solution: ~30 µs
@cordm Case 1 (really slow): ~403 µs
Case 2 (good performance and readability): ~54 µs -- output is wrong, see below
Case 3 (): ~235 µs
Case 4 (original solution): ~30 µs
Case 5 (LV200000_BLASLAPACK.dll): ~14 µs
Case 6 (LVBLAS.dll:BLASCopyVectorH): ~16 µs

The Case 2 code actually truncates the last value because the length of the source array becomes odd. Here are two possible fixes; the second one is slightly faster for me:
1) Append the final element: ~60 µs (slightly slower than before).
2) Rotate the string before conversion: ~42 µs.
1 point
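For readers following along in a text language, the pitfall is the usual one when a buffer is reinterpreted at twice the element width: an odd-length source silently loses its final element. A rough numpy analogue (assuming the task is decimation by 2 on a little-endian machine; the names and setup are mine) of the problem and of fix 1:

```python
import numpy as np

src = np.arange(7, dtype=np.uint16)             # odd length: 7 elements
even_len = len(src) // 2 * 2
pairs = src[:even_len].view(np.uint32)          # reinterpret as pairs; src[6] is dropped
decimated = (pairs & 0xFFFF).astype(np.uint16)  # keep every other element (little-endian)

if len(src) % 2:                                # fix 1: append the final element
    decimated = np.append(decimated, src[-1])

print(decimated)                                # [0 2 4 6]
```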
-
Hello, LAVA. My team at SpaceX is looking for LabVIEW developers. We have two job reqs open, one for an entry-level developer and one for a senior.

Ground Software is the mission control software for all Falcon and Dragon flights. Every screen you see in the image below is running LabVIEW. Our G code takes signals off of the vehicles and correlates them for displays across all our mission control centers and remote viewers at our customer sites and NASA. It's the software used by flight controllers to issue commands to the vehicles. This is the software that flies the most profitable rockets in the world, and we're going to be flying a lot next year and in the years to come.

If you'd like to get involved with a massively distributed application with some serious network requirements, please apply. You can help us build a global communications platform, support science research, and be one of the stairsteps to Mars.

Entry level: https://boards.greenhouse.io/spacex/jobs/6436532002?gh_jid=6436532002
Senior level: https://boards.greenhouse.io/spacex/jobs/6488107002?gh_jid=6488107002
1 point
-
Indeed, if you never connect the cable, it will crash. I have now moved away from FTDI. I found a much better solution: the Microchip MCP2221A. A version for SPI is also available. An amazing IC with a lot more capability for a fraction of the price. We developed our own PCB to interface it, and it still came out cheaper than the cable from FTDI.
1 point
-
Hello Benoit. Thank you for your library! But I have made a few modifications to be able to use it:
I2C Device Read.vi:
- cluster "options" changed ("Acknowledge bit" renamed to "Not Acknowledge bit", "Not Acknowledge bit" and "Reserved bit" inverted)
- conversion from and to string for "Data" deleted => able to also receive bytes with a value of 0
MPSSE I2C.zip
1 point
-
Here are the VIs we use for Windows authentication and domain groups.

Validate Username and Password.vi takes the username and password and returns TRUE if it validates against the domain controller.
User in Group.vi takes a username (or current user if left blank) and a Domain Group name and returns TRUE if the user is a member.
User Groups.vi takes a username (or current user if left blank) and returns an array of Domain Groups to which the user belongs.

We only use these on our internal network, so I can't guarantee they work in every situation. Still, they may give you a starting point if you need something similar.

Pat

P.S. LabVIEW 2010sp1

User Groups.vi
User in Group.vi
Validate Username and Password.vi
1 point
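If anyone needs the same check outside LabVIEW, the underlying Win32 call is LogonUser. Here is a minimal pywin32 sketch mirroring Validate Username and Password.vi (untested against a real domain controller, and the function name is mine):

```python
# Requires pywin32 (pip install pywin32); Windows only.
import pywintypes
import win32security

def validate_username_and_password(username, password, domain):
    """Return True if the credentials validate against the domain."""
    try:
        token = win32security.LogonUser(
            username, domain, password,
            win32security.LOGON32_LOGON_NETWORK,      # no interactive session
            win32security.LOGON32_PROVIDER_DEFAULT)
        token.Close()
        return True
    except pywintypes.error:
        return False
```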
-
Lately I've been reading up on dataflow languages in general and LabVIEW's execution system, and discovered I have been thinking about things all wrong. Chunks of code without data dependencies can execute in parallel, so over time I have come to associate that parallelism with multiple OS threads being used. It turns out that's not the case. Multiple threads may be used, but are not necessarily used.

So that leads to my question. If they're not threads, what do we call each arbitrary unit of code that is free from external dataflow dependencies? We often refer to parallel "loops," but parallelism isn't always contained in a loop. LabVIEW documentation refers to groups of sequential operations as "clumps." Those are artifacts of the compiler, not the dev environment, so I'm not sure that's a good name either.

"Branch" is a likely candidate. We talk about branching wires all the time, but we don't typically talk about a branch as a unit of execution. Plus, "branch" tends to refer to what you see on the BD, not an arbitrary chunk of dataflow. In the diagram it is obvious SubVI1 is a branch, SubVI2 is a branch, and SubVI3 + SubVI4 is a branch. (Assuming none of the subVIs obtain external data from references.) Notice that SubVI1 + SubVI2 + the Add primitive is also a branch, but the term "branch" doesn't seem (to me) to naturally apply to that group of code.

Thoughts? Ideas?
1 point
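To make the question concrete in a text language, here is a rough Python rendering of that diagram (all names are hypothetical, and the executor's threads merely stand in for LabVIEW's scheduler, which may or may not map these units to OS threads). The thing being named is each submitted unit, which has no data dependency on the others until the join:

```python
from concurrent.futures import ThreadPoolExecutor

def sub_vi1(): return 1
def sub_vi2(): return 2
def sub_vi3(): return 3
def sub_vi4(x): return x * 10

with ThreadPoolExecutor() as pool:
    b1 = pool.submit(sub_vi1)                     # branch? clump?
    b2 = pool.submit(sub_vi2)                     # independent of b1
    b3 = pool.submit(lambda: sub_vi4(sub_vi3()))  # a sequential chain is one unit
    total = b1.result() + b2.result()             # the Add primitive joins b1 and b2
```

In the wider async world the closest established term is probably "task": a schedulable unit of work that is lighter than, and decoupled from, any particular OS thread.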