
Leaderboard

Popular Content

Showing content with the highest reputation since 02/24/2023 in all areas

  1. They introduced a token for smooth lines: SmoothLineDrawing=False (see the labview.ini sketch below).
    4 points
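    A minimal sketch of where that token lives (in labview.ini next to the LabVIEW executable, assuming a default installation; the value is exactly as quoted above):

        ; labview.ini
        ; turns the new smooth (anti-aliased) line drawing off
        SmoothLineDrawing=False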
  2. And the saga continues... https://www.reuters.com/markets/deals/national-instruments-picks-fortive-keysight-challengers-emersons-bid-sources-2023-03-03/
    3 points
  3. Chasing the golden geese, it seems. Interesting stuff throughout, but the real juice is at about 55 mins in, in the "Distribution of Resources" section. (Spoiler: it's a small club and we ain't in it.) The mention of porting Unicode made me laugh heartily, though. My final takeaway was too many C# coders left over from NXG.
    2 points
  4. Just mention GOOP 3 times and it will summon MikaelH - who can tell you everything about it.
    2 points
  5. Break down your complex and complicated data types into a complex and complicated architecture. FTFY
    2 points
  6. I posted an idea on the idea exchange on viewing a VIM's instance VI, but it looks like that ability was added to LabVIEW at some point (by AQ?). The labview.ini token allowOpenFPOfInstanceVIs changes the behavior of the Convert Instance VI to Standard VI option to instead open the instance VI (see the sketch below). I'd been using the convert-to-standard / Ctrl+Z method, but it's very easy to forget the Ctrl+Z bit. This token is much more useful.
    2 points
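    For reference, a sketch of that token in labview.ini (assuming the usual Boolean token syntax):

        ; labview.ini
        ; makes "Convert Instance VI to Standard VI" open the instance VI instead
        allowOpenFPOfInstanceVIs=True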
  7. Re the main issue, this seems to be my misunderstanding about how savepoints work (as distinct from BEGIN and ROLLBACK). Here is a relevant discussion. From that discussion, I see that instead of "ROLLBACK TO <Savepoint>" I should do "ROLLBACK TO <Savepoint>; RELEASE <Savepoint>;" (see the sketch below). Issue 22
    2 points
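    A sketch of that pattern in SQLite SQL (the table and UPDATE statement are made up for illustration):

        SAVEPOINT sp1;
        UPDATE log SET value = 42 WHERE id = 1;   -- hypothetical work
        -- on error:
        ROLLBACK TO sp1;  -- undoes the work but leaves the savepoint (and transaction) open
        RELEASE sp1;      -- removes the savepoint; commits if it was the outermost one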
  8. Thanks for sharing! I guess I'm just trying to stay positive. Every time the future of LabVIEW is brought up here, things go so gloomy so fast! And I'm trying to wrap my head around what is true and what is nothing but mindset. It's ironic though: in recent months there has been so much hype around AI being used for programming, especially Google's AlphaCode ranking in the top 54% in competitive programming (https://www.deepmind.com/blog/competitive-programming-with-alphacode). Writing code from natural-text input. So we're heading towards a future where classic "coding" could soon be obsolete, leaving the mundane tasks to the AI. And still, there already is a tool that could help you with that, a tool that has been around for 30 years: a graphical programming tool. So how, how, could LabVIEW not fit in this future?
    2 points
  9. I don't think anyone is saying that, so much, with respect to NI as a whole. But the effort and investment in NXG made LabVIEW (Classic?) the withered limb of NI. Now they have lots of C# devs who can't do jack with LabVIEW. From this seminar, it looks like this is a solution (lots of C# devs) looking for a problem (cloudy stuff), and they see LabVIEW as a stagnant technology instead of the axiomatic driver behind their hardware that it actually is. Don't get me wrong: they can very easily move into this space and do all the cloudy stuff. But their view of how this will fit together is flawed (IMO). They are viewing it purely as an IT system rolling out images (think AWS Compute/IaaS) when, in fact, those images will be highly specialised LabVIEW installations for very specific and usually custom hardware configurations. They lost Test and Measurement, arguably the mainstay for LabVIEW, to Python a while ago.
    2 points
  10. OpenG made an amazing set of Array tools many years ago. They weren't perfect, but they had many uses and I've recommended them many times. Improvements to LabVIEW meant some of the array functions weren't well optimized. Years later I tried making a more modern set of Array tools using VIMs, giving up on polymorphics. I posted this as a package over on VIPM.IO here. Since then I've thought about a few places where the performance of my stuff could be better. Mads and I have had some discussions back and forth in this thread, but I wanted to make a separate post where others could chime in with their performance suggestions too.

    At the moment my Array VIMs are in LabVIEW 2018. However, due to the potential Maps and Sets benefit, I want to go to at least 2019. In 2020 LabVIEW added the Sorted Array subpalette with a pretty decent binary search, so for now I think LabVIEW 2020 will be what I target for the next Array VIMs package release. Any thoughts on this? I know there is a decent amount of bias in this, but Jim posted the versions of LabVIEW used on VIPM.IO, and 2020 and newer covered over 75% of users.

    So attached is a zip with a set of array testing VIs. For instance, open the Remove Dups Speed VI and run it. It will run through the 6 different methods of removing duplicates from a 1D array of strings (two common strategies are sketched below). It will then graph the different methods and do a check that they all return the same data. If you want to add your own method, edit the (non-typed) enum, then duplicate a case and replace the function with your own code. It randomizes the order of the array methods used. You can also mess with the data being used: at the moment it generates 1000 unique elements, then duplicates them 5 times. If you want to enter data with none to remove, or all of a single type, or whatever, then you can change the data used in the test.

    At the moment there are 8 different array speed tests to compare: things like Delete 1D, Delete 2D, Filter 1D, Filter 1D with a Scalar, Reverse 2D, Search 1D, and the Remove Duplicates already mentioned. There might not be a single best method for a specific function; there are times one method will work better for some set of data, and then a decision needs to be made on what should go in the VIM.

    My main reason for making this thread is that I hope some people will know of a better way to do something: come up with a more optimized way to do any of the OpenG array functions. After some discussion and contribution, I plan on updating the Array VIMs package and crediting those that helped. (crosspost)

    Edit: I just realized someone is probably yelling "Use Git" at me. I hadn't thought of that, sorry; it just felt organic to continue the conversation here because it is where the topic started. 872707096_HooovahhArrayPerformanceTest.zip
    2 points
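    For reference, two of the common remove-duplicates strategies, sketched in C++ rather than G (an illustrative analogue, not the VIM code itself):

        #include <algorithm>
        #include <string>
        #include <unordered_set>
        #include <vector>

        // Sort-based: O(n log n); does not preserve the original element order.
        std::vector<std::string> dedup_sorted(std::vector<std::string> a) {
            std::sort(a.begin(), a.end());
            a.erase(std::unique(a.begin(), a.end()), a.end());
            return a;
        }

        // Hash-based: O(n) expected; preserves first-seen order (the kind of win
        // that Sets, added in LabVIEW 2019, make possible).
        std::vector<std::string> dedup_hashed(const std::vector<std::string>& a) {
            std::unordered_set<std::string> seen;
            std::vector<std::string> out;
            for (const auto& s : a)
                if (seen.insert(s).second) out.push_back(s);
            return out;
        }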
  11. This appears to be a known issue in LabVIEW 2021 SP1. https://www.ni.com/de-de/support/documentation/bugs/22/labview-2021-sp1-known-issues.html
    2 points
  12. Last week, Matlab R2023a was released. The changelog document for this (6-month cadence) release is 600 pages long!!! How come NI releases its main development software with a 1-page changelog and makes it almost unusable, with more issues than before??? How many NI engineers does it take to change an icon?
    1 point
  13. You can find the description of all the VIs of the toolkit here: https://www.ni.com/docs/fr-FR/bundle/labview-advanced-signal-processing-toolkit-api-ref/page/lvasptconcepts/aspt_default_page.html
    1 point
  14. Ok, this is not gonna work like this. The sample code you show uses GObject to start its own loop and do event handling all through it. GObject is basically a GLib task-handling system. It is in fact its own execution system and is for instance used as the base of GTK and, by extension, GNOME; but these are built on top of GObject. LabVIEW has its own execution system, and marrying the two together is a major exercise in coding. In fact it is almost never done, since it is so difficult to do right. You will need to try to find the lower-level interface of this aravis library. It will require you to call lots of functions with the arv_ prefix and similar, but you must avoid anything that starts with g_. Basically you would need to start writing something like IMAQdx, with many of its functions, but instead of calling into IMAQdx you call into the aravis library. It's doable, but not for the faint of heart. Trying to interface to image acquisition hardware and libraries is very difficult. Always! And there is a reason these libraries cost money and there are very few freebies here. The aravis library itself seems to be free, but its LGPL licensing can be problematic for anyone wanting to use it in a commercial program. And while it states it is LGPL, there are actually files in there that state they are GPL, so the licensing is not completely clear-cut. Incorporating GPL software in anything other than GPL software is basically impossible.
    1 point
  15. I'm not sure you did a service here. Trying to do callback functions through Windows messages is both not straightforward and in fact the master class of interfacing to a shared library. And considering that his shared libraries are .so files, he also obviously is not on Windows but on Linux.
    1 point
  16. The best approach would be to take that code and make your own shared library from it. Export functions to open the device, grab images, and close the device (a sketch of that shape follows below). Basically you do not want an executable with a main function; instead you want a shared library that separates those individual functions out. Why not call the underlying library directly through the Call Library Node, you may ask? Because you have a callback function in there! It may be possible otherwise, but that callback really is a no-go for direct LabVIEW interfacing.
    1 point
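    A sketch of that shape in C++ (the vendor_* calls are hypothetical stand-ins for the real aravis functions; the point is the exported, callback-free open/grab/close surface, not the API names):

        // wrapper.cpp: build with  g++ -shared -fPIC wrapper.cpp -o libcamwrap.so
        #include <cstring>

        // Hypothetical vendor API; stand-ins for the calls you'd actually import.
        extern "C" void* vendor_open(const char* name);
        extern "C" int   vendor_grab(void* dev, const void** frame, int* frame_len);
        extern "C" void  vendor_close(void* dev);

        extern "C" {

        static void* g_dev = nullptr;  // opaque handle kept on this side between calls

        int cam_open(const char* name) {
            g_dev = vendor_open(name);
            return g_dev ? 0 : -1;
        }

        // LabVIEW allocates 'buf' (len bytes) and passes it in; we only touch it
        // while inside this call and never keep the pointer afterwards.
        int cam_grab(unsigned char* buf, int len) {
            if (!g_dev) return -1;
            const void* frame; int frame_len;
            if (vendor_grab(g_dev, &frame, &frame_len) != 0) return -2;  // blocking grab
            if (frame_len > len) return -3;
            std::memcpy(buf, frame, (size_t)frame_len);
            return frame_len;  // bytes written
        }

        void cam_close(void) {
            if (g_dev) { vendor_close(g_dev); g_dev = nullptr; }
        }

        } // extern "C"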
  17. Not like this. Because that is the goal: break down your complex and complicated data types into simple and uncomplicated ones. For configuration data you could maintain the path to the storage location and load the data as needed.
    1 point
  18. As an aside, you should look into the UPSERT clause, which allows doing INSERT or UPDATE in a single SQL statement (sketch below). Also, note that you don't need savepoints around a single statement (every standalone statement is its own transaction, and either succeeds or rolls back automatically).
    1 point
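    A sketch of the UPSERT (SQLite 3.24+; the table and columns are made up for illustration, and the conflict column needs a UNIQUE or PRIMARY KEY constraint):

        INSERT INTO settings (key, value) VALUES ('rate', 9600)
        ON CONFLICT (key) DO UPDATE SET value = excluded.value;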
  19. This is explained in the SQLite help pages: https://www.sqlite.org/lang_savepoint.html#savepoints
    1 point
  20. Now there is no time for experiments. When I update a record, in all projects, I use a template with a restore point. SQLite is quite paranoid about data integrity. But I am using an old version of the pushit SQLite Library, v1.10.0.85, although I think this template will work on newer versions too.
    1 point
  21. Sounds like an uncommitted transaction. Make sure you have committed all transactions before closing the file. Uncommitted transactions are lost.
    1 point
  22. https://finance.yahoo.com/news/ni-acquires-set-gmbh-accelerate-140000459.html
    1 point
  23. Here's the starting point: Wait (ms) with error pass-through. From what I remember at CLA Summits, this was unofficial up to 2016 (it was called a "macro" at that point in time). But the cat was waaaaaaaaaaay out of the bag, so NI spent time making it a proper feature, and malleable VIs started in 2017. The Type Specialization Structure was not quite ready for prime time for another year (2018).
    1 point
  24. I do love how VIMs came to be. I'm having a real hard time finding it, but there was an idea on the Idea Exchange that there should be a function that can delay any data type, similar to the OpenG Wait which takes an error in and passes it out. Jeff K. posted on the thread saying something like "Oh yeah, that is a thing, you just need to use a VIM; here is an example which uses XNodes." It blew my mind. Then in the next release of LabVIEW for the Mac, Jeff K. sneaked in a new VIM on the palette, which some higher-ups in R&D didn't know about, and which had the Type Specialization Structure in it, which was also unreleased. I downloaded that version just to get the VIM and the structure. I get the feeling the reason VIMs seemingly came out of nowhere is that Jeff had been pushing for them for years, and when they were mostly stable he just put them out there to get the public's response. When everyone saw the potential that he also saw, R&D put effort into getting it out there. This is just my speculation from the outside.
    1 point
  25. English *is* my first language and I'm not as eloquent as you are. There is no real argument here, though. I still use LabVIEW 2009 for development because there is little of significance that the later versions offered. It's also robust, stable and fast, which cannot always be said for some of the later versions (looking at you, 2011/2012). Some features that actually got us excited weren't even on the roadmap (VIMs, anyone?). NI have been so far behind the curve for features that we want that we have all created our own solutions, so if one of them actually gets implemented, it's a moot feature. TLS/SSL, for example, was only released in LV2020, but I (and Rolf) had created solutions a decade before that. The one thing we have been yelling at NI about for about 15 years is Unicode, which we cannot really make a native solution for. This is why I laughed when it was mentioned in this talk. I moved to HTML UIs and relegated LabVIEW to a back-end service through WebSockets, which solved the problem, but it's a sledgehammer to crack a nut.
    1 point
  26. There is this interesting blog post on LinkedIn: https://in.linkedin.com/posts/jimkring_upgrading-to-new-labview-versions-is-for-activity-6972085040700686336-5gGw?trk=public_profile_like_view Correction: the person in question is CTO, not CEO. Regards
    1 point
  27. You will have to convert the OpenCV Mat to a LV picture. The Mat object has a ToBitmap method, which can generate a LV-compatible bitmap. I attached a VI that uses the EmguCV .NET wrapper.
    1 point
  28. I've really no idea what this means. If you are just looking for a volume of any code, then VIPM has hundreds of libraries (with source) that you can train on. You will have tens of thousands of VIs with source to point your algorithm at. Apart from that, you will have to be more specific. Artificial Insemination is coming for us, boys and girls.
    1 point
  29. You'll need to define exactly what "experimental setups" means; I've no idea.
    1 point
  30. Hi. NI is messing with the LabVIEW compiler; a memory leak would just be the latest instance. They started by eliminating the hybrid compiler (introduced in LabVIEW 2010 SP1) in LabVIEW 2019 to make development simpler, but it appears to just progress in the wrong direction with every new LV version. They also started messing with the icons when beautifying project libraries in LabVIEW 2021, and it seems they haven't solved that yet. See the NI Forum thread "NI Library Icon problems in classes". Is there a development pattern here? Regards
    1 point
  31. SubVIs that are called as a function, and don't have the terminals change value after entering the VI, should have the terminals on the root of the diagram, not in sub diagram structures. This is because the compiler can't know if these terminals changed value from the last time they were used, and so it will read them again. If it is on the root of the diagram it reads it when it enters the VI and never needs to read it again. Same with indicators. These should be on the root of the diagram and I think the CLD takes off points if it isn't. https://forums.ni.com/t5/LabVIEW/Community-Nugget-Wired-Terminals-in-Subdiagrams-Clearing-up-the/td-p/2093252 But it is a very minor thing, I just mentioned it as something I'd change, but not something I would expect to affect memory. I worked at Magna Electronics, Magna E-Car, and I think Magna Powertrain was in there somewhere as divisions changed and were absorbed. Making validation and verification test systems for various automotive components like running boards, inverters, chargers, power control modules, and cameras. Good times until it wasn't. I knew this was related because it gave a loading warning that VIs were loaded from a new path, and the old path had Magna in it.
    1 point
  32. Thank you codcoder, I really appreciate your response. Cheers,
    1 point
  33. Vistek provides a Linux SDK; you could go the hard way and wrap it (I have no experience with this particular SDK, though I've managed to wrap others from other camera vendors over the years)... In case v4l talks to it, maybe adapting this... Good luck...
    1 point
  34. Are you asking about taking the exam at a physical location or online? I've taken both the CLD and CLD-R online and I can recommend it; it worked fairly well. And the CLAD is still just multiple-choice questions, right? BUT regarding the online exam: NI is apparently moving to a new provider, so you can't take it right now: https://forums.ni.com/t5/Certification/NI-Certification-Transition-to-a-New-Online-Exam-Delivery-System/td-p/4278030
    1 point
  35. You'll need OpenG Zip 4.1.0, which includes support for Linux RT (afaik this isn't available through VIPM, only from the forum). Once installed, use MAX to install a 'legacy' custom software image on your sbRIO, and include the OpenG ZIP Tools package. Alternatively, the required library file, found in C:\Program Files (x86)\National Instruments\RT Images\OpenG ZIP Tools\4.1.0, can be copied to your sbRIO. You can also configure a build spec in the LabVIEW project to auto-deploy that library, which is mentioned in the linked thread.
    1 point
  36. How to overcome this error? By doing what the error messages says.
    1 point
  37. What is it they say about the last 10% of a project being 90% of the work? I stand by my original statement: "What a crappy, poorly thought-out protocol". So, like I said earlier, half the state is in the client and half in the server. I'll add now: "they have to agree". So what if they don't agree? Well, here we go.

    Client and server maintain a list of packets (Packet Identifiers). Ok, can do that. But what if I don't get the PUBREC? I can send a duplicate, right? Sweet. Oh wait. Ok, so we save them up for connect only. Job done, right? Ummm. So we can only send duplicates at connect (and only if it's a persistent session), and there can only be so many unacknowledged ACKs, otherwise we get spanked. So what happens if the number of unacknowledged packets reaches the limit? Really? Go silent or hope for the best, depending on the whim of the developer? Hmmm. (A sketch of the state that has to be tracked follows below.)

    So what does this look like in the real world? EMQX: 2 hours before getting Receive Maximum exceeded. Hive: 20 minutes before getting Receive Maximum exceeded. Mosquitto: 1 hour before getting Receive Maximum exceeded. That's no good. Let's send duplicates if we have some acks outstanding and are getting near the limit (that's what we should be doing, IMHO). EMQX: 40 minutes before getting disconnected (Packet Identifier in use); can't blame them, crappy protocol. Hive: 24 hrs and still going, but getting duplicates in the subscriber (that's not right for QoS 2, but I could live with it). Mosquitto: 24 hrs and still going, no problems. We have a winner!

    This is a doozy too: fix the state (normally that would be with duplicates), restart (without a persistent session; poor subscribers), or blame someone else. Am I missing something?
    1 point
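    For the curious, a minimal sketch in C++ of the client-side bookkeeping described above (the names are mine, not from any MQTT library): every in-flight Packet Identifier must be tracked through the four-way QoS 2 handshake (PUBLISH -> PUBREC -> PUBREL -> PUBCOMP) while staying under the broker's Receive Maximum.

        #include <cstdint>
        #include <unordered_map>

        enum class QoS2State { AwaitPubrec, AwaitPubcomp };

        struct QoS2Tracker {
            std::unordered_map<uint16_t, QoS2State> inflight;
            uint16_t receive_maximum;  // advertised by the broker in CONNACK

            bool can_publish() const { return inflight.size() < receive_maximum; }

            void on_publish_sent(uint16_t id) { inflight[id] = QoS2State::AwaitPubrec; }
            void on_pubrec(uint16_t id)       { inflight[id] = QoS2State::AwaitPubcomp; }  // now send PUBREL
            void on_pubcomp(uint16_t id)      { inflight.erase(id); }  // id free for reuse
        };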
  38. With a slightly snarky tone, I want to ask if this is part of the 100 year business plan NI has. On a personal level I just hope LabVIEW can stay relevant until retirement. I do still have a perpetual license to 2022 Q3, which supports Windows 11. So even if NI goes away I'll be able to be in my language of choice until 11 is no longer supported. LabVIEW has changed the way I think about programming in such a way that I think it is hard to go to other languages. My brain thinks in parallel paths, and data dependence, not lines of code and single instructions. Whenever I develop in C++ I can't help but feel how linear it is. I'm sure higher level languages are better, but at the same time I don't really want to change. As long as I can work at a place that needs test applications, and doesn't care how they are developed, I'll be happy pushing LabVIEW. The fog of the future is hard to see though. The next year or two looks very uncertain in my career. But looking at the past, working in LabVIEW has felt like winning the lottery. Thinking about this helps me stay positive.
    1 point
  39. Hi everyone, I was forced to insert Google (or equivalent) maps in a project and, for some time, I used Mohammad Garousi's version (the most complete version I found freely available on the web). It uses the GMap.NET DLLs. After some time I noticed a recurring random issue at the DLL level, after which a red cross appears in the .NET container without any chance to correct or avoid the error.

    So I decided to create a new map container that might solve the issue. I decided to use a picture instead of the .NET container. I was able to add new functions too, like creating a route, customizing its behavior (such as changing the route color to spectrum), or adding another image that moves with map position or zoom changes, and so on. The result is attached below. This is a full working example that can be studied or modified at will, but it is still a demo and not a library or final release.

    The user can move the map by dragging it inside the picture, can zoom with the slider, and so on. Coordinates and cursor positions are updated live any time the cursor is in the picture area. You can draw a route by enabling the draw button; remember to set it off when drawing is over. You can add a third-party image (the default one is a drone image) at the picture's 0,0 position, and you can easily customize the default position in the block diagram code. Any time you set the image button to on, the image will be placed at the 0,0 position, recalculating its latitude-longitude coordinates. You can dynamically change the map representation, route color or zoom.

    I hope this example can be useful for those who need a more stable GIS representation. If any issues are discovered, please contact me. Bye, Flavio. LabGIS.rar
    1 point
  40. Technically it is all passed around by pointer, which is synonymous with by-reference. Logically that makes no difference whatsoever, as it all happens under the hood. Data going into a subVI through the connector into a control is consumed by that subVI and considered to be private to the VI. If LabVIEW needs that data somewhere else, it will make a copy, but it has optimizing measures in place that may schedule different functions consuming the same data in a way that they cannot stomp on each other's data. Data coming through an indicator out of a subVI is from that point owned by the caller, and the VI had better not try to change it after the fact. For LabVIEW VIs that is not a possibility at all, as LabVIEW takes care of it. For data returned from external code through a Call Library Node, it is a grave violation of the LabVIEW memory-management contract to try to modify the data in the background once the Call Library Node call has returned control to the diagram (see the sketch below). LabVIEW is a managed environment just like .NET, but with different management rules. As far as LabVIEW diagrams are concerned there is virtually no way to violate those rules, since the dataflow determines everything. If you interface to external code then you could violate that management contract in the external code, but doing so is equivalent to placing a bomb in your PC! 💣
    1 point
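    To make the "bomb" concrete, a C++ sketch of the anti-pattern (the array typedef is a simplified stand-in for the handle definitions in extcode.h):

        #include <cstdint>
        #include <thread>

        typedef struct { int32_t cnt; double elt[1]; } DblArray, **DblArrayHdl;

        static DblArrayHdl g_stashed = nullptr;  // keeping this past the call is the bug

        extern "C" void bad_cln_function(DblArrayHdl data) {
            // Touching 'data' HERE, during the call, is fine: LabVIEW guarantees
            // the handle for the duration of the call.
            g_stashed = data;  // VIOLATION: LabVIEW may resize, move or free the
                               // handle the moment this function returns.
            std::thread([] {
                if (g_stashed) (*g_stashed)->elt[0] = 42.0;  // writes into memory
            }).detach();                                     // LabVIEW may have reused
        }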
  41. I am very excited about the potential for a platform that encourages open-source collaboration on LabVIEW code. My main experience of non-LabVIEW package managers is with NPM for Node.js. NPM is an organisation which provides two things: a tool which is the mechanism for managing what packages are used in a project, and a registry that anyone can publish their packages to. I believe that NPM supports private packages for enterprise customers, but open-source packages are generally hosted on GitHub, and when a package is uploaded to the NPM registry it simply pulls in the README to provide the package documentation. The GitHub link is also provided on the NPM page so that users can easily see where the library comes from, and if they want to open issues or submit fixes they do that on GitHub. I have not had much of a chance to look at it, but it appears that GPM would/does follow similar mechanics to NPM, and compared to VIPM and NIPM I am certainly most excited about the GPM model. I see GCentral as an organisation that could provide the registry for packages and ideally be the one place for open-source LabVIEW code (including NI community-page hosted code), with clear signposts as to where to find the source for issue raising and forking. One issue that many text-based languages don't have is that users with older versions of LabVIEW cannot even open code made with newer versions of LabVIEW, let alone run it; so maybe GCentral could provide some computing power (and licences) to automatically convert VIs to older versions. Even if they didn't run, at least a user could open them.
    1 point
  42. I also have a method to propose: U64 Nanoseconds to LabVIEW Timestamp.vi (a sketch of the conversion follows below).
    1 point
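    For reference, the arithmetic in text form, assuming the U64 input counts nanoseconds since the Unix epoch (1970-01-01 UTC; the VI's exact convention isn't shown above). A LabVIEW timestamp is 128-bit fixed point: a signed 64-bit count of whole seconds since 1904-01-01 00:00:00 UTC, plus an unsigned 64-bit fraction of a second in units of 2^-64.

        #include <cstdint>

        struct LVTimestamp { int64_t seconds; uint64_t fraction; };

        LVTimestamp ns_to_lv_timestamp(uint64_t ns_since_1970) {
            const uint64_t EPOCH_OFFSET = 2082844800ULL;  // seconds from 1904-01-01 to 1970-01-01
            LVTimestamp t;
            t.seconds = (int64_t)(ns_since_1970 / 1000000000ULL + EPOCH_OFFSET);
            uint64_t rem_ns = ns_since_1970 % 1000000000ULL;
            // fraction = rem_ns / 1e9, scaled to 2^64 (128-bit intermediate; GCC/Clang)
            t.fraction = (uint64_t)(((unsigned __int128)rem_ns << 64) / 1000000000ULL);
            return t;
        }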
  43. Version 0.24.0

    8,860 downloads

    lava_lib_LabVIEW_API_scripting_tools v0.22.1.21 by University of Leeds
    Author: Gavin Burnell
    Copyright: Copyright © 2007-2010, University of Leeds
    License: BSD
    Compatible LabVIEW Versions: >= 8.6
    Compatible OS Versions: ALL
    Description: This is a LabVIEW 8.6.1 library of VIs to help with scripting. The public VIs include routines to get hold of block diagram references and control terminal references, get the connector pane reference, select the connector pane pattern, and wire controls up to it. There are a number of routines to help wire the block diagram up, including creating a selection of the primitives (I got bored of coding them all up!). I've stuck in some routines that work with tags: hidden away is the capability to tag LabVIEW VI Server objects with arbitrary data. One possibility is to use this to identify bits of the block diagram of a VI for moving and rewiring via scripting. The Scripting Tools includes a separate XNode support library that provides routines to help with scripting and terminal adaptation in XNodes. There are some routines to help with undo transactions, new to this release.
    This package conflicts with these other packages: LAVAG_scripting_tools >= 0.0; Scripting Tools >= 0.0
    1 point
  44. Is there a native way of determining what the localized decimal separator is? I have a combo box which I'm hacking up to be a pseudo-numeric control, insomuch as I want to be able to enter any positive real value, or allow the user to select from a set of pre-determined special values which are text. So I populate the Strings[] of the combo box with the relevant strings, and leave it to the user to enter any numeric values on their own. When the value changes, I parse the value for a numeric if it's not one of the matched strings, then re-write the parsed value back to the control to make sure whatever is being displayed matches the value I'm tracking on my data wire. Works pretty well.

    The only problem is the user can of course still enter anything they want in there. If I have the strings "Fee" and "Fie" available, the user can enter "Foe" by typing it in. This doesn't cause a problem, since I parse and write back to the control, but I'd like it to behave more like a numeric control, where you can't enter invalid characters. Try to enter any non-numeric character into a numeric control: you can't do it. I could easily implement this via the Key Down? event structure frame, but how do I distinguish what a valid decimal character is? In a North American locale I'd allow a "." character, but in France I wouldn't, for example. I'm aware of the %.; %,; and %; codes which help with scanning an entire string for a number, but in this case I'm trying to match only a single character, and I don't think they are of much help. (A sketch of one approach follows below.)
    1 point
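    Not a LabVIEW-native answer, but for comparison, the C runtime exposes the locale's separator directly, and one common G workaround is the same idea in reverse: format a known number such as 1.5 with the system-default code and inspect which character appears between the digits.

        #include <clocale>
        #include <cstdio>

        int main() {
            std::setlocale(LC_NUMERIC, "");  // adopt the user's locale settings
            const char* sep = std::localeconv()->decimal_point;
            std::printf("decimal separator: %s\n", sep);  // "." or "," etc.
            return 0;
        }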
  45. Dynamic dispatch lets an inheriting class (child object) override the VI (http://en.wikipedia.org/wiki/Method_overriding). Static dispatch prevents override by the inheriting class. (A C++ analogue is sketched below.)
    1 point
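    The same distinction in C++ terms (LabVIEW dynamic dispatch behaves like a virtual method, static dispatch like a non-virtual one):

        struct Parent {
            virtual const char* dynamic_vi() { return "Parent"; }  // a child may override this
            const char* static_vi()          { return "Parent"; }  // a child cannot override this
        };

        struct Child : Parent {
            const char* dynamic_vi() override { return "Child"; }
        };

        // Child c;  Parent* p = &c;
        // p->dynamic_vi()  -> "Child"   (resolved from the object's actual class at run time)
        // p->static_vi()   -> "Parent"  (resolved from the wire/pointer type at compile time)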
  46. I came across this thread: http://lavag.org/top...qr-code-or-not/ and decided to dust off my old QR code generator, clean it up, and see if it would be useful. I had no access to the formal spec sheet; my info came from a combination of Wikipedia, textbooks on Galois fields (fun stuff), and a lot of reverse engineering. As I remember things, I am trying to actually document the code this time around. QR Generator_v2.llb
    1 point
  47. You asked for it. A little cleaning went a long way; I just did not have a chance to document things very well (I have to jog my memory for that). This was built with piecemeal documentation, mostly to the original ISO spec, and with a lot of reverse engineering. It seems to work as well as my iPhone reader; it's hard to tell which one is the problem in a few corner cases. I'm not sure I want to be on the hook for documentation/support if I posted it to the CR; maybe an NI community page is in order, or Mr Mike will sort his out and add it to the page. By all means let me know of any bugs; certainly a lot of version/EC combinations have not been tried. QR Generator.llb
    1 point
  48. Darin.K made the suggestion of being able to extract sub-arrays by specifying the start and end of the subset, rather than the start and length as LabVIEW allows (the semantics are sketched below). I had written this XNode a while back, so have just tidied it up.
    ToDo:
    - Accept 2D (and greater) arrays
    - Make growable for multiple subarray outputs (would also be useful for the built-in Array Subset)
    Requires: LabVIEW 8.6.1+, Gavin Burnell's Scripting Tools (invaluable!)
    Index Array Subset.zip
    1 point
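    The semantics in text form (illustrative C++, not the XNode's implementation): an inclusive start/end pair instead of LabVIEW's start/length, so length = end - start + 1.

        #include <algorithm>
        #include <vector>

        template <typename T>
        std::vector<T> subset(const std::vector<T>& a, size_t start, size_t end) {
            if (start >= a.size() || end < start) return {};   // empty on bad bounds
            end = std::min(end, a.size() - 1);                 // clamp to the array
            return std::vector<T>(a.begin() + start, a.begin() + end + 1);
        }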
