
Leaderboard

Popular Content

Showing content with the highest reputation since 03/02/2023 in all areas

  1. They introduced a token for smooth lines: SmoothLineDrawing=False
    4 points
  2. And the saga continues... https://www.reuters.com/markets/deals/national-instruments-picks-fortive-keysight-challengers-emersons-bid-sources-2023-03-03/
    3 points
  3. One could argue that LabVIEW has reached such a state of complete perfection that all that is left to do is to sort out the kinks, and that those minuscule fixes are all there is left to discuss! 😄
    2 points
  4. I think Andrey won't mind if I repost this here.
    2 points
  5. Chasing the golden geese, it seems. Interesting stuff throughout, but the real juice is at about 55 mins in, in the "Distribution of Resources" section. (Spoiler: it's a small club and we ain't in it.) The mention of porting Unicode made me laugh heartily though. My final takeaway was too many C# coders left over from NXG.
    2 points
  6. Just mention GOOP 3 times and it will summon MikaelH - who can tell you everything about it.
    2 points
  7. Break down your complex and complicated data types into a complex and complicated architecture. FTFY
    2 points
  8. I posted an idea on the Idea Exchange about viewing a VIM's instance VI, but it looks like that ability was added to LabVIEW at some point (by AQ?). The labview.ini token allowOpenFPOfInstanceVIs changes the behavior of the "Convert Instance VI to Standard VI" option to instead open the instance VI. I'd been using the convert-to-standard / Ctrl+Z method, but it's very easy to forget the Ctrl+Z bit. This token is much more useful.
    2 points
  9. Re the main issue, this seems to be my misunderstanding about how savepoints work (as distinct from BEGIN and ROLLBACK). Here is a relevant discussion. From that discussion, I see that instead of "ROLLBACK TO <Savepoint>" I should do "ROLLBACK TO <Savepoint>; RELEASE <Savepoint>;". Issue 22
    2 points
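The ROLLBACK TO / RELEASE pairing described above can be sketched with Python's built-in sqlite3 module. This is a minimal illustration, not code from the thread; the table and data are made up:

```python
import sqlite3

# In-memory database purely for illustration.
conn = sqlite3.connect(":memory:")
conn.isolation_level = None  # manage transactions manually
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")

conn.execute("BEGIN")
conn.execute("INSERT INTO t VALUES (1, 'kept')")
conn.execute("SAVEPOINT sp1")
conn.execute("INSERT INTO t VALUES (2, 'discarded')")

# ROLLBACK TO undoes work done since the savepoint but leaves the savepoint
# on the stack; RELEASE then removes it, as the post describes.
conn.execute("ROLLBACK TO sp1")
conn.execute("RELEASE sp1")
conn.execute("COMMIT")

rows = conn.execute("SELECT id, val FROM t").fetchall()
print(rows)  # [(1, 'kept')]
```

Without the RELEASE, the savepoint stays open and a later COMMIT or conflicting savepoint name can behave unexpectedly, which matches the confusion described in the post.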
  10. Thanks for sharing! I guess I'm just trying to stay positive. Every time the future of LabVIEW is brought up here, things go so gloomy so fast! And I'm trying to wrap my head around what is true and what is nothing but mindset. It's ironic though: in recent months there has been so much hype around AI being used for programming, especially Google's AlphaCode ranking in the top 54% in competitive programming (https://www.deepmind.com/blog/competitive-programming-with-alphacode). Writing code from natural-text input. So we're heading towards a future where classic "coding" could soon be obsolete, leaving the mundane tasks to the AI. And still, there already is a tool that could help you with that, a tool that has been around for 30 years: a graphical programming tool. So how, how, could LabVIEW not fit in this future?
    2 points
  11. I don't think anyone is saying that, so much, with respect to NI as a whole. But the effort and investment in NXG made LabVIEW (Classic?) the withered limb of NI. Now they have lots of C# devs who can't do jack with LabVIEW. From this seminar, it looks like this is a solution (lots of C# devs) looking for a problem (cloudy stuff), and they see LabVIEW as a stagnant technology instead of the axiomatic driver behind their hardware that it actually is. Don't get me wrong. They can very easily move into this space and do all the cloudy stuff. But their view of how this will fit together is flawed (IMO). They are viewing it purely like an IT system rolling out images (think AWS Compute/IaaS) when, in fact, those images will be highly specialised LabVIEW installations for very specific and usually custom hardware configurations. They lost Test and Measurement to Python a while ago, arguably the mainstay for LabVIEW.
    2 points
  12. OpenG made an amazing set of Array tools many years ago. They weren't perfect, but they had many uses and I've recommended them many times. Improvements to LabVIEW meant some of the array functions weren't well optimized. Years later I tried making a more modern set of Array tools using VIMs, giving up on polymorphics. I posted this as a package over on VIPM.IO here. Since then I've thought about a few places where the performance of my stuff could be better. Mads and I have had some discussions back and forth in this thread, but I wanted to make a separate post where others could chime in with their performance suggestions too.
At the moment my Array VIMs are in LabVIEW 2018. However, due to the potential Maps and Sets benefit, I think I want to go to at least 2019. In 2020 LabVIEW added the Sorted Array subpalette with a pretty decent binary search. So for now I think LabVIEW 2020 will be what I target for the next Array VIMs package release. Any thoughts on this? I know there is a decent amount of bias in this, but Jim posted the versions of LabVIEW used on VIPM.IO, and 2020 and newer covered over 75% of users.
Attached is a zip with a set of array testing VIs. For instance, open the Remove Dups Speed VI and run it. It will run through the 6 different methods of removing duplicates from a 1D array of strings, then graph the different methods and check that they all return the same data. If you want to add your own method, edit the (non-typed) enum, then duplicate a case and replace the function with your own code. It randomizes the order of the array methods used. You can also mess with the data being used. At the moment it generates 1000 unique elements, then duplicates them 5 times. If you want to test data with nothing to remove, or all of a single type, or whatever, you can change the data used in the test.
At the moment there are 8 different array speed tests to compare: things like Delete 1D, Delete 2D, Filter 1D, Filter 1D with a Scalar, Reverse 2D, Search 1D, and the Remove Duplicates already mentioned. There might not be a single best method for a specific function. There are times one method will work better for some set of data, and then a decision needs to be made on what should go in the VIM. My main reason for making this thread is that I hope some people will know of a better way to do something: come up with a more optimized way to do any of the OpenG array functions. After some discussion and contribution, I plan on updating the Array VIMs package and crediting those who helped. (crosspost) Edit: I just realized someone is probably yelling "Use Git" at me. I hadn't thought of that, sorry; it just felt organic to continue the conversation here because it is where the topic started. 872707096_HooovahhArrayPerformanceTest.zip
    2 points
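The benchmark above is LabVIEW-specific, but the core trade-off it measures (linear search versus hash lookup when removing duplicates) can be sketched in text form. This Python sketch is illustrative only; the function names are made up, and the data shape mirrors the post's test data (1000 unique strings, duplicated 5 times):

```python
import timeit

def dedup_linear_search(items):
    # O(n^2): searches the growing output list for every element,
    # analogous to the classic OpenG Remove Duplicates approach.
    out = []
    for x in items:
        if x not in out:
            out.append(x)
    return out

def dedup_hash_set(items):
    # O(n): one hash lookup per element, preserving first-occurrence order.
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

data = [f"item{i}" for i in range(1000)] * 5  # 1000 unique, duplicated 5x

# Both methods must return identical results before timing means anything,
# mirroring the "check that they all return the same data" step in the post.
assert dedup_linear_search(data) == dedup_hash_set(data)
print(timeit.timeit(lambda: dedup_hash_set(data), number=10))
```

As the post notes, which method wins can depend on the data (for tiny arrays the linear search can beat the hash approach because of its lower constant cost), which is exactly why benchmarking several candidates before picking the VIM implementation makes sense.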
  13. This appears to be a known issue in LabVIEW 2021 SP1. https://www.ni.com/de-de/support/documentation/bugs/22/labview-2021-sp1-known-issues.html
    2 points
  14. It's better than a new icon..
    1 point
  15. "I’m as mad as hell and I’m not gonna take this anymore!"
    1 point
  16. Last week, Matlab R2023a was released. The changelog document for this (6-month cadence) release is 600 pages long!!! How come NI releases its main development software with a 1-page changelog and makes it almost unusable, with more issues than before??? How many NI engineers does it take to change an icon?
    1 point
  17. The error message is telling you that your "Axis" enum control doesn't have an item called "X" (upper case). You can edit the item list of your enum by right-clicking it on the front panel and selecting "Edit Items...", or simply change or remove the "X" in the case structure selector value on the block diagram.
    1 point
  18. You can find the description of all the VIs in the toolkit here: https://www.ni.com/docs/fr-FR/bundle/labview-advanced-signal-processing-toolkit-api-ref/page/lvasptconcepts/aspt_default_page.html
    1 point
  19. Ok, this is not gonna work like this. The sample code you show uses GObject to start its own loop and does event handling all through it. GObject is basically a GLib task-handling system. It's in fact its own execution system and is, for instance, used as the base of GTK and, by extension, GNOME; but those are built on top of GObject. LabVIEW has its own execution system, and marrying the two together is a major exercise in coding. In fact it is almost never done, since it is so difficult to do right. You will need to try to find the lower-level interface of this Aravis library. It will require you to call lots of functions with the arv_ prefix and similar, but you must avoid anything that starts with g_. Basically you would need to start writing something like IMAQdx, with many of its functions, but instead of calling into IMAQdx you call into the Aravis library. It's doable, but not for the faint of heart. Basically, trying to interface to image acquisition hardware and libraries is very difficult. Always! And there is a reason these libraries cost money and there are very few freebies here. The Aravis library itself seems to be free. Its LGPL licensing can be problematic for anyone wanting to use it in a commercial program. And while it states it is LGPL, there are actually files in there that state they are GPL, so the licensing is not completely clear-cut. Incorporating GPL software into anything other than GPL software is basically impossible.
    1 point
  20. I'm not sure you did a service here. Trying to do callback functions through Windows messages is both not straightforward and in fact the master class of interfacing to a shared library. Considering that his shared libraries are .so files, he also obviously is not on Windows but Linux.
    1 point
  21. The best approach would be to take that code and make your own shared library from it. Export functions to open the device, grab images, and close the device. Basically you do not want an executable with a main function; instead you want a shared library that separates those individual functions out. Why not call the underlying library directly through the Call Library Node, you may ask? Because you have a callback function in there! It may be possible otherwise, but that callback really is a no-go for direct LabVIEW interfacing.
    1 point
  22. I've put together a wrapper to the Plasmionique Modbus Master library that makes communication with PLCs a bit easier. It could be used for just about any Modbus device, but I had PLCs in mind when I was developing it. What is easier? - Modicon-style addressing instead of data address and function code. - Data-type conversion built in (e.g. a 32-bit float mapped to two U16 registers), with a word-swapping option. - Poll list and cached data block: allows you to poll a range of registers, then look up and convert the values elsewhere. The code is up on GitHub: https://github.com/rfporter/Modbus-PLC-Client It's a rough draft at the moment. Comments are welcome and appreciated.
    1 point
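The "32-bit float mapped to two U16 registers" conversion with an optional word swap can be sketched in Python with the standard struct module. The function name and word-order convention here are illustrative assumptions, not the wrapper's actual API:

```python
import struct

def registers_to_float(regs, word_swap=False):
    """Combine two 16-bit Modbus registers into an IEEE-754 float.

    regs is a (high_word, low_word) pair; word_swap flips the word order,
    which some PLCs require. This mirrors the conversion the wrapper
    describes, but the interface is a guess for illustration only.
    """
    hi, lo = regs
    if word_swap:
        hi, lo = lo, hi
    raw = struct.pack(">HH", hi, lo)   # big-endian word order on the wire
    return struct.unpack(">f", raw)[0]

# Round-trip a known, exactly-representable float to check the conversion.
hi, lo = struct.unpack(">HH", struct.pack(">f", 12.5))
print(registers_to_float((hi, lo)))        # 12.5
print(registers_to_float((lo, hi), True))  # 12.5 (swapped words, swap undone)
```

The word-swap option matters because Modbus only defines 16-bit registers; vendors disagree on which register holds the high word of a 32-bit value, so a usable wrapper has to support both orders.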
  23. Not like this. Because that is the goal: break down your complex and complicated data types into simple and uncomplicated ones. For configuration data you could maintain the path to the storage location and load the data as needed.
    1 point
  24. As an aside, you should look into the UPSERT clause, which allows doing an INSERT or UPDATE in a single SQL statement. Also, note that you don't need savepoints around a single statement (every single statement is its own transaction, and either succeeds or rolls back automatically).
    1 point
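The UPSERT clause mentioned above (available in SQLite 3.24 and later) can be shown in a few lines via Python's sqlite3 module; the table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE settings (key TEXT PRIMARY KEY, value TEXT)")

# One statement that INSERTs a new row, or UPDATEs the existing one when the
# primary key already exists; "excluded" refers to the row that failed to insert.
sql = """INSERT INTO settings (key, value) VALUES (?, ?)
         ON CONFLICT(key) DO UPDATE SET value = excluded.value"""
conn.execute(sql, ("units", "metric"))
conn.execute(sql, ("units", "imperial"))  # same key: updates instead of failing

value = conn.execute(
    "SELECT value FROM settings WHERE key = 'units'").fetchone()[0]
print(value)  # imperial
```

This also illustrates the second point in the post: each of these statements is its own implicit transaction, so no explicit savepoint is needed around them.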
  25. This is explained in the SQLite help pages: https://www.sqlite.org/lang_savepoint.html#savepoints
    1 point
  26. Now there is no time for experiments. When I update a record, in all my projects I use a template with a restore point. SQLite is quite paranoid about data integrity. But I am using an old version of the SQLite Library (v1.10.0.85), although I think this template will also work on newer versions.
    1 point
  27. Sounds like an uncommitted transaction. Make sure you have committed all transactions before closing the file. Uncommitted transactions are lost.
    1 point
  28. https://finance.yahoo.com/news/ni-acquires-set-gmbh-accelerate-140000459.html
    1 point
  29. I do love how VIMs came to be. I'm having a real hard time finding it, but there was an idea on the Idea Exchange that there should be a function that can delay any data type, similar to the OpenG Wait, which takes an error in and passes it out. Jeff K. posted on the thread saying something like "Oh yeah, that is a thing, you just need to use a VIM; here is an example which uses XNodes." It blew my mind. Then in the next release of LabVIEW for the Mac, Jeff K. sneaked a new VIM onto the palette, which some higher-ups in R&D didn't know about, and which contained the Type Specialization structure, which was also unreleased. I downloaded that version just to get the VIM and the structure. I get the feeling the reason VIMs seemingly came out of nowhere is that Jeff was pushing for them for years, and when they were mostly stable he just put them out there to get the public's response. When everyone saw the potential that he also saw in them, R&D put effort into getting them out there. This is just my speculation from the outside.
    1 point
  30. There is this interesting blog post on LinkedIn: https://in.linkedin.com/posts/jimkring_upgrading-to-new-labview-versions-is-for-activity-6972085040700686336-5gGw?trk=public_profile_like_view Correction: the person in question is CTO, not CEO. Regards
    1 point
  31. You will have to convert the OpenCV Mat to an LV picture. The Mat object has a ToBitmap method, which can generate an LV-compatible bitmap. I attached a VI that uses the Emgu CV .NET wrapper.
    1 point
  32. I've really no idea what this means. If you are just looking for a volume of code, then VIPM has hundreds of libraries (with source) that you can train on. You will have tens of thousands of VIs with source to point your algorithm at. Apart from that, you will have to be more specific. Artificial Insemination is coming for us, boys and girls.
    1 point
  33. You'll need to define exactly what "experimental setups" means. I've no idea.
    1 point
  34. Hi. NI is messing with the LabVIEW compiler; a memory leak would just be the latest issue. They started by eliminating the hybrid compiler, introduced in LabVIEW 2010 SP1, in LabVIEW 2019 to make development simpler, but it appears to just head in the wrong direction with every new LV version. They also started messing with the icons when beautifying project libraries in LabVIEW 2021. They haven't solved it yet, it seems. See the NI Forum thread "NI Library Icon problems in classes". Is there a development pattern here? Regards
    1 point
  35. SubVIs that are called as a function, and don't have the terminals change value after entering the VI, should have the terminals on the root of the diagram, not in subdiagram structures. This is because the compiler can't know if these terminals changed value since the last time they were used, and so it will read them again. If a terminal is on the root of the diagram, it is read once when the VI is entered and never needs to be read again. Same with indicators: these should be on the root of the diagram, and I think the CLD takes off points if they aren't. https://forums.ni.com/t5/LabVIEW/Community-Nugget-Wired-Terminals-in-Subdiagrams-Clearing-up-the/td-p/2093252 But it is a very minor thing; I just mentioned it as something I'd change, not something I would expect to affect memory. I worked at Magna Electronics, Magna E-Car, and I think Magna Powertrain was in there somewhere as divisions changed and were absorbed, making validation and verification test systems for various automotive components like running boards, inverters, chargers, power control modules, and cameras. Good times until it wasn't. I knew this was related because it gave a loading warning that VIs were loaded from a new path, and the old path had Magna in it.
    1 point
  36. Thank you codcoder, I really appreciate your response. Cheers,
    1 point
  37. Vistek provides a Linux SDK; you could go the hard way and wrap it (I have no experience with this particular SDK, though I've managed to wrap others from other camera vendors over the years)... In case v4l talks to it, maybe adapting this... Good luck...
    1 point
  38. Are you asking about taking the exam at a physical location or online? I've taken both the CLD and CLD-R online and I can recommend it; it worked fairly well. And the CLAD is still just multiple-choice questions, right? BUT regarding the online exam: NI is apparently moving to a new provider, so you can't take it right now: https://forums.ni.com/t5/Certification/NI-Certification-Transition-to-a-New-Online-Exam-Delivery-System/td-p/4278030
    1 point
  39. With a slightly snarky tone, I want to ask if this is part of the 100 year business plan NI has. On a personal level I just hope LabVIEW can stay relevant until retirement. I do still have a perpetual license to 2022 Q3, which supports Windows 11. So even if NI goes away I'll be able to be in my language of choice until 11 is no longer supported. LabVIEW has changed the way I think about programming in such a way that I think it is hard to go to other languages. My brain thinks in parallel paths, and data dependence, not lines of code and single instructions. Whenever I develop in C++ I can't help but feel how linear it is. I'm sure higher level languages are better, but at the same time I don't really want to change. As long as I can work at a place that needs test applications, and doesn't care how they are developed, I'll be happy pushing LabVIEW. The fog of the future is hard to see though. The next year or two looks very uncertain in my career. But looking at the past, working in LabVIEW has felt like winning the lottery. Thinking about this helps me stay positive.
    1 point
  40. Technically it is all passed around by pointer, which is synonymous with by-reference. Logically that makes no difference whatsoever, as it all happens under the hood. Data going into a subVI through the connector into a control is consumed by that subVI and considered private to the VI. If LabVIEW needs that data somewhere else, it will make a copy, but it has optimizing measures in place that may schedule different functions consuming the same data in a way that they cannot stomp on each other's data. Data coming out of a subVI through an indicator is, from that point, owned by the caller, and the VI had better not try to change it after the fact. For LabVIEW VIs that is not a possibility at all, as LabVIEW takes care of it. For data returned from external code through a Call Library Node, it is a grave violation of the LabVIEW memory management contract to modify the data in the background once the Call Library Node call has returned control to the diagram. LabVIEW is a managed environment just like .NET, but with different management rules. As far as LabVIEW diagrams are concerned, there is virtually no way to violate those rules, since the dataflow determines everything. If you interface to external code, then you could violate that management contract in the external code, but doing so is equivalent to placing a bomb in your PC! 💣
    1 point
  41. I also have a method to propose: U64 Nanoseconds to LabVIEW Timestamp.vi
    1 point
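The VI name suggests converting a U64 nanosecond count into a LabVIEW timestamp. LabVIEW timestamps count seconds since 1904-01-01 00:00:00 UTC, which is 2,082,844,800 seconds before the Unix epoch; assuming the input is nanoseconds since the Unix epoch (the posted VI may use a different reference), the conversion sketch in Python is:

```python
from datetime import datetime, timezone

# Offset between the LabVIEW epoch (1904-01-01 UTC) and the Unix epoch
# (1970-01-01 UTC): 66 years including 17 leap days = 24,107 days.
LV_UNIX_OFFSET_S = 2_082_844_800

def unix_ns_to_labview_seconds(ns):
    """Convert ns since the Unix epoch to seconds since the LabVIEW epoch.

    The input convention (Unix-epoch nanoseconds) is an assumption made for
    this sketch; adapt the offset if your source uses another reference.
    """
    return ns / 1e9 + LV_UNIX_OFFSET_S

# Example: midnight UTC on 2023-03-02 expressed as Unix-epoch nanoseconds.
ns = int(datetime(2023, 3, 2, tzinfo=timezone.utc).timestamp() * 1e9)
print(unix_ns_to_labview_seconds(ns))
```

Note that dividing by 1e9 in floating point loses sub-microsecond precision for recent dates; a real implementation would keep the integer seconds and fractional parts separate, as the LabVIEW timestamp's 64.64 fixed-point format does.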
  42. Version 1.0.0.1

    213 downloads

    This VI allows an easy way to create an HTML color table in the LabVIEW Report Generation Toolkit. It should be connected to "Append User Formatted HTML to Report.vi". For this version, all the columns are the same size, but the alignment and color of each cell can be defined. Please rate this file if you download it. It's free... you can at least give a few seconds of your time to rate it... I spent some hours making it... Enjoy
    1 point
  43. Is there a native way of determining what the localized decimal separator is? I have a combo box which I'm hacking up to be a pseudo-numeric control, insomuch as I want to be able to enter any positive real value, or allow the user to select from a set of predetermined special values which are text. So I populate the Strings[] of the combo box with the relevant strings, and leave it to the user to enter any numeric values on their own. When the value changes, I parse the value for a numeric if it's not one of the matched strings, then re-write the parsed value back to the control to make sure whatever is being displayed matches the value I'm tracking on my data wire. It works pretty well. The only problem is that the user can of course still enter anything they want in there. If I have the strings "Fee" and "Fie" available, the user can enter "Foe" by typing it in. This doesn't cause a problem, since I parse and write back to the control, but I'd like it to behave more like a numeric control, where you can't enter invalid characters. Try to enter any non-numeric character into a numeric control: you can't do it. I could easily implement this via the Key Down? event structure frame, but how do I distinguish what a valid decimal character is? In a North American locale I'd allow a "." character, but in France I wouldn't, for example. I'm aware of the %.; %,; and %; codes, which help with scanning an entire string for a number, but in this case I'm trying to match only a single character and I don't think they are of much help.
    1 point
  44. Dynamic dispatch can have an inheriting class (child object) override the VI (http://en.wikipedia.org/wiki/Method_overriding). Static dispatch prevents override by the inheriting class.
    1 point
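A rough text-language analogy for the dispatch distinction above: dynamic dispatch resolves the method against the runtime (child) class, like an overridable method, while static dispatch always binds to one implementation. The class and method names here are illustrative, not LabVIEW API:

```python
class Parent:
    def dynamic(self):
        # Analogous to a dynamic-dispatch VI: a child class may override
        # this, and the override is chosen at run time.
        return "parent"

    def static(self):
        # Analogous to a static-dispatch VI: treated as final, the child
        # is not expected to provide its own version.
        return "parent-static"

class Child(Parent):
    def dynamic(self):
        return "child"

obj = Child()
print(obj.dynamic())  # child  (runtime type selects the override)
print(obj.static())   # parent-static
```

Python does not actually enforce the "no override" rule the way LabVIEW's static dispatch does; the sketch only illustrates which implementation each dispatch style selects.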
  45. I came across this thread: http://lavag.org/top...qr-code-or-not/ and decided to dust off my old QR code generator, clean it up, and see if it would be useful. I had no access to the formal spec sheet; my info came from a combination of Wikipedia, textbooks on Galois fields (fun stuff), and a lot of reverse engineering. As I remember things, I am trying to actually document the code this time around. QR Generator_v2.llb
    1 point
  46. You asked for it. A little cleaning went a long way; I just did not have a chance to document things very well (I have to jog my memory for that). This was built with piecemeal documentation, mostly to the original ISO spec, and with a lot of reverse engineering. It seems to work as well as my iPhone reader; hard to tell which one is the problem in a few corner cases. Not sure if I want to be on the hook for documentation/support if I posted it to the CR; maybe an NI community page is in order, or Mr Mike will sort his out and add it to the page. By all means let me know of any bugs; certainly a lot of version/EC combinations have not been tried. QR Generator.llb
    1 point
  47. If you have an active internet connection, you can use this VI. It uses DataSocket and the Google Chart API. Create QR Code.vi (LV 8.6)
    1 point
  48. Darin.K made the suggestion of being able to extract sub-arrays by specifying the start and end of the subset, rather than the start and length as LabVIEW allows. I had written this XNode a while back, so I have just tidied it up. To do: - Accept 2D (and greater) arrays - Make growable for multiple subarray outputs (would also be useful for the built-in Array Subset) Requires: LabVIEW 8.6.1+ and Gavin Burnell's Scripting Tools (invaluable!) Index Array Subset.zip
    1 point
