Everything posted by Neil Pate
-
[Discuss] JKI Please Wait Dialog
Neil Pate replied to Michael Aivaliotis's topic in Code Repository (Certified)
Have you tried to mass compile?
-
Nice article also, part 1 of 7! https://arstechnica.com/science/2021/01/the-curious-observers-guide-to-quantum-mechanics/
-
Are you sure you burned the correct amount of sage during the incantation?
-
I think the NXG team was outsourced to a C# dev house outside of the US. I suspect the contract has been terminated.
-
Maybe 2021 will be the year of Linux on the desktop, but given that this has been predicted every year since about 2003 I would not hold my breath. LabVIEW will never be open source. It would not make sense, nobody outside of NI would be able to maintain it. LabVIEW is not the Linux kernel which is of huge interest to millions of others. The area of the intersection on the Venn diagram of skilled enthusiastic LabVIEW developers, ultra-skilled C++ developers, and those with enough spare time is approximately zero. The full complement of engineers at NI can barely make any progress into the CAR list, what hope is there for anyone else?
-
Which is the best platform to teach kids programming?
Neil Pate replied to annetrose's topic in LabVIEW Community Edition
Another option is Alice. It combines a 3D environment with a language quite similar to some of the others mentioned. It can be fun for younger kids as they can ignore the programming bit and just construct a 3D scene with some props quite easily.
-
Dear Santa NI
I am now in my 40s with youngish kids, so despite the fact that all I got for Christmas this year was a Pickle Rick cushion I am not actually complaining. However, I would like to get my order in to the Elves as early as possible. This is my wishlist, in no particular order. I expect this list will not be to everyone's taste, and that is OK; this is just my opinion.
- Make LabVIEW free forever. The war is over, Python has won. If you want to be relevant in 5 to 10 years you need to embrace this. The Community Edition is a great start but it is probably not enough. Note: I accept it might be necessary to charge for the FPGA stuff, where I presume you license the Xilinx tools. NI is, and has always been, a hardware company.
- Make all toolkits free. See the above point.
- Remove all the third-party licensing stuff. Nobody makes any money from it anyway. Encourage completely open sharing of code and lead by example.
- Take all the software engineering knowledge gained during the NXG experiment and start a deep refactor of the current-gen IDE. Small changes here, though... we should not have to wait 10 years.
- Listen to the feedback of your most passionate users during this refactor. NXG failed because you ignored us and just assumed we would consume whatever was placed in front of us. I am talking about people like those reading this post on Christmas day, in their spare time, because they are so deeply committed to LabVIEW.
- My eyes are not what they used to be, so please bring in the NXG-style vector graphics support so I can adjust the zoom of my block diagram and front panel to suit.
- As part of the deep refactor, the run-time GUI needs to be modernised. We need proper support for resizable GUIs that react sensibly to high-DPI environments.
- Bring the best bits of NXG over to current gen, for example the dockable properties pane. (Sorry, not much else comes to mind.)
- Remove support for Linux and Mac and start to prune this cross-compatibility from the codebase. I know this is going to get me flamed for eternity by 0.1% of the users. (You pretty much made this decision for NXG already.) Windows 10 is a great OS and has won the war here.
- Get rid of the 32-bit version and make RT 64-bit compatible. You are a decade overdue here.
- Add Unicode support. I have only needed it a few times, but it is mandatory for a multicultural language in 2021 and going forward.
- Port the Web Module to current gen. All the news I have heard is that the Web Module is going to become a standalone product; please bring it into current gen instead. It has so much potential.
- Stop adding features for a few years. Spend the engineering effort polishing. Fix the random weirdness we get when deploying to RT.
- Open-source as many toolkits as you can. Move the Vision toolkit over to OpenCV and make it open source.
- Sell your hardware a bit cheaper. We love your hardware and its integration with LabVIEW, but when you are a big multiple more expensive than a competitor it is very hard to justify the cost.
- Allow people to source NI hardware through whatever channel makes the most sense to them. The current rules on hardware purchasing across regions are ridiculous.
- Bring ni.com into the 21st century. The website is a dinosaur and makes me sad whenever I have to use it.
- Re-engage with universities to inspire the next generation of young engineers and makers. This will be much easier if the price is zero.
- Re-engage with the community of your most passionate supporters. Lately it has felt like there is a black hole when communicating with you.
- "Engineer ambitiously"? What does this even mean? The people using your products are doing their best; please don't patronise us with this slogan.
- Take the hard work done in NXG and make VIs into a non-binary, human-readable format so that we can diff and merge with our choice of SCC tools.
- Remove all hurdles to hand-editing of these files (no more pointless hashes for "protection" of .lvlibs and VIs, etc.).
- Openly publish the file formats to allow advanced users to make toolkits. We have some seriously skilled users here who already know how to edit the binary versions! Embrace this; it can only help you.
- Introduce some kind of virtualenv à la Python, i.e. allow libraries and toolkits to be installed on a per-project basis. (I think this is something JKI are investigating with their new Project Dragon thing.)
- For the love of all that is holy, do not integrate Git deeply into LabVIEW. Nobody wants to be locked into someone else's choice of SCC. (That said, I do think everyone should use Git anyway; that is another war that has been won.)
That is about it for now. All I want is for you guys to succeed so my career of nearly 20 years does not need to be flushed down the toilet like 2020. Love you, Neil (Edited: added a few more bullets)
-
G Interfaces for LabVIEW 2020
Neil Pate replied to Aristos Queue's topic in Object-Oriented Programming
Ditto to what Rolf said.
-
Any recommendation for tools to get started with OOP?
Neil Pate replied to Matt_AM's topic in Object-Oriented Programming
Sounds... complicated. I honestly think OOP designs like this are generally more trouble than the return you get from them.
-
NI abandons future LabVIEW NXG development
Neil Pate replied to Michael Aivaliotis's topic in Announcements
Removal of Run Continuously. Vomit. I lost my voice trying to explain how removing this is a totally unnecessary modification.
-
Poll: Should the CLA Exam require applied knowledge of OOP?
Neil Pate replied to Mike Le's topic in LabVIEW General
Interesting question. I would say yes, due to the fundamental nature of OOP, but what worries me is that the natural next step is to say the Actor Framework is also something a CLA should have applied knowledge of, and that would not be sensible.
-
But I am actually grateful for necro-ing this thread as I had not seen it first time and was interested to read it.
-
Is a baked-in Actor Framework on the drawing board?
Neil Pate replied to Bob W Edwards's topic in Object-Oriented Programming
I think you might be in the minority here. How about I offer you shared variables instead?
-
Is a baked-in Actor Framework on the drawing board?
Neil Pate replied to Bob W Edwards's topic in Object-Oriented Programming
At this point we should be looking to remove things from core LabVIEW, not adding them. LabVIEW already carries way too much baggage. I am strongly against NI trying to push YAAF (yet another actor framework) onto us.
-
NI abandons future LabVIEW NXG development
Neil Pate replied to Michael Aivaliotis's topic in Announcements
Conversely, it feels like NXG was built by devs not actually intimately familiar with LabVIEW.
-
Ah Lena, bet you never expected such fame. A little bit of light reading here.
-
NI abandons future LabVIEW NXG development
Neil Pate replied to Michael Aivaliotis's topic in Announcements
I think this is a great decision. Admitting they made a mistake is a bold and courageous step. Onwards and upwards from here.
-
Have you tried any kind of dependency checker? This can be useful in tracking down why a DLL is not getting loaded by LabVIEW. https://github.com/lucasg/Dependencies Now, I came across one super weird issue last year which totally surprised me. I don't think it is the same thing you are experiencing, but see this thread for an explanation: https://forums.ni.com/t5/LabVIEW/error-loading-lvanlys-dll-in-Labview-64-bits/td-p/4009772
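As a complement to a dependency-walker tool, you can also just attempt to load the DLL directly and read the OS loader's error text, which often names the missing dependency. A minimal sketch in Python (the helper name `can_load` is mine, not something from this thread):

```python
import ctypes
import ctypes.util

def can_load(lib_path):
    """Attempt to load a shared library and report the loader's error.

    Returns (ok, error_message). On Windows the OSError text often names
    the missing dependent DLL, which is the usual cause of load failures.
    """
    try:
        ctypes.CDLL(lib_path)  # use ctypes.WinDLL for stdcall DLLs on Windows
        return True, ""
    except (OSError, TypeError) as exc:
        return False, str(exc)

# The C maths library should load; a made-up name should fail with a reason.
ok, _ = can_load(ctypes.util.find_library("m"))
bad, err = can_load("no_such_library_xyz")
```

This won't walk the full dependency chain the way the Dependencies tool does, but it is a quick first check before digging deeper.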
-
What FPGA card are you using? Can you share your project and VIs? One thing to point out: I would be a little surprised if you can get 1 MHz from your timed loop on the PC. However, this is not your problem, as it would manifest as a not-quite-perfect sine wave on the output and would not explain a phase shift.
-
I think you need to start with a simpler example. (And sorry, I mistakenly thought you were using RT; you are using an FPGA card in a PC, right?) Try to make the simplest scenario you can think of: a simple VI generating a single point of the triangle wave at a time. Transfer this value to the FPGA, but wire it to all the analogue outputs at the same time. If you still have a phase shift then something really weird is going on. It has been a while since I used a PC-based FPGA card; is it possible the FPGA analogue outputs are configured differently somehow in the .lvproj? Like perhaps different filters or something?
-
I suspect the problem might be that you are essentially trying to do single-point output from the RT side of things. The property node on the RT might look like it is doing everything at once, but I don't think it actually updates all the values at the same time. Normally you would generate a waveform either by doing all the maths on the FPGA itself or by using a DMA FIFO or something similar. If you are determined to do the signal generation on the RT, then try replacing the 8 controls you are using to send the points to the FPGA with a single cluster of 8 elements. This will guarantee the "atomic" transmission and might fix your phase shift.
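The "atomic cluster" idea can be illustrated outside LabVIEW. A sketch in Python (function names are mine, purely illustrative) contrasting per-channel writes, where a reader sampling between writes can see a half-updated mix of old and new values, with a single reference swap that is all-or-nothing:

```python
def write_per_channel(state, new_values):
    """Update channels one at a time, recording what a reader could
    observe between the individual writes (a 'torn' update)."""
    snapshots = []
    for i, v in enumerate(new_values):
        state[i] = v
        snapshots.append(tuple(state))  # reader's view mid-update
    return snapshots

def write_atomic(state_ref, new_values):
    """Swap in all channels with one assignment: a reader following the
    reference sees the old tuple or the new one, never a mix."""
    state_ref[0] = tuple(new_values)

old = [0] * 8
mid_states = write_per_channel(list(old), [1] * 8)
torn = any(0 in s and 1 in s for s in mid_states)  # mixed old/new observed?

ref = [tuple(old)]
write_atomic(ref, [1] * 8)
```

Here `torn` comes out true for the per-channel path, which is the analogue of eight separate front-panel controls each arriving at the FPGA at slightly different times; the single cluster corresponds to the one-shot swap.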
-
Better check your sense of humour detector, I think it might be faulty.