
shoneill

Members
  • Posts: 867
  • Joined
  • Last visited
  • Days Won: 26

Everything posted by shoneill

  1. Correct me if I'm wrong, but you can also do all of that with SVN....
  2. Nope. This situation would lead to either me or that other person getting fired, because I'd call out their obvious ignorance on the subject immediately. I know enough about the fundamentals of human psychology and programming (which are kind of the same thing, just viewed from different angles and with different problems to solve) to be willing to risk my job proving that person wrong. For me, a CTO who is ignorant of the technical side of things to this extent, letting their own personal beliefs drive their technical decisions, is reason enough to quit anyway. So I would just accelerate the process by causing a direct confrontation.
  3. I think the longer my relationship with NI carries on, the more the message seems to be that I'm not considered part of the direction NI is concentrating on at all. In a way, the new announcement is a bit like "this isn't designed for you, do you hear us loud and clear?"
  4. You haven't offered a definition. You've made some statements using existing words with a meaning which is (I think) different from the one you're trying to apply. Hence my meme. The LVRT does NOT organise program flow. That's done directly by the compiled code of the VI, which implements the program flow itself. I do not know where you get your idea from. Again, I feel you have misunderstood what the LV Run-Time does. It is really comparable to the C++ Run-Time installations which exist on practically every Windows system. So for me, if you agree that VC++ is not virtualised, then by extension you also agree that LabVIEW is not virtualised.
  5. I feel like you're stretching here to make similarities seem to exist when they do not. Your argument for the LVRT is obviously flawed. If that were the case, wouldn't every program which accesses Win32 functions be virtualised? No, it isn't. It's linking between actual compiled machine code. It is not platform-independent; it is compiled for the current machine architecture. So is Visual C++ virtualised because there are VC++ runtimes installed on Windows? To paraphrase "The Princess Bride": you keep using that word, but I do not think it means what you think it means.
  6. To make comparisons with games, you might be better off looking at the blueprint editor which is part of Unreal Engine. Feels a lot like LabVIEW. Not controlled from within the game, but it's a lot closer to the kind of thing you're trying to infer here.
  7. I did NOT know that branch prediction could have such an effect. Nice post, thanks. That article brings some old memories to light. I remember during the old "Prime number" coding challenge on the NI site there were times where adding a dummy case to my case structures during processing produced measurable changes in execution speed. I couldn't explain it, but I accepted the improved execution times and carried on with my business. Seeing as the execution was more or less deterministic, could it be that I was moving between "good" and "bad" branching patterns for the case structure? IIRC, the results were tested on a Pentium 4, with its 31-stage pipeline. It may be completely unrelated, but that was always a point which stuck with me that I couldn't really explain. (A minimal C sketch of the effect appears after this list.)
  8. I'll just drop a link to a post on the dark side. I made that post after doing benchmark comparisons between TDMS and SQLite (sound familiar?). It might be important to take into account that "first read" performance and "typical" performance can be orders of magnitude apart, and the difference may be a function of the file type. OTOH, as mentioned in the post, the same mechanism can be (ab)used to create a 64-bit file cache when using 32-bit LabVIEW, so that's nice. (A small cold-versus-warm read timing sketch appears after this list.)
  9. <Pedantic> The horizontal can opener thing is apparently not true. The main mode of operation of opening cans was vertically, even before (especially before) the "rotating wheel" versions were invented in 1870. Yes, 1870, nearly 150 years ago. A version which removed the complete top of the can (horizontal cutting) only appeared in the 1980's. According to a wikipedia entry I met down the pub. </Pedantic>
  10. I love it when old posts come up and I simply can't remember writing them, but my name is on it, so it must have been me..... No, no tricks. Haven't done that in years. It was simply for LVLibs, no classes, no hierarchies. LLBs don't play well with classes (no sub-folders, no ability to store multiple VIs with the same name). I suppose the PPL is the proper replacement for this now, although that's not source code....
  11. Hmm, of course I run into problems at the target boundary. The Probes don't execute in the same context as the FPGA code. Using an FP control of the cluster I want to investigate (hopefully with all abstract classes re-assigned), I run into problems: concrete classes become abstract classes, because the abstract classes are the only ones included with the Probe (on the FP). I thought I saw yesterday that the information is passed as part of the TypeDescriptor of the Variant resulting from a conversion, but I don't see that any more. I suppose this is ultimately a serialisation / deserialisation problem across context boundaries. Unfortunately, I can't use Variant or String directly on the FPGA.......
  12. I've actually gotten it to work with general refnums. One note: if the actual value of the items is not important (which is the case for me - I just want to make sure all elements in a cluster have actually been assigned), then all the information required is contained within the type descriptor returned when performing a "Flatten Variant to String". I've written parsers for this before, but that was like 15 years ago in LV 6.1.... (A rough parsing sketch appears after this list.)
  13. I got the class name part working with the NI VI GetClassName.vi. I'm going to try to get it working with FPGA DMA channels...... lol. Wish me luck 🤪
  14. I seem to have trouble getting this working with classes. I'd like, if possible, to have the name of the concrete class included as "Value". I don't know how to go about that...... Anyone got some tips?
  15. I refuse to accept the OP's idea that everyone opposing his childish rant is a "defender of OOP". I'm a defender of intelligence and critical thinking. I'm an engineer. The OP is a petulant fool. Of course some things are true; some things about OOP are good, some are bad. Judgement is required. The intellectually lazy claim that "OOP is a framework" is just not worth even arguing against. If he were working with me, I'd make sure the guy got fired. Dead weight. I've had several beers. This means this post is completely honest, if possibly not accurate.... ☺️
  16. It's a rubbish post. OOP is not a framework, as the author claims it is. He / she has clearly not understood what OOP is, what it's supposed to do, how it's supposed to work and when it's NOT best suited to the job. Yes, OOP needs to be used wisely. No, it's not a band-aid for people who still don't know what they're doing. And the claim that OOP does not map to the human mind..... well, darling, back when OOP was introduced it was still better than completely unstructured code. As with all software tools, if you don't understand it, it will hurt you. Statefulness, messaging and concurrent code organisation actually have nothing to do with OOP. Yes, OOP can be used to get those working, but they are completely unrelated topics; each can exist without the other.
  17. If it remains hard to trace, try clearing your compile cache. Works wonders for me sometimes.
  18. In my test (after setting everything to disable debugging), User Events are second only to Queues and Notifiers. They're 20% faster than Channels (High Speed Stream). And Notifiers and User Events are very close in performance: sometimes User Events win, other times Notifiers, depending on which other methods are running concurrently. This is in a VM with only 2 cores.
  19. I would always recommend wiring up the output of the read nodes to an array indicator (outside the sequence, so that it has no effect on timing). Compiler optimisations can do weird things when you don't actually wire up certain outputs. For example, you're not using the data output of the variant read, but you are using the map's. I'm not saying that explains the differences, but I've seen things like that wildly affect performance in the past. I would look at the code, but I'm currently trying to get 2019 installed in a VM. (A C analogy for this dead-code effect appears after this list.)
  20. I'm an avid GoT fan, but I'm waiting for Season 8 to arrive..... 🤣
  21. Even God appears to agree with Shaun.... Mark 1:34 “And he healed many who were sick with various diseases, and cast out many demons. And he would not permit the demons to speak, because they knew him.”
  22. No problems. I understood from the previous post you were teaching LabVIEW as opposed to using LabVIEW to teach something else. All clear now. And either way, the community is always ready to help.
  23. Also, I don't mean to be mean, but are you sure you're in a position to create code to teach people LabVIEW? This isn't advanced LabVIEW. If this is the quality of material students will be using to learn LabVIEW, it might be better not to do it at all. Again, I'm not being mean, I'm being honest (and if anything I'm being diplomatic). Don't you have access to people who have (a lot) more experience in LabVIEW to help you out?
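
Sketch for item 7: a minimal, hypothetical C demonstration of the branch-prediction effect described there. It is not the original Prime number challenge code, and the array size, value range and repetition count are arbitrary. On many CPUs the sorted pass runs several times faster; depending on compiler and optimisation level the branch may be turned into a conditional move and the effect can vanish.

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N    (1 << 20)
    #define REPS 100

    /* Sum every element at or above the threshold; the "if" below is the
     * branch whose predictability we are measuring. */
    static long sum_over_threshold(const int *data, int n)
    {
        long sum = 0;
        for (int i = 0; i < n; i++)
            if (data[i] >= 128)
                sum += data[i];
        return sum;
    }

    /* Run the summation REPS times and report the elapsed CPU time. */
    static double time_reps(const int *data, long *checksum)
    {
        long total = 0;
        clock_t t0 = clock();
        for (int r = 0; r < REPS; r++)
            total += sum_over_threshold(data, N);
        *checksum = total;             /* keep the result observable */
        return (double)(clock() - t0) / CLOCKS_PER_SEC;
    }

    static int cmp_int(const void *a, const void *b)
    {
        return *(const int *)a - *(const int *)b;
    }

    int main(void)
    {
        long chk;
        int *data = malloc(N * sizeof *data);
        if (!data)
            return 1;
        for (int i = 0; i < N; i++)
            data[i] = rand() % 256;    /* values 0..255, threshold 128 */

        printf("unsorted: %f s\n", time_reps(data, &chk)); /* often mispredicted */
        qsort(data, N, sizeof *data, cmp_int);
        printf("sorted:   %f s\n", time_reps(data, &chk)); /* predictable branch */
        printf("checksum: %ld\n", chk);

        free(data);
        return 0;
    }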
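
Sketch for item 8: a small cold-versus-warm read timing in C. The file name "large_test_file.tdms" is a placeholder; point it at any large file. It uses POSIX clock_gettime for wall-clock time, so on Windows you would substitute something like QueryPerformanceCounter. The second pass is typically served from the OS file cache and can be orders of magnitude faster than the first.

    #define _POSIX_C_SOURCE 199309L
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Wall-clock seconds from a monotonic clock (POSIX). */
    static double now_seconds(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec + ts.tv_nsec * 1e-9;
    }

    /* Read the whole file and return the elapsed time in seconds. */
    static double read_whole_file(const char *path)
    {
        char buf[1 << 16];
        double t0 = now_seconds();
        FILE *f = fopen(path, "rb");
        if (!f) { perror(path); exit(EXIT_FAILURE); }
        while (fread(buf, 1, sizeof buf, f) > 0)
            ;                          /* discard data: we only want the time */
        fclose(f);
        return now_seconds() - t0;
    }

    int main(void)
    {
        const char *path = "large_test_file.tdms";  /* placeholder file name */
        printf("first read : %.3f s\n", read_whole_file(path)); /* cold(ish) */
        printf("second read: %.3f s\n", read_whole_file(path)); /* cached */
        return 0;
    }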
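
Sketch for item 12: a rough C walk over a flattened LabVIEW type string. The layout here is an assumption on my part (big-endian, each descriptor starting with a 16-bit length in bytes that includes the length word, followed by a 16-bit type code); the details vary between LabVIEW versions, so verify against NI's "Flattened Data" documentation before relying on it. The bytes in main are placeholders, not a real descriptor.

    #include <stdio.h>
    #include <stdint.h>
    #include <stddef.h>

    /* Read a big-endian 16-bit value. */
    static uint16_t be16(const uint8_t *p)
    {
        return (uint16_t)((p[0] << 8) | p[1]);
    }

    /* Dump offsets, lengths and raw type codes; makes no attempt to
     * interpret the codes. Container types (clusters, arrays) are assumed
     * to embed their element descriptors inside their own length. */
    void dump_type_descriptors(const uint8_t *buf, size_t len)
    {
        size_t pos = 0;
        while (pos + 4 <= len) {
            uint16_t desc_len  = be16(buf + pos);     /* assumed: bytes, incl. itself */
            uint16_t type_code = be16(buf + pos + 2);
            printf("offset %zu: length %u, type code 0x%04X\n",
                   pos, desc_len, type_code);
            if (desc_len < 4)          /* layout doesn't match the assumption: stop */
                break;
            pos += desc_len;
        }
    }

    int main(void)
    {
        /* Placeholder bytes: in practice, feed in the type string obtained
         * from "Flatten Variant to String". */
        const uint8_t example[] = { 0x00, 0x04, 0x00, 0x0A };
        dump_type_descriptors(example, sizeof example);
        return 0;
    }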
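
Sketch for item 19: a hypothetical C analogy for unwired benchmark outputs. If a computed result is never consumed, an optimising compiler may delete the work that produced it, so the loop you think you are timing may not execute at all; writing the result to a volatile sink keeps the work alive, much like wiring a benchmark output to an indicator placed outside the timed sequence.

    #include <stdio.h>

    volatile double sink;   /* consuming results here defeats dead-code elimination */

    /* The work we intend to benchmark. */
    static double timed_work(int n)
    {
        double acc = 0.0;
        for (int i = 0; i < n; i++)
            acc += i * 0.5;
        return acc;
    }

    int main(void)
    {
        double r = timed_work(1000000);

        /* Without the next line an optimising compiler may remove the call
         * above entirely (r is otherwise unused), producing a meaningless
         * timing result. */
        sink = r;

        printf("done\n");
        return 0;
    }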