
bsvingen

Members - 280 posts

Posts posted by bsvingen

  1. I just know from experience that my old programs using GOOP and call-by-reference VIs simply get cleaner, "tighter" and more modular when rewritten with LVOOP and actor principles (after getting familiar with actor-oriented design). IMO LabVIEW is in many ways the "native programming language" version of Simulink or Ptolemy. Where Simulink and Ptolemy are built on top of other languages to be actor-oriented simulation environments, LabVIEW is sort of built from the ground up to be an actor-oriented visual programming language (I don't know if it actually is, but I pretend it is, and it works really well ;) ). LabVIEW is also dataflow, but the main limitation there is not the dataflow itself (call by value) but the implicit synchronization and serialization caused by the wires (easy to work around, but still a limiting factor), and none of the GOOPs changes any of that; they just swap a data wire for a reference wire.

    Anyway, I like to think of it like this: LabVIEW simulates instruments. It is in every respect a 100% configurable software version of a concurrent embedded system of processes that can manipulate any kind of data in any way possible, a virtual instrument. Just as Simulink and Ptolemy can be used to simulate anything, so can LabVIEW. Simulink and Ptolemy force you to think actor-oriented rather than object-oriented (in the traditional sense) or procedural, because there is no other way. LV does not force you to do that, but if you do, LV really shines and so does LVOOP. In this respect LVOOP is a completely new concept altogether: an object-oriented version of actor-oriented design. I am not sure that is entirely correct, but it works much better for me than GOOP.

  2. C++ is no problem, but you have to make wrappers. I am using C++ for lots of stuff:

    http://wiki.lavag.org/DLL/shared_library

    http://forums.lavag.org/C-classes-from-LV-t12905.html

    Depending on the complexity of the C++ classes and code, this can be done very quickly, or it can be much more involved. You only need to write wrappers for the functionality you actually use, and make sure the wrappers are compiled together with (or linked against) the C++ code.
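
    For illustration only, here is a rough sketch of what such a wrapper can look like, using a made-up C++ class and made-up function names; the point is simply that a handful of extern "C" functions compiled into a DLL is all a Call Library Function Node needs:

    // counter_wrapper.cpp -- hypothetical sketch of a flat C wrapper around a
    // C++ class, built into a DLL/shared library for LabVIEW's Call Library
    // Function Node. Export decoration and error handling are omitted.
    class Counter {                       // the C++ class being wrapped
    public:
        explicit Counter(int start) : value_(start) {}
        int increment() { return ++value_; }
    private:
        int value_;
    };

    extern "C" {
        // LabVIEW only ever sees a pointer-sized handle and plain C types.
        void* Counter_Create(int start)       { return new Counter(start); }
        int   Counter_Increment(void* handle) { return static_cast<Counter*>(handle)->increment(); }
        void  Counter_Destroy(void* handle)   { delete static_cast<Counter*>(handle); }
    }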

  3. I suddenly discovered that I never use GOOP any more. Somehow LVOOP and queues seem to do the job better, with better consistency, less coding and less clutter than any GOOP could ever manage. LVOOP is also faster for some reason. The only by-reference objects I have are made in C++, mostly because of complex algorithms that are not easily written in G, not because of a by-reference programming paradigm.

    Anyone else experiencing this?

  4. It was mainly just something I was wondering about. You can get some info here. In the early 90s there was a lot of development of mathematical packages in C++ because of the OOP functionality, but much of it simply died due to the performance penalty, ranging from 20% at best to a factor of 10 compared with FORTRAN (or sometimes C programmed in "FORTRAN fashion"). Templates even things out a bit (a lot, actually) while keeping most of the flexibility of OOP.

  5. One drawback of OOP is the loss of execution speed. In C++, templates can often be used instead of inheritance, offering the same or similar functionality in most cases with no loss of speed. Could this be something for LVOOP? Or would it simply end up the same as a polymorphic VI? Can this be done with LVOOP classes?
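
    Just to illustrate what I mean, in C++ terms (hypothetical example): the template version gives the same "works for any type with the right interface" behaviour as an inheritance hierarchy, but the call is resolved at compile time and can be inlined, so there is no dynamic-dispatch overhead:

    #include <cstdio>

    // Runtime polymorphism: every call goes through the vtable.
    struct Shape { virtual double area() const = 0; virtual ~Shape() {} };
    struct Circle : Shape {
        double r;
        explicit Circle(double radius) : r(radius) {}
        double area() const { return 3.14159265 * r * r; }
    };

    double sum_areas_virtual(const Shape& s, int n) {
        double sum = 0.0;
        for (int i = 0; i < n; ++i) sum += s.area();   // dynamic dispatch each time
        return sum;
    }

    // Compile-time polymorphism: any type with an area() method works,
    // and the call can be inlined -- no virtual dispatch at all.
    struct PlainCircle {
        double r;
        double area() const { return 3.14159265 * r * r; }
    };

    template <typename T>
    double sum_areas_template(const T& s, int n) {
        double sum = 0.0;
        for (int i = 0; i < n; ++i) sum += s.area();   // resolved statically
        return sum;
    }

    int main() {
        Circle c(2.0);
        PlainCircle p = { 2.0 };
        std::printf("%f %f\n", sum_areas_virtual(c, 1000), sum_areas_template(p, 1000));
        return 0;
    }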

  6. I mostly find labelling wires to be confusing and difficult to work with. They don't really show any information that is not already there in most cases, unless you have really long wires. They are free labels, meaning the automatic wiring tools will route wires around them. All in all, I have stopped using them entirely.

    An alternative is to use "Description and Tip..." when right-clicking on the wire. Open the context help with Ctrl-H, and you will see the "label" or description when the mouse is hovering over the wire. An added bonus is that wire branches also get the same description. This is a much better solution IMO.

  7. QUOTE (Val Brown @ Feb 4 2009, 09:41 PM)

    I think that, again, you make one of my main points. What differentiates Matlab from LV is that Matlab is TEXT! So, yes, the text-based "crowd" really likes it and it is taught in a number of CS programs alongside standard C/C++ courses. LV is visual and far more intuitive than any text-based programming constructs UNLESS you've been "brought up" to expect text.

    I think there is more to it than that. There is something called algorithms, and this is where LabVIEW falls flat on the floor. LV is good at expressing dataflow, but terrible at expressing the algorithms operating on that data. FORTRAN is extremely good and incredibly efficient at it, while C/C++ is more than adequate and extremely flexible. Matlab is also extremely good at it, but inefficient enough to be nothing more than a university tool. I mean, what would a general-purpose matrix solver look like in G? Completely incomprehensible, that's what.

    A side note: I downloaded that Scilab-LV thing along with Scilab. There is a new version from December 2008, but it seems broken, giving a runtime error. Has anyone else got it to work?

  8. Java is not open source. It is proprietary and closed, but free (as in free beer). The difference between Java and LV from a "high-level commercial" point of view is in fact price only. But there exists another programming environment, Matlab, that is general-purpose in nature and much more expensive than LV, yet it has a much larger user base. With Matlab it is also possible to make stand-alone executables, yet extremely few use Matlab to make general-purpose programs.

    There is also an open source Matlab "clone" called Scilab. It even has a LabVIEW node that can be used. Everything is open source and free, and it seems to be supported by NI. I tried it a few months back, and it seemed to work perfectly (if only I could find some real-world use for it ;) ).

  9. QUOTE (Mark Yedinak @ Feb 3 2009, 09:03 PM)

    But until NI does a better job at promoting and supporting LabVIEW as a general purpose programming language it will not be seen as one. I think this is the cruxt of this entire debate.

    I am sure that if they did promote and license LV as a general-purpose language the same way Java is licensed, the user base would expand tremendously, to put it mildly. We can only hope, some day :)

  10. With a bit of theoretical work, you can show that any programming paradigm is "perfect" for just about anything. This doesn't mean that a specialized programming language like LV is a general-purpose language just because it uses dataflow. On the other hand, Java is indeed a general-purpose object-oriented language like C++, but Java is absolutely useless for making low-level stuff like drivers, for instance. I mean, the programming paradigm is only one of several pieces that make a language "tick" in the real world.

    I use LV for lots of things as well, but I have grown more and more sceptical about using it for general purposes. The cost is one major issue. If you use it in the lab with equipment costing several million, the cost is easily justified. As a general-purpose language the cost is way too high, considering the alternatives cost nothing (Java, C/C++ etc.).

    I think most people using LV as a general-purpose language started using it in the lab. Then they got more and more professional with it and started using it for other tasks as well (why spend a month learning Qt and some more C++/Java when I can do this in one evening with LV, right?). This is all very fine, but sooner or later, as your code base grows, you will run into practical problems with license cost, portability and maintainability, not necessarily for yourself but for others, for future development and certainly for future distribution. These are problems that naturally surface because you are using a highly specialized proprietary programming language. You cannot even legally read the source without a valid license. For a general-purpose language this is a big NO NO. These are real-world problems that you simply cannot get around until NI starts licensing LV the same way Sun licenses Java (which doesn't seem likely to happen any time soon).

    Anyway, I'm not saying "don't use LV". I think LV is excellent. But it is not a general-purpose language, not by any stretch of the word. If you use it as one, you should be aware that you could face unsolvable problems in the future.

  11. QUOTE (jdunham @ Feb 2 2009, 06:16 AM)

    Dude, the real strength of LabVIEW is the intuitive nature of dataflow programming and the awesome task scheduling engine that allows multithreaded program execution with hardly any management code required. It's an excellent signal processing and control environment. For me it's so much more than an add-on for manipulating NI hardware.

    Yes, but everything about LabVIEW is tuned towards dataflow. This is intuitive for data logging, where data is in fact flowing or streaming, but counterintuitive for just about everything else, where no data is flowing, or at least you would not naturally think in those terms. Classic GOOP is more intuitive in most situations, because the data has stopped flowing; it only changes based on the reference that is flowing. LVOOP is somewhat more tuned towards dataflow than ordinary dataflow, because in addition to the data flowing, the functions and the knowledge of what can be done to that particular data are flowing too (dynamic dispatch).

    Another thing that is counterintuitive, and IMO counterproductive for everything except streaming data, is the left-to-right paradigm. The natural thing to do when you can program in two dimensions is to use those dimensions fully to build abstract programming topologies, or topologies representing real-world topologies, where data and messages flow in all directions. This also requires objects with persistent data storage in the form of GOOP (or, more precisely, reentrant LV2-style globals) rather than LVOOP. Complex programs can be built easily with this "paradigm", because the topology inherent in all software is simply laid out on the screen. This can be done today in LV, but it is cluttered and confusing because of the left-to-right paradigm, and becomes utterly counterintuitive for someone thinking only in dataflow.

    LabVIEW is similar to FORTRAN in this respect. Both can be used to make any kind of program, but each is really only good at one thing: data logging and number crunching respectively. As with FORTRAN, the reason is not an inherent shortcoming in the language structure, but that both are tuned towards those specific tasks.

  12. The original question was why there aren't any open source alternatives to LabVIEW. Please correct me if I am wrong, but as far as I know NI-DAQmx still supports .NET, VB, Visual C/C++, C# and even Matlab straight out of the box (with the DAQ toolbox in Matlab). Using C on Linux is also supported by the drivers. I know many laboratory personnel who have used BASIC with NI-DAQ (the old version) and NI hardware for decades.

    What would an open source initiative add to the world of DAQ? Data acquisition is inherently hardware based and always will be. I have always thought that the real strength of NI is the NI-DAQ drivers, enabling the use of NI hardware for just about everything possible. LabVIEW is just an add-on enabling easy and fast use of NI hardware.

    LabVIEW has grown way beyond being just easy access to NI-DAQ, but as a general programming language it simply falls short of text-based programming, with few exceptions. It is good for two things, data acquisition and user interfaces, but it is utterly closed; you can't even look at the source without installing LabVIEW.

    An open source LabVIEW is certainly doable from a DAQ point of view; all the drivers are there. The "perfect LabVIEW", IMHO, would be a native by-reference, object-based version with seamless C++ integration, more as a user interface builder for C++. But then again, that is more or less exactly what Qt is: it is open source, works on all operating systems, and there should be no problem using NI-DAQ with it :rolleyes:
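
    To illustrate how little is needed to talk to the hardware from a text language, here is a rough sketch using the NI-DAQmx C API from C++ (error checking is left out for brevity, and "Dev1/ai0" is just a placeholder channel name):

    // Read a single analog-input sample through the NI-DAQmx C API.
    // Return codes should be checked in real code; they are ignored here.
    #include <NIDAQmx.h>
    #include <cstdio>

    int main() {
        TaskHandle task = 0;
        float64 value = 0.0;
        int32 samplesRead = 0;

        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                                 -10.0, 10.0, DAQmx_Val_Volts, NULL);
        DAQmxStartTask(task);
        DAQmxReadAnalogF64(task, 1, 10.0, DAQmx_Val_GroupByChannel,
                           &value, 1, &samplesRead, NULL);
        std::printf("read %f V\n", value);

        DAQmxStopTask(task);
        DAQmxClearTask(task);
        return 0;
    }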

  13. QUOTE (Michael_Aivaliotis @ Jan 21 2009, 10:53 PM)

    So far I still haven't heard any valid arguments against releasing it in NI Labs. As the young people say: "All of them are made of FAIL and are pretty lame". Nice try though.

    ;) Maybe so, but there aren't many valid arguments for releasing scripting here either. From a pragmatic point of view it is pretty obvious that NI lacks arguments for releasing it, but has tons of arguments against it. What exactly those arguments are, and whether they are valid, makes no difference to the simple fact that they lack valid arguments for releasing it.

  14. Download File: post-4885-1232315590.zip

    Using C++ classes in LV is something that pops up now and then. Searching the net was almost fruitless, until I found this site. I simplified it a bit to better understand what was going on, and called it from LV. I put it here, with source and a compiled DLL, in case someone else has been searching for days to find out how to do this.

    The trick is of course to build a C wrapper and call it from LV, but that's not the main trick. The main trick is to make the wrapper opaque to the C++ code by typedefing a struct for the class and passing a pointer to that struct. This way none of the C++ class code needs to be touched. I think this can only be done in C++, not in plain C, because C++ doesn't differentiate between classes and structs, while plain C has no classes at all.

    With such a wrapper, the C++ classes can in principle be used like any other GOOP classes.
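
    A stripped-down sketch of the pattern (the class name and functions here are made up, just to show the shape of it):

    // --- filter_wrapper.h : what LabVIEW / plain C sees ------------------------
    // The struct is only forward-declared, so the header is completely opaque.
    #ifdef __cplusplus
    extern "C" {
    #endif

    typedef struct Filter Filter;            /* opaque handle, never dereferenced */

    Filter* Filter_Create(double cutoff);    /* wraps new Filter(cutoff) */
    double  Filter_Process(Filter* f, double sample);
    void    Filter_Destroy(Filter* f);       /* wraps delete */

    #ifdef __cplusplus
    }
    #endif

    // --- filter_wrapper.cpp : compiled into the DLL with the C++ code ----------
    // Because C++ puts classes and structs in the same name space, the forward
    // declaration "struct Filter" above and the real "class Filter" refer to the
    // same type, so the existing class never has to be modified (normally you
    // would just #include its header instead of defining a stand-in here).
    class Filter {
    public:
        explicit Filter(double cutoff) : cutoff_(cutoff), state_(0.0) {}
        double process(double sample) {
            state_ += cutoff_ * (sample - state_);   // trivial one-pole low-pass
            return state_;
        }
    private:
        double cutoff_, state_;
    };

    extern "C" {
    Filter* Filter_Create(double cutoff)             { return new Filter(cutoff); }
    double  Filter_Process(Filter* f, double sample) { return f->process(sample); }
    void    Filter_Destroy(Filter* f)                { delete f; }
    }

    On the LabVIEW side the Filter* simply travels on the wire as a pointer-sized integer between Call Library Function Nodes, which is what makes it behave like a by-reference GOOP object.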

  15. I am wondering if someone can tell how to create new toolkit what do I need,

    Cause I would like to develop an open source PID controller toolkit.

    I have thought about this too. What I have come to is that in 90% of cases, all you need in an app is a straightforward PID controller with a minimum of "bells and whistles". In some cases you probably also need an integrator and a differentiator. The latter two already exist in the math library in LabVIEW, and a simple PID can be made in 5-10 minutes (a minimal sketch is at the end of this post). If you have a good PID, then the code repository on this site is a better place for it IMO.

    If you focus on "what is possible to do" instead of "what do most of us actually need and want", then there really is no limit to the complexity you can make out of this toolkit, and you end up making a simulation toolkit al
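
    For reference, the "straightforward PID" mentioned above is only a few lines of arithmetic per iteration. A minimal textbook sketch (no anti-windup or other bells and whistles), written here in C++ just to show what the G version computes, with the integral and previous error kept in shift registers:

    // Minimal positional PID step; in G this body sits inside the while loop,
    // with 'integral' and 'prev_error' carried in shift registers.
    // Usage sketch: PID pid = {kp, ki, kd, 0.0, 0.0}; call pid.step(...) each iteration.
    struct PID {
        double kp, ki, kd;        // proportional, integral and derivative gains
        double integral;          // running integral of the error
        double prev_error;        // error from the previous iteration

        double step(double setpoint, double measurement, double dt) {
            double error = setpoint - measurement;
            integral += error * dt;
            double derivative = (error - prev_error) / dt;
            prev_error = error;
            return kp * error + ki * integral + kd * derivative;
        }
    };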

  16. No, but there are several reasons for wanting to have this ability. A current one that I am running into is declassification and reuse of LabVIEW code that was developed on a DoD classified computer system. Now for text based languages you can print out the code or look at it with a text editor and verify that there are no classified numbers or information, but currently with LabVIEW this is impossible. There are a few software tools that exist that can automate this process for text based code, but none (that I am aware of) that can do this for LabVIEW.

    If we could convert VIs to XML, we could run the XML through one of these tools and then convert back to LabVIEW, or at least allow the code to be released.

    IMO this is a typical example of something that requires a lot of work and a lot of time, and still all you could realistically hope to achieve (within the constraints of a challenge) is a tool that handles a small subset of all the functions that exist. Besides, such a tool already exists (by the looks of it, at least bits and pieces of one) in the "...vi.lib\_script\XML Scripting" folder in LV 8.2.

    What would be more interesting, and also a lot easier and faster, is to write a compiler that generates VIs from a textual language. Then you are free to design the language syntax and style any way you like, without the restriction that it has to be a textual representation of the VI.

    Nevertheless, I'm much more in favour of challenges that most of us actually have time to do, since the usefulness of any code resulting from a challenge will be very limited at best. I think a good challenge is not one that pushes the limits, but rather one that sparks creative solutions.

  17. I remember I had some LabVIEW courses back in 1993, 94 or 95 (I don't remember exactly), but today I don't have a clue what they consisted of :rolleyes:

    Anyway, if you have programmed in at least one other language before and know some basics about logging, measurement and analysis, I think you will get up to an adequate level very quickly just by working through some examples on your own (as far as I *think* I remember, the courses did not go past any of the examples that can be found in the installation or, these days, online).

    But it depends on what you are going to do. If you are not planning to use LabVIEW in the future, then in your case you will probably be better off with some SCADA system that has all the capabilities built in (so you don't have to make them yourself, which involves learning LabVIEW).

  18. I don't have LabVIEW on this computer, but the Fibonacci numbers are very easy to compute iteratively in LabVIEW. Here is a C version from Wikipedia:

    unsigned fib(unsigned n) {
        int first = 0, second = 1;
        // this while loop exits when n reaches 0
        // it is equivalent to while (n != 0) { n--; ... }
        while (n--) {
            unsigned tmp = first + second;
            first = second;
            second = tmp;
        }
        return first;
    }

    Just make a while loop with three shift registers, one to hold first, one to hold second and one to hold n (well, just make a LV version of the C code).

    Fibonacci numbers are the typical example of recursion because they are defined recursively. But implementing a "standard library Fibonacci routine" with naive recursion is complete insanity, because the iterative version is so much more efficient. F(100) would take years (seriously and literally) to calculate with naive recursion on a PC (coded in C), while it takes nanoseconds iteratively.

    When the recursive routine is simply a straight linear iteration written recursively, there is no difference in efficiency, because both routines scale as O(n) (under the assumption that the cost of the recursive call itself is zero), and the recursive routine can be more compact and "to the point". In LV, however, the cost of a recursive call is far from zero, and the only extra thing the iterative version needs compared with the recursive one is an extra shift register in the loop.

  19. There are some things here that just can't be left standing uncorrected :P

    First: F(0) = 0, not 1.

    Jace, the procedure in your last post does not calculate the Fibonacci numbers; just try it with 4 as input.

    While recursion can be simple to implement (assuming the language supports it), it can be incredibly slow if implemented naively on a non-optimizing compiler. For instance, a naive recursive implementation of the Fibonacci numbers scales roughly as O(2^n), while an iterative procedure scales as O(n). This means that calculating F(32) takes about 32 loop iterations iteratively, but millions of redundant calls recursively (the O(2^n) bound puts the worst case at 4294967296 operations). Even an optimized recursive implementation still does considerably more work than the simple iterative loop. See Wikipedia for more.

    For other things, like trees, that do not have this terrible scaling problem, I still do not see why recursion should be simpler than using shift registers. When the procedure is of order O(n), the function can essentially be used as is in an iterative procedure as well.
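
    To put numbers on that, here is a small sketch that counts the work done by each version (for n = 32 the naive recursion makes roughly seven million calls, while the iterative loop runs 32 times):

    #include <cstdio>

    static unsigned long long calls = 0;       // counts recursive invocations

    // Naive recursion: exponentially many redundant calls.
    unsigned long long fib_recursive(unsigned n) {
        ++calls;
        if (n < 2) return n;                   // F(0) = 0, F(1) = 1
        return fib_recursive(n - 1) + fib_recursive(n - 2);
    }

    // Iteration: n loop passes -- the shift-register version in LV.
    unsigned long long fib_iterative(unsigned n) {
        unsigned long long first = 0, second = 1;
        while (n--) {
            unsigned long long tmp = first + second;
            first = second;
            second = tmp;
        }
        return first;
    }

    int main() {
        const unsigned n = 32;
        std::printf("iterative: F(%u) = %llu after %u iterations\n", n, fib_iterative(n), n);
        std::printf("recursive: F(%u) = %llu after %llu calls\n", n, fib_recursive(n), calls);
        return 0;
    }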
