Wire Warrior Posted April 9, 2011

Yesterday at work our software manager, a long-time text-based programmer, came into the office where we sat and declared, "You win. I admit it." We asked what he meant, and he said that he had just realized he had programmed graphically in his text-based language (FORTH).

It seems he was writing a piece of test code for an embedded system that would place 16 values on a bus and toggle a read/write line for each bus value. Desiring to keep the code as simple as possible, i.e. no loops, he simply coded it directly like the pseudo-code below:

    Write value 1 to bus
    Toggle line
    Write value 2 to bus
    Toggle line
    ...
    Write value 16 to bus
    Toggle line

Since the toggling of the line was controlled by directly setting a register bit, he realized that he could simply make a macro to lay down that code. Of course he now needed a name for the macro, and since FORTH would let him use any characters other than colon, period, space, and a couple of similar ones, and he wanted it to be clear that the code was test-only, he chose to name it "_/~\_" for easy identification. (This is a square wave in ASCII.) The compiler was of course happy with this, since one binary pattern is as good as another to a computer. He continued along, happily completing the initial version of his code, then broke for lunch quite pleased with himself for his creative naming.

At this point I do have to say he has taken LabVIEW Basics I & II so that he can assist with code reviews, but he preferred text-based programming over graphical. From time to time we (the LV programmers) and they (the text-based programmers) have engaged in good-natured ribbing for fun.

Back to the story: after lunch our happy text-based programmer sits down with his test code to continue development. Looking at his code he thinks to himself, "How clever I am to have used a picture of a square wave for my macro's name." It is at this point he realizes that he has started programming graphically, just like LabVIEW. Since he doesn't want to wait until WE realize this during a code review or debug session, because that would give us too much fuel for the ribbing, he opts for the only honorable thing... admit that graphical programming wins.

After having seen so many "LabVIEW Sucks" posts over the years, I just had to share this story. One interesting thought we had in talking about this: to the computer, using a series of bits read as text by humans to represent a "block of code" is no different from using a series of bits read as a picture by humans to represent a "block of code". The difference ultimately exists in the mind of the programmer, not in the language/computer constructs.

Jason
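P.S. For the non-FORTH folks, the word looked roughly like the sketch below. I'm making up the details here: on the real target the stores go to memory-mapped registers, so two variables stand in for them so the snippet runs anywhere; only the name "_/~\_" is the real one.

    \ Hypothetical sketch: BUS-REG and CTRL-REG stand in for the real
    \ memory-mapped bus and control registers of the embedded target.
    VARIABLE BUS-REG                     \ stand-in for the bus data register
    VARIABLE CTRL-REG                    \ stand-in for the register holding the R/W line bit

    : _/~\_  ( value -- )                \ word named after an ASCII square wave
        BUS-REG !                        \ place the value on the bus
        CTRL-REG @  1 XOR  CTRL-REG !    \ toggle the read/write line bit
    ;

    \ Usage: lay down the sixteen write/toggle pairs with no loop
    1 _/~\_   2 _/~\_   3 _/~\_          \ ...and so on up to the 16th value

Each "value _/~\_" pair on the page then reads like one pulse of the waveform it produces, which is what gave the game away.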
SteveChandler Posted April 10, 2011

If you feel like mixing some positive feedback in with the ribbing, you can quote Alfred Whitehead on the importance of a good notation:

"By relieving the brain of all unnecessary work, a good notation sets it free to concentrate on more advanced problems, and in effect increases the mental power of the race."

He could have been writing about LabVIEW instead of mathematics. Of course any programming language is a graphical language, at least to the extent that it uses math. The meaning of the four glyphs "40+2" is the same in English, Chinese, Russian and French. LabVIEW just takes this to a higher level. When I am programming in some text-based language I often wonder how much more difficult (or at least inconvenient) it must be for someone who does not speak English to use that programming language.
Black Pearl Posted April 10, 2011

Thanks for that positive story. Actually it is hard to believe the attitude of text-based programmers looking at graphical code as a kind of 'toy'. I work in a very tight interdisciplinary team. When we talk about electronics, we look at the schematics and not at text. If we need a mechanical construction, we also look at drawings. And whenever we design some optics, we have beams, surfaces and lenses drawn. And when something is measured/characterized, it's also graphically plotted. So it's really weird to do text-based software.

One interesting thought we had in talking about this: to the computer, using a series of bits read as text by humans to represent a "block of code" is no different from using a series of bits read as a picture by humans to represent a "block of code". The difference ultimately exists in the mind of the programmer, not in the language/computer constructs.

Here I would disagree. There is a very important difference between dataflow languages and all other languages. At first this is a bit hidden in the compiler (there are some good explanations on Wikipedia, and of course some pretty detailed issues posted mainly by AQ): in a text-based language I can write i=i+1, where the data before and after this operation is stored at the same memory location. In LV you can't. Each wire (upstream and downstream of the +1 primitive) points to a unique memory location. This dataflow paradigm has severe consequences:

- Parallelism is native.
- We have a duality of by-val (native) and by-ref, while text-based compilers are limited to pointers/by-ref.

Felix
SteveChandler Posted April 10, 2011

Here I would disagree. There is a very important difference between dataflow languages and all other languages. ... This dataflow paradigm has severe consequences: parallelism is native; we have a duality of by-val (native) and by-ref, while text-based compilers are limited to pointers/by-ref.

I think the point is about the representation of the bits in a meaningful way to us humans. I completely agree that the difference between dataflow and everything else is huge! But dataflow and parallelism can be implemented, however unintuitively, in any text-based language. They are obviously much more difficult to see at a glance. But I think the philosophical principle of linguistic relativity applied to programming languages is more interesting than what happens after the compiler gets done with the code.
PaulG. Posted April 10, 2011

... When we talk about electronics, we look at the schematics and not at text. If we need a mechanical construction, we also look at drawings. And whenever we design some optics, we have beams, surfaces and lenses drawn. And when something is measured/characterized, it's also graphically plotted. ... Felix

When I first saw LV, these were my very first thoughts. It made complete sense.
crelf Posted April 12, 2011

When we talk about electronics, we look at the schematics and not at text. If we need a mechanical construction, we also look at drawings. And whenever we design some optics, we have beams, surfaces and lenses drawn. And when something is measured/characterized, it's also graphically plotted. So it's really weird to do text-based software.

I like that!
Ton Plomp Posted April 16, 2011

In a text-based language I can write i=i+1, where the data before and after this operation is stored at the same memory location. In LV you can't. Each wire (upstream and downstream of the +1 primitive) points to a unique memory location.

NO! If you program in such a way that the wire before the +1 primitive isn't branched at all, the LabVIEW compiler will reuse that memory location.

Likewise, it's important to remember that if you are adding a scalar and an array with the '+' primitive, you should wire the array to the upper input of the +: the compiler tries to use the same memory location for the output as for the upper input (the - and * primitives behave the same way).

Ton
Black Pearl Posted April 17, 2011

If you program in such a way that the wire before the +1 primitive isn't branched at all, the LabVIEW compiler will reuse that memory location.

You are right in this simplified example; the compiler can optimize this code. But the conclusion (parallelism) implies branching (into multiple threads!). Probably it gets a bit clearer if I extend the example:

    j = i + 1
    foo(i)
    i = j

This seems perfectly sane if i and j are ints. If they are pointers (e.g. to objects), we already have a race-condition disaster. Furthermore, I see no way a compiler could prevent this. The programmer will need to take care, e.g.:

    foo(i) {
        my_i = i.clone   // this should happen before returning ->
                         // the caller might change the object in the next line
        return
    }

Felix