LabVIEW for the win!



Yesterday at work our software manager, a long-time text-based programmer, came into the office where we sat and declared, "You win. I admit it." We asked what he meant, and he said he had just realized that he had been programming graphically in his text-based language (FORTH).

It seems he was writing a piece of test code for an embedded system that would place 16 values on a bus and toggle a read/write line for each bus value. Wanting to keep the code as simple as possible, i.e. no loops, he simply coded it directly, like the pseudo-code below:

    Write value 1 to bus
    Toggle line
    Write value 2 to bus
    Toggle line
    ...
    Write value 16 to bus
    Toggle line

Since the toggling of the line was controlled by directly setting a register bit, he realized he could simply write a macro to lay down that code. Of course he now needed a name for the macro, and since FORTH would let him use any characters other than colon, period, space, and a couple of similar ones, and he wanted it to be clear that the code was test-only, he chose the name "_/~\_" for easy identification. (That is a square wave in ASCII.) The compiler was of course happy with this, since one binary pattern is as good as another to a computer. He continued along happily, completed the initial version of his code, and then broke for lunch, quite pleased with himself for his creative naming.

At this point I should say that he has taken LabVIEW Basics I & II so that he can assist with code reviews, but he preferred text-based programming over graphical. From time to time we (the LV programmers) and they (the text-based programmers) have engaged in good-natured ribbing for fun.

Back to the story: after lunch our happy text-based programmer sits down with his test code to continue development. Looking at his code, he thinks to himself, "How clever I am to have used a picture of a square wave for my macro's name." It is at this point that he realizes he has started programming graphically, just like LabVIEW. Since he doesn't want to wait until WE realize this during a code review or debug session, because that would give us too much fuel for the ribbing, he opts for the only honorable thing... admit that graphical programming wins.

After having seen so many "LabVIEW Sucks" type posts over the years, I just had to share this story.

One interesting thought we had while talking about this, though: to the computer, using a series of bits read as text by humans to represent a "block of code" is no different from using a series of bits read as a picture by humans to represent a "block of code". The difference ultimately exists in the mind of the programmer, not in the language/computer constructs.

Jason


If you feel like giving some positive feedback (rather than ribbing), you can quote Alfred North Whitehead on the importance of a good notation.

By relieving the brain of all unnecessary work, a good notation sets it free to concentrate on more advanced problems, and in effect increases the mental power of the race.

He could have been writing about LabVIEW instead of mathematics.

Of course any programming language is a graphical language at least to the extent that it uses math. The meaning of the four glyphs "40+2" is the same in English, Chinese, Russian and French.

LabVIEW just takes this to a higher level. When I am programming in some text-based language, I often wonder how much more difficult (or at least inconvenient) it must be for someone who does not speak English to use that programming language.


Thanks for that positive story.

Actually, it is hard to believe the attitude of text-based programmers who look at graphical code as a kind of 'toy'.

I work in a very tight interdisciplinary team. When we talk about electronics, we look at the schematics and not at text. If we need a mechanical construction, we also look at drawings. And whenever we design some optics, we have beams, surfaces and lenses drawn. And when something is measured/characterized, it's also graphically plotted.

So really weird to do text-based software. :frusty:

One interesting thought we had while talking about this, though: to the computer, using a series of bits read as text by humans to represent a "block of code" is no different from using a series of bits read as a picture by humans to represent a "block of code". The difference ultimately exists in the mind of the programmer, not in the language/computer constructs.

Here I would disagree. There is a very important difference between data-flow languages and all other languages.

At first glance this is a bit hidden in the compiler (there are some good explanations on Wikipedia, and of course some pretty detailed posts, mainly by AQ):

In a text-based language I can write

i=i+1

where the data before and after this operation is stored at the same memory location.

In LV you can't. Each wire (upstream and downstream of the +1 prim) points to a unique memory location.
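
As a rough text-language analogue of that contrast, here is a minimal Java sketch (variable names are made up; plain locals stand in for the two styles):

    public class InPlaceVsDataflow {
        public static void main(String[] args) {
            // In-place, text-style update: the value before and after the
            // operation shares one storage location; i is simply overwritten.
            int i = 41;
            i = i + 1;
            System.out.println("in-place: i = " + i);

            // Dataflow-style, value-producing update: the increment yields a
            // new value, so the upstream value is left untouched and both
            // remain readable afterwards.
            final int upstream = 41;
            final int downstream = upstream + 1;
            System.out.println("value-style: " + upstream + " -> " + downstream);
        }
    }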

This dataflow paradigm has several consequences:

  • Parallelism is native :wub:
  • We have a duality of by-val (native) and by-ref, while text-based compilers are limited to pointers :throwpc:/by-ref.

Felix


Thanks for that positive story.

Here I would disagree. There is a very important difference between data-flow languages and all other languages.

At first glance this is a bit hidden in the compiler (there are some good explanations on Wikipedia, and of course some pretty detailed posts, mainly by AQ):

In a text-based language I can write

i=i+1

where the data before and after this operation is stored at the same memory location.

In LV you can't. Each wire (upstream and downstream of the +1 prim) points to a unique memory location.

This dataflow paradigm has several consequences:

  • Parallelism is native :wub:
  • We have a duality of by-val (native) and by-ref, while text-based compilers are limited to pointers :throwpc:/by-ref.

Felix

I think the point is about representing the bits in a way that is meaningful to us humans. I completely agree that the difference between dataflow and everything else is huge! But dataflow and parallelism can be implemented, however unintuitively, in any text-based language; they are just much more difficult to see at a glance.
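
A minimal Java sketch of that point, with invented names: the two independent branches have to be spelled out through an explicit thread pool, whereas on a LabVIEW diagram they would simply be two unwired code paths sitting side by side.

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class ExplicitParallelism {
        public static void main(String[] args) throws Exception {
            ExecutorService pool = Executors.newFixedThreadPool(2);

            // Two independent computations that could run concurrently; the
            // parallelism only exists because we asked for a pool explicitly.
            Future<Integer> sum = pool.submit(() -> 40 + 2);
            Future<Integer> product = pool.submit(() -> 6 * 7);

            System.out.println("sum = " + sum.get() + ", product = " + product.get());
            pool.shutdown();
        }
    }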

But I think the philosophical principle of Linguistic Relativity applied to programming languages is more interesting than what happens after the compiler gets done with the code.


... When we talk about electronics, we look at the schematics and not at text. If we need a mechanical construction, we also look at drawings. And whenever we design some optics, we have beams, surfaces and lenses drawn. And when something is measured/characterized, it's also graphically plotted.

...

Felix

When I first saw LV these were my very first thoughts. It made complete sense.

When we talk about electronics, we look at the schematics and not at text. If we need a mechanical construction, we also look at drawings. And whenever we design some optics, we have beams, surfaces and lenses drawn. And when something is measured/characterized, it's also graphically plotted.

So really weird to do text-based software. :frusty:

I like that!


Here I would disagree. There is a very important difference between data-flow languages and all other languages.

At first glance this is a bit hidden in the compiler (there are some good explanations on Wikipedia, and of course some pretty detailed posts, mainly by AQ):

In a text-based language I can write

i=i+1

where the data before and after this operation is stored at the same memory location.

In LV you can't. Each wire (upstream and downstream of the +1 prim) points to a unique memory location.

NO!

If you program in such a way that the wire before the +1 primitive isn't branched at all, the LabVIEW compiler will reuse that memory location.

Likewise, it's important to remember that if you are adding a scalar and an array using the '+' primitive, you should wire the array to the upper input of the '+': the compiler tries to use the same memory location for the output as for the upper input (and likewise for '-' and '*').

Ton


If you program in such a way that the wire before the +1 primitive isn't branched at all the LabVIEW compiler will reuse that memory location.

You are right in this simplified example. The compiler can optimize this code.

But the conclusion (parallelism) implies branching (into multiple threads!).

Probably it gets a bit clearer if I extend the example:

j=i+1

foo(i)

i=j

This seems perfectly sane if i and j are ints. If they are pointers (e.g. to objects), we already have a race-condition disaster. Furthermore, I see no way a compiler could prevent this. The programmer will need to take care (e.g.

foo(i)

my_i=i.clone //this should happen before returning -> caller might change the object in the next line

return

)
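
To make the hazard concrete, here is a small Java sketch (class and variable names are invented for the example): a worker thread receives the mutable object by reference while the caller keeps mutating it, and the cure is to take a private copy before the caller's next update, the same idea as the clone inside foo above.

    final class Counter {
        int value;
        Counter(int value) { this.value = value; }
        Counter copy() { return new Counter(value); }  // the "clone" step
    }

    public class RaceSketch {
        public static void main(String[] args) throws InterruptedException {
            Counter i = new Counter(41);

            // Race-prone: the worker shares the very object the caller mutates
            // below, so it may observe either 41 or 42 depending on scheduling.
            Counter shared = i;
            Thread racy = new Thread(() -> System.out.println("racy worker sees " + shared.value));

            // Safe: take a private copy *before* the caller touches i again.
            Counter snapshot = i.copy();
            Thread safe = new Thread(() -> System.out.println("safe worker sees " + snapshot.value));

            racy.start();
            safe.start();

            i.value = i.value + 1;  // the caller's "i = j" style update

            racy.join();
            safe.join();
        }
    }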

Felix

