
The Future of Programming



I can't help but get excited, because everything this guy talks about resonates with what NI has been trying to do with LabVIEW. The fact that people rejected Fortran also made me smile a little, because the same type of people will reject LabVIEW for similar reasons. Not that I can't be stubborn at times; I still don't use the auto-tool.

Link to comment
Awesome, thank you Michael for posting this.

Actor Framework - 1973. Who knew!  :cool:

Actually, there are many ideas in parallel programming that were not viable on older hardware, and so were declared dead. I'm finding lots of nuggets worth developing in the older CS research. Actors are very old, but at the time there was no compiler that could generate good code for such a setup.
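
For anyone who hasn't bumped into the actor idea before, here is a minimal sketch in Python (my own toy example with made-up names like Counter, not how NI's Actor Framework or any real actor runtime works): each actor owns its state and only ever touches it from its own message loop, so messages replace locks.

import queue
import threading

class Counter:
    """A toy actor: private state, mutated only by its own message loop."""
    def __init__(self):
        self.count = 0
        self.mailbox = queue.Queue()   # the actor's mailbox
        self.thread = threading.Thread(target=self._run)
        self.thread.start()

    def send(self, msg):
        """The only way the outside world interacts with the actor."""
        self.mailbox.put(msg)

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg == "stop":
                break
            if msg == "increment":
                self.count += 1
            elif msg == "report":
                print("count =", self.count)

actor = Counter()
for _ in range(3):
    actor.send("increment")
actor.send("report")   # prints: count = 3
actor.send("stop")
actor.thread.join()

Serialising every state change through the mailbox is the whole trick; the hard part back then was generating efficient code for thousands of these.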


I found it quite interesting that the presenter, through his clever time-travelling presentation, made us realise that we've been stuck following a path of relatively narrow development in the computer science field. I always find it astonishing to read that the entrepreneurs of yesteryear were sometimes way ahead of their time. So far ahead that we can still learn from their profound ideas today.


I've heard it said that Von Neumann's architecture is the single worst thing ever to happen to computer technology. Von Neumann gave us a system that Moore's Law applied to, so economics pushed every hardware maker into the same basic setup, even though there were other hardware architectures that might have given us acceleration even faster than Moore's Law. At the very least, we would have benefited from more hardware research that broke that mold instead of just making the same mold go faster. All research into the alternate structures dried up until we had exhausted the Von Neumann setup, and now the mold is hard to break because of the amount of software that would never be ported to those alternate architectures.

 

Don't know how true it is, but I like blaming the hardware for yet another software problem. ;-) 


I'm not an expert on computer science, so my opinion doesn't carry much weight, but I'll stand by that argument any day  :lol:

 

I think the inadvertent development of massively parallel architectures in the relatively isolated field of GPUs caught most manufacturers by surprise. Although there's a massive amount of momentum that needs to be reversed, they're almost falling over each other to catch up, with the realisation that this architectural arrangement presents an easier path of progression for processor development in general.

 

Bret ends his presentation by pronouncing the design of a massive array of interconnected processors to be the future (the future as seen from the 1980s), which strikes a remarkable parallel (no pun intended) with our expectations of where CPU design is destined to take us in the coming decade. It's been a long time coming, it seems...  :thumbup1:
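
As a toy illustration of that massively parallel direction, here is a hedged Python sketch that fans independent work out across however many cores the machine has; simulate() and its inputs are made up for the example:

from multiprocessing import Pool

def simulate(sample):
    # Stand-in for an independent unit of work (made up for this example).
    return sample * sample

if __name__ == "__main__":
    with Pool() as pool:                # one worker per CPU core by default
        results = pool.map(simulate, range(16))
    print(results)

Each worker is, in miniature, one node in that array of interconnected processors, and the dataflow style LabVIEW users already write maps onto that kind of hardware far more naturally than sequential code does.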

The point about Von Neumann's architecture is exactly true. I remember all of the ideas in that presentation from back IN that time, many from even before it, and I have been amazed by the linearity of much of the rest of CS since then.


There is a Kickstarter project, NoFlo, for dataflow programming with JavaScript underneath the diagram.

http://www.kickstarter.com/projects/noflo/noflo-development-environment
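
For anyone new to flow-based programming, here is a bare-bones sketch of the concept in Python (this is emphatically not how NoFlo itself is implemented, just the general idea): components are black boxes, and the only coupling between them is the queues their packets travel through.

import queue
import threading

def component(process, inbox, outbox):
    """Run a black-box process on every packet arriving on inbox."""
    def run():
        while True:
            packet = inbox.get()
            if packet is None:        # sentinel: end of stream
                outbox.put(None)
                break
            outbox.put(process(packet))
    threading.Thread(target=run).start()

# Wire a two-stage graph: source -> double -> add one -> sink
a, b, c = queue.Queue(), queue.Queue(), queue.Queue()
component(lambda x: x * 2, a, b)
component(lambda x: x + 1, b, c)

for n in [1, 2, 3]:
    a.put(n)
a.put(None)

while True:
    out = c.get()
    if out is None:
        break
    print(out)                        # prints 3, 5, 7

Rewiring the queues changes the program without touching the components, which is exactly the property a diagram-level editor like NoFlo (or LabVIEW) trades on.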

 

AQ and other NI staff: it is surprising that LabVIEW is not mentioned at all on this Wikipedia page: http://en.wikipedia.org/wiki/Flow-based_programming


No one at NI can edit those pages -- it would be a conflict of interest under Wikipedia rules. Please update the entry yourself if you believe it can be improved.

 

And, while we're on the subject, everyone please consider adding a recurring donation to Wikipedia. They have now set up $3/month recurring donations, which makes it easy to support one of the most valuable databases on the planet.

I will look into how to add LabVIEW. It does indeed seem like a good idea to donate some $.

I tend not to edit Wikipedia myself, as I find my technical prose is usually inadequate. However, I just took a glance at the LabVIEW entry on Wikipedia and I am horrified. It contains inaccuracies and irrelevancies, it's choppy, and it is generally poorly assembled. Too many cooks here, I feel.

I wonder who is responsible for this content? I understand that the general public are expected to maintain these pages, but is there a core group of enthusiasts who like to keep this page accurate?

I suspect not, because the leading paragraph still stated that the current release is LabVIEW 2012, which tells me there are no LabVIEW enthusiasts who are also keen on keeping the page up to date. (I changed it to 2013 - my first Wiki alteration!)

