
Do I have to learn 'C'?


alecjcook

Recommended Posts

I'm about to take my LabVIEW architect exam. LabVIEW is my first programming language, and I do not know C. People keep telling me I should learn a 'real' language, something I'm sure many other LabVIEW programmers have been told. But should we all really learn C? Does anyone here know C and find it makes them a better LabVIEW programmer?

Every C programmer I have met who has moved across to LabVIEW seems to struggle somewhat compared to the native LabVIEW programmer. I fear learning C is going to make me a worse LabVIEW programmer. Moral support or berating gladly accepted...
Cheers, Alec

Link to comment

The most irritating part of people saying "You should learn a real programming language like <insert language here>" is that none of them (at least in my experience) has ever worked with LabVIEW for more than a few hours. Some even say things like that after seeing LabVIEW code for the very first time.

Now to your question: I guess it does not hurt to understand the basics of textual languages. With that knowledge you can even give advice to the people spouting stuff like the above (I like how they hate it :P). A few years ago I watched the video series from Stanford on YouTube (Programming Paradigms and Programming Methodology), which are really great and gave good insight into how a computer actually works. I recommend looking into them.

 

I don't program in C though (by far too slow for me ;)). Instead I've learned C#, which is very useful when testing .NET components (or even writing one to use in LabVIEW). In the end my programming in LabVIEW did not change much; quite the opposite, LabVIEW changed the way I write programs in C# :lol:

 

Maybe all the textual programmers should consider learning 'G' instead :D

Edited by LogMAN
Link to comment

I'm biased (as are probably most on here).  Professionally I've only ever done LabVIEW and I've seen no career-limiting issues.  In college CS101/102 was Java, I've learned basic C++ with an Arduino, and I've done some minor .NET and C#, but I don't put those last two on a resume.

 

That being said, I think knowing only LabVIEW is doing a disservice to yourself, and to NI.  LabVIEW isn't perfect: other languages do some things better than LabVIEW, and some things are easier to do in LabVIEW than in traditional languages.  Learning the pros and cons of each can help you make decisions about the right tool to use, and can inform discussions with those who tell you that you need to learn C.

Link to comment

LabVIEW is not my first language (my first was BASIC on a Radio Shack Color Computer 2), but it is the only one I have learned well enough to use professionally.

 

No, you don't have to learn it. Can it be beneficial? Yes. Personally, I interface with a lot of hardware and have been working on user interfaces. I've not programmed in other languages to do this, but I certainly have had to read other languages to find out how to do something. There have been times when I was trying to solve a problem that (at least at the time) had no ready-made solution in LabVIEW; looking up algorithms in different languages and understanding what was being done, and why it was done that way in that language, allowed me to write an efficient routine in LabVIEW.

 

For full disclosure, I have been taking an online course on C#.

Link to comment

My usual retort to this sort of jibe is:

"Real programmers use a number of languages and preferably one that isn't over 40 years old-do you also know Algol?"

 

Learning another language won't make you a better LabVIEW programmer; they are generally different paradigms, so they require a different thought process. What it will enable you to do is fill in the gaps in LabVIEW's capabilities and/or leverage other code.

 

My preferred "alt" is actually Free Pascal, for which I use CodeTyphon, but Code::Blocks is my mainstay for C/C++, since C/C++ programmers are ten a penny (it's generational, so the numbers are dwindling) and there are a lot of projects written in C/C++ that are must-haves for me.

 

I personally wouldn't jump straight to C as next in line after LabVIEW. I would probably opt for Python, which has a much more vibrant and growing community even though it is (ahem) interpreted. It is also a much better option for web services (where all software is heading), so it will have much more future relevance. Some might argue for JavaScript, but as far as I'm concerned that's just scripted and interpreted C with obfuscation features (AKA client-side PHP). Python seems more thoughtfully designed and yields more elegant code.

 

TL;DR

If you want to compile DLLs (there be dragons), learn a bit of C. If you want to learn another language, learn Python.

Link to comment

Ah, that's a big one. Forgot to include it in my diagram.

 

Well, it depends. If you are a LabVIEW island in a C [sic] of programmers, then you only have to learn that you *must* specify the DLL function prototypes and keep shouting "Threadsafe, Threadsafe, Threadsafe" until their ears bleed :D No need to learn C/C++ then :P
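
To make that concrete, here is a minimal sketch of what the C side of such a DLL might look like (the function name and parameters are invented for illustration). The prototype is exactly what has to be mirrored in the Call Library Function Node configuration, and if the function touches no shared state it can be marked reentrant (that's the "Threadsafe" bit) so LabVIEW doesn't serialize every call through the UI thread:

    /* hypothetical_dll.c -- an invented example of a function exported
     * for LabVIEW's Call Library Function Node. */
    #include <stdint.h>

    #ifdef _WIN32
    #define EXPORT __declspec(dllexport)
    #else
    #define EXPORT
    #endif

    /* The calling convention, parameter types, and array-length handling
     * here must match the Call Library Function Node settings exactly. */
    EXPORT int32_t sum_array(const double *data, int32_t len, double *result)
    {
        double acc = 0.0;
        for (int32_t i = 0; i < len; i++)
            acc += data[i];
        *result = acc;
        return 0; /* status code: 0 = success */
    }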

 

I think every LabVIEW programmer should have a pet C programmer ;)

Edited by ShaunR
Link to comment

Learning more programming languages can only make you a better programmer, not a worse one. Sure, C programmers are often lousy LabVIEW programmers at first because the model is so different, but that doesn't mean they don't (or can't) eventually become proficient, and that's certainly not unique to LabVIEW. C programmers also tend to write poor code in functional languages when they first get started; for that matter, they struggle with Matlab's model for vectorizing array operations (I say this having learned BASIC, then C, and struggled initially with both LabVIEW and Matlab in college).

 

If you're going to learn another language, C is a good choice because it's:

- simple (very few keywords)

- been around a long time, still in use, stable and well-established

- the basis for so many other languages (even in LabVIEW: where did the funny % format specifiers come from? The C scanf/printf functions; see the short example after this list)

- the standard for defining the interface for functions in shared libraries (not just DLLs on Windows, but on other platforms too)
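
As a small illustration of those last two points, here's a toy C snippet (the values are made up) showing the printf/scanf-family specifiers that LabVIEW's Format Into String and Scan From String borrowed:

    /* format_demo.c -- the % specifiers LabVIEW inherited from C */
    #include <stdio.h>

    int main(void)
    {
        int count = 42;
        double voltage = 3.14159;

        /* %d = decimal integer, %.2f = fixed-point with 2 digits,
         * %e = scientific notation, %s = string -- all familiar
         * from LabVIEW's Format Into String */
        printf("count=%d voltage=%.2f (%e) unit=%s\n",
               count, voltage, voltage, "V");

        /* the scanf family uses the same specifiers for parsing,
         * just like Scan From String */
        int parsed;
        if (sscanf("123", "%d", &parsed) == 1)
            printf("parsed=%d\n", parsed);
        return 0;
    }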

 

I'll leave aside the "real language" question, other than to note that whether a language is text-based is irrelevant. Very early computer programming involved literally connecting wires, and the people who were proficient at that probably thought C wasn't "real" programming either ;) (yes, I know I'm skipping years of development there, including punch cards and assembly)

Link to comment

To beat a dead horse:

 

You live in a land where people speak English.  Everyone around you speaks English, so it's totally ok if you don't learn another language.

 

If you want to visit another place where people speak a language completely foreign to you (let's say Japan), and you want to communicate effectively, you may want to learn that language first.  You'll find things you've never considered, like having no future tense or conjugating adjectives, commonplace in this new language.  There are things which can be communicated much more effectively there, but some things you won't be able to say at all, or will take forever to explain.

 

Do you have to learn Japanese? Not really, I'm sure people there would love to practice English, and you could get by.  Would it help you understand their point of view? Probably.

Link to comment
