
desire to learn CS fundamentals


jcarmody


I've been following these forums for over five years now and can't articulate how much I get out of reading the threads. (I credit passing my CLD-R to reading them.) Over the years I've recognized the gulf that exists between my background and many of yours and have, more and more, wanted to narrow it. (I've often felt guilty about participating in a forum for Advanced Architects, because I'm neither advanced nor an architect.) I'd like to begin to change that, however, by studying the fundamentals of Computer Science.

Higher education is changing drastically in many exciting ways (don't get me started on how it looks like a bubble getting ready to burst) and I've begun participating by taking some free, online classes. I'm more than halfway through CS253, Web Application Engineering, on Udacity.com and have signed up for Algorithms, Part 1 and Statistics 1 on Coursera.org starting in August. I've seen many other courses I'd like to take, and I'd like to ask for advice on how to begin.

Specifically, Coursera is going to offer a class on Automata, based on a 100-level course taught at Stanford University. While reading the course description I followed the link to

a free on-line textbook Foundations of Computer Science, written by Al Aho and me [Jeff Ullman], available at http://i.stanford.edu/~ullman/focs.html

I read the introduction and the table of contents and think the book is worth buying (I don't have to, though, 'cuz it's online :)) and the course worth taking.

My questions for you are: will the CS courses mentioned above help me move in my desired direction, and what other resources do you recommend for beginning a Computer Science education? I understand that this will take a lot of effort and time, and I'm open to taking college classes (though I'd prefer not to go that route), online courses, reading books (purchased or online)... just about anything, really. Do you have a favorite book that will help? I'd appreciate any advice you can give.

Thank you.

  • Like 1
Link to comment

As a preface, I did a minor in CS, so I've taken a reasonable number of Computer Science classes at the undergraduate college level. Not surprisingly, none of them were taught in LabVIEW, and some concepts do not translate easily into LabVIEW, but I do find that the CS background helps me identify situations where there's a standard pattern or algorithm that can solve a problem, and also allows me a basic understanding of what LabVIEW is doing "behind the scenes."

The Algorithms class looks like a very good start. A basic understanding of statistics is generally useful for anyone dealing with data in a lab, though it isn't specific to computer science. I'd guess that the Automata class is more advanced, and you may want to finish Algorithms first.

For me at least, there's no substitute for writing code - I won't properly understand an algorithm until I've implemented it myself. However, in my opinion (and I welcome other opinions on this), some patterns are difficult to implement in LabVIEW without first doing them in a text-based language, simply because most courses assume that you'll be using a text-based language. In addition, learning at least one traditional text-based language (I'm thinking C or Java here) is worth your time - using multiple languages makes it easier to think about algorithms in a generic sense rather than an implementation in one specific language. The classic C text, "The C Programming Language" by Kernighan and Ritchie, is a good way to get started with C and if you have the NI Developer Suite then you already have access to a C compiler.

Link to comment

Thanks for your response.

learning at least one traditional text-based language (I'm thinking C or Java here) is worth your time - using multiple languages makes it easier to think about algorithms in a generic sense rather than an implementation in one specific language.

I've got some experience with text programming - BASIC, VB, Fortran, C, Java and now Python. I'm going to go with Python because of its popularity and because it works well with Google's App Engine. I'm going to stay away from C if I can. Python!

Link to comment

As a CS prof once said: To become a good programmer you need to write code every day for two years. To become a great programmer you need to write code every day for ten years OR write code every day for two years and take an algorithms class.

  • Like 2
Link to comment

I personally see no better education at the fundamental level than A Level Computing from the AQA board. This is a UK qualification, and you can take the exam in January or May. The exam is the best part; scoring an A might get you into Howard University.

I wouldn't mind taking A Level ICT and Applied ICT as well.

Link to comment

As a CS prof once said: To become a good programmer you need to write code every day for two years. To become a great programmer you need to write code every day for ten years OR write code every day for two years and take an algorithms class.

Thanks for the comment. I feel better about planning on the Coursera Algorithms course(s).

I personally see no better education at fundamental level that A Level Computing from AQA board. [...]

I looked at their website; it's interesting. Thanks.

I'd seen, and been intrigued by, Structure and Interpretation of Computer Programs at MIT OpenCourseWare before. Do any of you have any experience with this book/course? Scheme would be an interesting language to learn, and it would help me with Emacs/LISP.

Link to comment

I'd seen, and been intrigued by, Structure and Interpretation of Computer Programs at MIT OpenCourseWare before. Do any of you have any experience with this book/course? Scheme would be an interesting language to learn, and it would help me with Emacs/LISP.

I took that class when one of the authors of the book was teaching it. It will certainly teach you Scheme, an interesting if not useful exercise. It explains why I missed recursion in LV2. You should take it so one more person gets the joke when I say sh#t-bang.


Link to comment

Have you taken the Udacity CS101 class? How does it compare to the CS253 class? I took the CS101 then CS212 class, but wound up "dropping" CS212 as I was still just learning Python from CS101 and didn't have the time to catch up. How does Coursera compare?

Link to comment

Have you taken the Udacity CS101 class?

No. It looks interesting, though, and I may preview the class to see how they build a web crawler. I had played with Python before and have been able to google answers to the questions I've had.

I took the CS101 then CS212 class

That looks like one I'd need to take. How far did you get and what did you think of it?

How does Coursera compare?

It hasn't started yet.

Link to comment

That looks like one I'd need to take. How far did you get and what did you think of it?

It was a huge leap from CS101. CS101 was about 3-4 hours of lecture a week and maybe 5 hours of homework. CS212 was (by my recollection) 6-8 hours of lecture, and it was taking me 12-20 hours to do the homework. My main difficulty was my unfamiliarity with Python's libraries and how to use (exploit) them. I had to call it quits at week 2 (of 7) as I didn't have that kind of time to dedicate. I didn't use the forums for CS101 but started using them for CS212 (they really helped with a couple of questions).

I just checked, and I'm still enrolled in the course despite it being past the exam. I'm planning on going back and at least watching the lectures, as there was good content in there.

  • Like 1
Link to comment
  • 1 month later...

Luckily, we don't have to guess. It's called math. :-)

A guesstimate is not a guess; it is an estimate made without having all the facts. Since order of growth is a simplified relational analysis that usually only denotes an upper bound, it is a "guesstimate" however much maths you put around it. It is a very crude way of describing processes.

Link to comment

I understand the things you've both written. I also think I understand that a brute-force approach to the "Three Sum" problem is ~N^3 because it nests three loops and performs N*N*N operations. I can see that repeatedly halving a binary search leads to 'log N', but I'd only be guessing and accepting some maths that I've either forgotten or never learned.
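To make that concrete for myself, here's a quick Python sketch (my own, not the course's code) showing where both counts come from:

def three_sum_count(a):
    # Count triples i < j < k that sum to zero.
    # Three nested loops over N items => ~N*N*N operations.
    n = len(a)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            for k in range(j + 1, n):
                if a[i] + a[j] + a[k] == 0:
                    count += 1
    return count

def binary_search(a, key):
    # Each pass halves the interval [lo, hi] => ~log2(N) passes.
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] < key:
            lo = mid + 1
        elif a[mid] > key:
            hi = mid - 1
        else:
            return mid
    return -1

print(three_sum_count([30, -40, -20, -10, 40, 0, 10, 5]))  # prints 4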

I'm studying Union-Find algorithms this week, too. The programming assignment involves solving a percolation problem. I haven't read the problem statement yet, but the lectures introduced the concept, and I believe I can implement an efficient algorithm because the lectures said that "weighted quick-union with path compression" is the best approach.
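As I understand it from the lectures, the structure looks something like this in Python (a sketch of my own, not the course's reference code):

class WeightedQuickUnion:
    def __init__(self, n):
        self.parent = list(range(n))  # parent[i] == i means i is a root
        self.size = [1] * n           # number of nodes in each root's tree

    def find(self, p):
        root = p
        while root != self.parent[root]:
            root = self.parent[root]
        while p != root:
            # Path compression: point every node on the path at the root.
            self.parent[p], p = root, self.parent[p]
        return root

    def union(self, p, q):
        rp, rq = self.find(p), self.find(q)
        if rp == rq:
            return
        if self.size[rp] < self.size[rq]:
            rp, rq = rq, rp  # weighting: hang the smaller tree under the larger
        self.parent[rq] = rp
        self.size[rp] += self.size[rq]

    def connected(self, p, q):
        return self.find(p) == self.find(q)

The weighting keeps the trees shallow, and the path compression flattens them further, which (per the lectures) is what makes each operation nearly constant time in practice.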

I should probably discuss this in Coursera's forums but I like the people here so much! :D I'm going to keep plugging away on the off chance that something will click and I'll learn something. Thanks for your comments.

Link to comment

Sounds like your course is using Sedgewick's book. Most order-of-growth problems can be attacked in one of two ways: finding loop invariants and counting loop iterations, or writing out a tree for divide-and-conquer algorithms. In the first case you frequently run into the arithmetic series 1+2+3+...+N, which is O(N^2). For the second case, imagine turning the recurrence relation for a simple divide-and-conquer algorithm into a binary tree: the first step divides a problem of size N into two of size N/2, the next level has four problems of size N/4, and so on. The order of the total time is the sum over all the nodes; an easy way to find that sum is to notice that each level sums to N and there are lg(N) levels, so the sum is N lg(N).
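If it helps, you can also watch the N lg(N) fall out of the recurrence numerically. A small sketch of my own (not from the book):

import math

def divide_and_conquer_work(n):
    # T(n) = 2*T(n/2) + n, with T(1) = 0: each level of the
    # recurrence tree costs ~n, and there are ~lg(n) levels.
    if n <= 1:
        return 0
    half = n // 2
    return divide_and_conquer_work(half) + divide_and_conquer_work(n - half) + n

for n in (64, 1024, 65536):
    print(n, divide_and_conquer_work(n), int(n * math.log2(n)))
    # the two counts match exactly for powers of two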

Maybe there would be demand for my half-written book after all. I have implemented many classic algorithms in LV, showing the predicted and measured growth, and detail some nuances of implementing them in a by-value language. I'm still working on some computational geometry.

Link to comment

A guesstimate is not a guess; it is an estimate made without having all the facts. Since order of growth is a simplified relational analysis that usually only denotes an upper bound, it is a "guesstimate" however much maths you put around it. It is a very crude way of describing processes.

Ah. I've always heard the term in the context of "pulling numbers out of dark orifices." As in, "yeah, that timeline I gave my manager was a total guesstimate."
Link to comment

I'm late to this, but I wanted to make a recommendation for a project you should do someday:

Build a compiler for a programming language. Just a small one.

Sounds complicated, right? Really, it's not that bad. At its core, a compiler just translates information from one form to another. I guarantee, though, that you will learn more about how CS is applied than in just about any other project. And, it turns out, a huge number of day to day, real world problems can be solved with the exact same ideas that compilers implement.
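As a taste of how small "small" can be, here's a toy Python sketch (purely illustrative; the names are mine) that compiles arithmetic expressions into instructions for a little stack machine. A tokenizer, a recursive-descent parser, and a code emitter: the same three stages a real compiler has.

import re

def tokenize(src):
    # Split the source into numbers, operators, and parentheses.
    return re.findall(r"\d+|[-+*/()]", src)

def compile_expr(tokens):
    code = []

    def expr():    # expr := term (('+'|'-') term)*
        term()
        while tokens and tokens[0] in "+-":
            op = tokens.pop(0)
            term()
            code.append("ADD" if op == "+" else "SUB")

    def term():    # term := factor (('*'|'/') factor)*
        factor()
        while tokens and tokens[0] in "*/":
            op = tokens.pop(0)
            factor()
            code.append("MUL" if op == "*" else "DIV")

    def factor():  # factor := number | '(' expr ')'
        tok = tokens.pop(0)
        if tok == "(":
            expr()
            tokens.pop(0)  # consume ')'
        else:
            code.append("PUSH " + tok)

    expr()
    return code

print(compile_expr(tokenize("1 + 2 * (3 - 4)")))
# ['PUSH 1', 'PUSH 2', 'PUSH 3', 'PUSH 4', 'SUB', 'MUL', 'ADD']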

There are tons of resources on Google for compiler building.

Good luck!

Joe Z.

  • Like 1
Link to comment

I've seen that, only the line was black and there were fewer data points.

Somehow my mind wandered from here to an old Isaac Asimov short story - The Feeling of Power:

[...]

"Yes. I know," said the little Technician earnestly, "but I start by saying seven times three because that's the way it works. Now seven times three is twenty-one.

"And how do you know that?" asked the congressman.

"I just remember it. It's always twenty-one on the computer. I've checked it any number of times."

"That doesn't mean it always will be though, does it?" said the congressman.

"Maybe not," stammered Aub. "I'm not a mathematician. But I always get the right answers, you see."

"Go on."

I made some progress last night re-reading the material and reviewing some of the lectures, but I may always feel like this technician. :)

There was an instant's silence and then General Weider said, "I don't believe it. He goes through this rigmarole and makes up numbers and multiplies and adds them this way and that, but I don't believe it. It's too complicated to be anything but hornswoggling."
Link to comment

Another project, also quite small but quite revealing, is to write, in any text-based language, a program that emits its own source code.

(I've tried doing this in LabVIEW, and the process is very similar but so tedious as to be overwhelming... you need to build a scripting code generator par excellence.)
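In Python, for instance, one classic approach (among many) is a template string printed through itself:

s = 's = %r\nprint(s %% s)'
print(s % s)

Running it prints exactly those two lines.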

Link to comment

I made some progress last night re-reading the material and reviewing some of the lectures, but I may always feel like this technician. :)

Don't lose hope. In my experience, much of schooling consists of pounding the information into your brain until it becomes rote and at some point along the way that 'aha!' moment happens.

Link to comment

For computer science I would like to recommend the following book:

T.H. Cormen, C.E. Leiserson, R.L. Rivest, and C. Stein. Introduction to Algorithms (3rd edition). MIT Press, 2009. ISBN 0-262-53305-7 (paperback).

It is written by professors from MIT, and you can also watch the lectures:

ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-046j-introduction-to-algorithms-sma-5503-fall-2005/

ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-006-introduction-to-algorithms-spring-2008/

You can also download the video lectures as a torrent: torrentz.eu/search?f=introduction+to+algorithms

Furthermore, the following lecture notes may help you get familiar with the material.

The first set of notes is about automata theory; it contains a few errors because it is not the most recent version (the author has decided to turn it into a book): http://www.win.tue.n.../apbook2010.pdf. The second set is about system validation and is a follow-up to automata theory: http://www.win.tue.n...ted-version.pdf.

Link to comment

Another project, also quite small but quite revealing, is to write, in any text-based language, a program that emits its own source code.

(I've tried doing this in LabVIEW, and the process is very similar but so tedious as to be overwhelming... you need to build a scripting code generator par excellence.)

Ah, quines. If you check the bottom of that article, you'll see there are some people who have way too much time on their hands. Tangentially related, Tupper's self-referential formula is a little bit of the same idea, though there's a bit of cheating using the definition of a constant.

Link to comment
