jcarmody Posted June 30, 2012

I've been following these forums for over five years now and can't articulate how much I get out of reading the threads. (I credit passing my CLD-R to reading them.) Over the years I've recognized the gulf that exists between my background and many of yours and have, more and more, wanted to narrow it. (I've often felt guilty about participating in a forum for Advanced Architects, because I'm neither.) I'd like to begin to change that, however, by studying the fundamentals of Computer Science.

Higher education is changing drastically in many exciting ways (don't get me started on how it looks like a bubble getting ready to burst) and I've begun participating by taking some free online classes. I'm better than half-way through CS253, Web Application Engineering, on Udacity.com and have signed up for Algorithms, Part 1 and Statistics 1 on Coursera.org starting in August. I've seen many other courses I'd like to take, and I'd like to ask for advice on how to begin.

Specifically, Coursera is going to offer a class on Automata, based on a 100-level course taught at Stanford University. While reading the course description I followed the link to a free online textbook, Foundations of Computer Science, "written by Al Aho and me" [Jeff Ullman], available at http://i.stanford.edu/~ullman/focs.html. I read the introduction and table of contents and think it's worth buying (don't have to, though, 'cuz it's online), and the course worth taking.

My questions for you are: will the CS courses mentioned above help me move in my desired direction, and what other resources do you recommend for beginning a Computer Science education? I understand that what I'm going to do will take a lot of effort and time, and I'm open to taking college classes (though I don't prefer that route), online courses, reading books (purchased or online)... just about anything, really. Do you have a favorite book that will help?

I'd appreciate any advice you can give. Thank you.
ned Posted June 30, 2012

As a preface, I did a minor in CS, so I've taken a reasonable number of Computer Science classes at the undergraduate level. Not surprisingly, none of them were taught in LabVIEW, and some concepts don't translate easily into LabVIEW, but I do find that the CS background helps me identify situations where there's a standard pattern or algorithm that can solve a problem, and it also gives me a basic understanding of what LabVIEW is doing "behind the scenes."

The Algorithms class looks like a very good start. A basic understanding of Statistics is generally useful for anyone dealing with data in a lab, although it isn't specific to computer science. I'm going to guess that the Automata class is more advanced, and you may want to finish Algorithms first.

For me at least, there's no substitute for writing code - I won't properly understand an algorithm until I've implemented it myself. However, in my opinion (and I welcome other opinions on this), some patterns are difficult to implement in LabVIEW without first doing them in a text-based language, simply because most courses assume that you'll be using one. In addition, learning at least one traditional text-based language (I'm thinking C or Java here) is worth your time - using multiple languages makes it easier to think about algorithms in a generic sense rather than as an implementation in one specific language. The classic C text, "The C Programming Language" by Kernighan and Ritchie, is a good way to get started with C, and if you have the NI Developer Suite then you already have access to a C compiler.
jcarmody Posted June 30, 2012

Thanks for your response.

"learning at least one traditional text-based language (I'm thinking C or Java here) is worth your time - using multiple languages makes it easier to think about algorithms in a generic sense rather than as an implementation in one specific language."

I've got some experience with text programming - BASIC, VB, Fortran, C, Java and now Python. I'm thinking I'm going to go with Python because of its popularity and because it works well with Google's App Engine. I'm going to stay away from C if I can. Python!
Darin Posted June 30, 2012

As a CS prof once said: To become a good programmer you need to write code every day for two years. To become a great programmer you need to write code every day for ten years, OR write code every day for two years and take an algorithms class.
KWaris Posted June 30, 2012

I personally see no better education at the fundamental level than A Level Computing from the AQA board. This is a UK board qualification, and you can take the exam in January or May. The exam is the best part; scoring an A might get you into Howard University. I wouldn't mind taking A Level ICT and Applied ICT as well.
jcarmody Posted June 30, 2012

"As a CS prof once said: To become a good programmer you need to write code every day for two years. To become a great programmer you need to write code every day for ten years, OR write code every day for two years and take an algorithms class."

Thanks for the comment. I feel better about planning on the Coursera Algorithms course(s).

"I personally see no better education at the fundamental level than A Level Computing from the AQA board. [...]"

I looked at their website; it's interesting. Thanks.

I'd seen, and been intrigued by, Structure and Interpretation of Computer Programs at MIT OpenCourseWare before. Do any of you have any experience with this book/course? Scheme would be an interesting language to learn, and it would help me with Emacs/LISP.
Darin Posted June 30, 2012

"I'd seen, and been intrigued by, Structure and Interpretation of Computer Programs at MIT OpenCourseWare before. Do any of you have any experience with this book/course? Scheme would be an interesting language to learn, and it would help me with Emacs/LISP."

I took that class when one of the authors of the book was teaching it. It will certainly teach you Scheme, an interesting if not useful exercise. It explains why I missed recursion in LV2. You should take it so one more person gets the joke when I say sh#t-bang.
Tim_S Posted July 2, 2012

Have you taken the Udacity CS101 class? How does it compare to the CS253 class? I took the CS101 then CS212 class, but wound up "dropping" CS212 as I was still just learning Python from CS101 and didn't have the time to catch up. How does Coursera compare?
jcarmody Posted July 2, 2012

"Have you taken the Udacity CS101 class?"

No. It looks interesting, though, and I may preview the class to see how they build a web crawler. I had played with Python before and have been able to google answers to the questions I've had.

"I took the CS101 then CS212 class"

That looks like one I'd need to take. How far did you get and what did you think of it?

"How does Coursera compare?"

It hasn't started yet.
Tim_S Posted July 2, 2012

"That looks like one I'd need to take. How far did you get and what did you think of it?"

It was a huge leap from CS101. CS101 was about 3-4 hours of lecture a week and maybe 5 hours of homework. CS212 was (by my recollection) 6-8 hours of lecture, and it was taking me 12-20 hours to do the homework. My main difficulty was familiarity with the libraries in Python and how to use (exploit) them. I had to call it quits at week 2 (of 7) as I didn't have that kind of time to dedicate.

I didn't use the forums for CS101 and started using them for CS212 (they really helped with a couple of questions). I just checked and I'm still enrolled in the course despite it being past the exam. I'm planning on going back and at least watching the lectures, as there was good content in there.
jcarmody Posted August 13, 2012

Algorithms Part 1 began today... What do you call a programmer that doesn't understand order-of-growth calculations? Hopeless?
ShaunR Posted August 13, 2012

"Algorithms Part 1 began today... What do you call a programmer that doesn't understand order-of-growth calculations? Hopeless?"

Perhaps you are just more used to the older terminology of "guesstimate".
Aristos Queue Posted August 13, 2012

"What do you call a programmer that doesn't understand order-of-growth calculations? Hopeless?"

Just someone who needs a more graphical explanation. :-)

Big-O notation is all about asking the question, "As my data set gets bigger, how much longer will this operation take?"

Consider two decks of cards: a standard 52-card playing deck and an Uno deck with 108 cards. Both of them are in boxes. I can take each deck out of its box in about the same amount of time -- I just dump all the cards out into my hand at once. Since it takes the same amount of time no matter how big the deck is, we call this constant time. As the deck gets bigger, the operation takes the same amount of time. Obviously there's an upper bound in this example, since at some point the deck will be big enough that it's bigger than my hand, but we're talking theory here. :-) The time is some constant k. Since the shape of the graph is the real point of interest, it doesn't matter what number we pick, so this is called O(1).

If I sit at my desk and flip each of the cards into the trashcan, one per second, it'll take me 52 seconds for the standard deck and 108 seconds for the Uno deck. That's an operation that grows linearly -- as the size of the deck gets bigger, the time to do the operation goes up as a pure multiplier. In this case, the multiplier is "1 second". If we say that the size of the deck is N, then the function to find the time is k*N. Or, since the shape of the graph is all we care about, we can ignore the constant, and this is just O(N).

Ok. I tossed all the cards in a trash can one by one. Now I need them back in sorted order. Oh dear. Now, I could get lucky -- I could scoop them off the floor and discover that by sheer chance I have picked them up in order. That's the best case. But big-O is all about the *worst* case. In the worst case, I have to pick up the cards and sort them. Suppose I were to scoop up all the cards and then leaf through them to find the 2 of Clubs, and put that on top of the deck. Then look through for the 3 of Clubs and put that on top, and repeat for every card. I could be very unlucky -- when I search for each card, it might be the very last card in the deck. If the cards are exactly in reverse order, worst case, then for each card I have to look through all the remaining cards. Ug. The first few cards are very fast (there aren't many cards to look through), but it gets slower as I go. The early cards and the later cards average out, so for N/2 cards, you have to look through N/2 existing cards. N/2 * N/2 = N^2/4. Again, we don't care about the constant, so this is just O(N^2).

Searching functions are generally O(N) for unsorted data and O(log N) for sorted data. Sorting functions are generally O(N^2) for dumb-but-easy-to-implement algorithms and O(N log N) for intelligent-but-often-have-off-by-one-errors algorithms.

We call any function that is N^2, or N^3, or N to any power a "polynomial time function". These are generally tractable, useful algorithms. We call any algorithm that is 2^N or 3^N or any other constant raised to the N "exponential". These often require more time than we have to solve the problem. Then there are the N^N algorithms, or the N-star algorithms, which generally cannot be solved in the life of the universe. N-star means: if there is 1 item, the algorithm takes 1 second; 2 items takes 2^2 seconds; 3 items takes 3^3^3; 4 items takes 4^4^4^4; and so on. Ug. :-)
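If it helps to see the same three cases in code, here's a rough Python sketch, with the decks as plain lists (the function names are made up for the example):

    from random import shuffle

    standard = list(range(52))   # a 52-card playing deck
    uno = list(range(108))       # a 108-card Uno deck

    def top_card(deck):
        # O(1): peeking at one card takes the same time for any deck size
        return deck[0]

    def discard_one_by_one(deck):
        # O(N): one step per card, so the Uno deck takes roughly twice as long
        count = 0
        for _ in deck:
            count += 1
        return count

    def selection_sort(deck):
        # O(N^2): for each position, scan all remaining cards for the smallest
        deck = list(deck)
        for i in range(len(deck)):
            j = min(range(i, len(deck)), key=deck.__getitem__)
            deck[i], deck[j] = deck[j], deck[i]
        return deck

    shuffle(standard)
    assert selection_sort(standard) == sorted(standard)

Doubling the deck size leaves top_card unchanged, doubles discard_one_by_one, and roughly quadruples selection_sort -- which is the whole point of the notation.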
"Perhaps you are just more used to the older terminology of "guesstimate""

Luckily, we don't have to guess. It's called math. :-)
ShaunR Posted August 13, 2012

"Luckily, we don't have to guess. It's called math. :-)"

A guesstimate is not a guess. It is an estimate made without having all the facts. As Order of Growth is a simplified relational analysis and usually only denotes an upper bound, it is a "guesstimate" however much maths you put around it. It is a very crude way of describing processes.
jcarmody Posted August 13, 2012

I understand the things you've both written. I also think I understand that a brute-force approach to the "Three Sum" problem is ~N^3 because it nests three loops and operates N*N*N times. I guess I understand that working out a binary search process will lead to 'log N', but I'd only be guessing and accepting some maths that I've either forgotten or never learned.

I'm studying Union-Find algorithms this week, too. The programming assignment involves solving a percolation problem. I haven't read the problem statement, but the lectures introduced the concept and I believe I can implement an efficient algorithm because the lecture told me that "weighted quick-union with path compression" is the best approach. I should probably discuss this in Coursera's forums, but I like the people here so much! I'm going to keep plugging away on the off chance that something will click and I'll learn something. Thanks for your comments.
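For anyone else following along, here is a rough Python sketch of what I understand "weighted quick-union with path compression" to mean -- my own transcription of the idea, not the course's code (the course itself works in Java), with names chosen to match the lecture terminology:

    class WeightedQuickUnion:
        def __init__(self, n):
            self.parent = list(range(n))  # every site starts as its own root
            self.size = [1] * n           # tree sizes, used for weighting

        def find(self, p):
            # walk up to the root, halving the path as we go (one-pass
            # path compression: point each node at its grandparent)
            while p != self.parent[p]:
                self.parent[p] = self.parent[self.parent[p]]
                p = self.parent[p]
            return p

        def union(self, p, q):
            root_p, root_q = self.find(p), self.find(q)
            if root_p == root_q:
                return
            if self.size[root_p] < self.size[root_q]:
                root_p, root_q = root_q, root_p  # weighting: hang the smaller
            self.parent[root_q] = root_p         # tree under the larger root
            self.size[root_p] += self.size[root_q]

        def connected(self, p, q):
            return self.find(p) == self.find(q)

Weighting keeps the trees shallow (depth at most lg N), and path compression flattens them further every time find runs, which is why the combination is nearly linear over a sequence of N operations.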
Darin Posted August 13, 2012

Sounds like your course is using Sedgewick's book. Most order-of-growth problems can be attacked in one of two ways: finding loop invariants and counting loop iterations, or writing out a tree for divide-and-conquer algorithms. In the first case you run into the arithmetic series 1+2+3+...+N, which is O(N^2), a lot. For the second case, imagine turning the recurrence relation for a simple divide-and-conquer algorithm into a binary tree: the first step divides a problem of size N into two of size N/2, the next level has four problems of size N/4, and so on. The order of the total time is the sum over all the nodes; an easy way to find the sum is to notice that each level sums to N and there are lg(N) levels, so the sum is N*lg(N).

Maybe there would be demand for my half-written book after all. I have implemented many classic algorithms in LV, showing the predicted and measured growth, and detail some nuances of implementations in a by-value language. Still working on some computational geometry.
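A concrete instance of that second case is merge sort -- here as a plain Python sketch (illustrative, not taken from the book):

    def merge_sort(a):
        # T(N) = 2*T(N/2) + O(N): two half-size subproblems plus a linear
        # merge. Each level of the recursion tree does N total work and
        # there are lg(N) levels, so the whole thing is O(N log N).
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):  # the O(N) merge at this level
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]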
Aristos Queue Posted August 13, 2012

"A guesstimate is not a guess. It is an estimate made without having all the facts. As Order of Growth is a simplified relational analysis and usually only denotes an upper bound, it is a "guesstimate" however much maths you put around it. It is a very crude way of describing processes."

Ah. I've always heard the term in the context of "pulling numbers out of dark orifices." As in, "yeah, that timeline I gave my manager was a total guesstimate."
jzoller Posted August 13, 2012

I'm late to this, but I wanted to make a recommendation for a project you should do someday: build a compiler for a programming language. Just a small one.

Sounds complicated, right? Really, it's not that bad. At its core, a compiler just translates information from one form to another. I guarantee, though, that you will learn more about how CS is applied than in just about any other project. And, it turns out, a huge number of day-to-day, real-world problems can be solved with the exact same ideas that compilers implement. There are tons of resources on Google for compiler building.

Good luck!

Joe Z.
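To give a feel for how small "small" can be, here is a toy Python sketch of the idea: a recursive-descent compiler from infix arithmetic to a made-up stack machine (every name here is invented for the illustration):

    import re

    def tokenize(src):
        return re.findall(r"\d+|[-+*/()]", src)

    def compile_expr(tokens, out):      # expr := term (('+'|'-') term)*
        compile_term(tokens, out)
        while tokens and tokens[0] in "+-":
            op = tokens.pop(0)
            compile_term(tokens, out)
            out.append("ADD" if op == "+" else "SUB")

    def compile_term(tokens, out):      # term := factor (('*'|'/') factor)*
        compile_factor(tokens, out)
        while tokens and tokens[0] in "*/":
            op = tokens.pop(0)
            compile_factor(tokens, out)
            out.append("MUL" if op == "*" else "DIV")

    def compile_factor(tokens, out):    # factor := number | '(' expr ')'
        tok = tokens.pop(0)
        if tok == "(":
            compile_expr(tokens, out)
            tokens.pop(0)               # discard the ')'
        else:
            out.append(("PUSH", int(tok)))

    def run(program):
        # interpret the "object code" on a tiny stack machine
        stack = []
        ops = {"ADD": lambda a, b: a + b, "SUB": lambda a, b: a - b,
               "MUL": lambda a, b: a * b, "DIV": lambda a, b: a / b}
        for instr in program:
            if isinstance(instr, tuple):
                stack.append(instr[1])
            else:
                b, a = stack.pop(), stack.pop()
                stack.append(ops[instr](a, b))
        return stack.pop()

    code = []
    compile_expr(tokenize("2*(3+4)-5"), code)
    print(code, "=>", run(code))        # => 9

Tokenizer, parser, code generator, and a target machine, in about forty lines; a real compiler is the same pipeline with each stage grown up.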
jcarmody Posted August 13, 2012

"Sounds like your course is using Sedgewick's book."

Yes, Sedgewick is using his own book for this course.

"Maybe there would be demand for my half-written book after all."

I'd read it.
Darin Posted August 14, 2012

"The programming assignment involves solving a percolation problem."

One of the examples I have worked out leads to a plot which may become familiar to you in the next few days.
jcarmody Posted August 14, 2012

I've seen that, only the line was black and there were fewer data points. Somehow my mind wandered from here to an old Isaac Asimov short, The Feeling of Power:

[...] "Yes, I know," said the little Technician earnestly, "but I start by saying seven times three because that's the way it works. Now seven times three is twenty-one."

"And how do you know that?" asked the congressman.

"I just remember it. It's always twenty-one on the computer. I've checked it any number of times."

"That doesn't mean it always will be though, does it?" said the congressman.

"Maybe not," stammered Aub. "I'm not a mathematician. But I always get the right answers, you see."

"Go on."

I made some progress last night re-reading the material and reviewing some of the lectures, but I may always feel like this technician:

There was an instant's silence and then General Weider said, "I don't believe it. He goes through this rigmarole and makes up numbers and multiplies and adds them this way and that, but I don't believe it. It's too complicated to be anything but hornswoggling."
Aristos Queue Posted August 15, 2012

Another project, also quite small but quite revealing: in any text-based language, write a program that emits its own source code. (I've tried doing this in LabVIEW, and the process is very similar but so tedious as to be overwhelming... you need to build a scripting code generator par excellence.)
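For a taste of what that looks like in a text-based language, one classic two-line Python form (there are many known variants) is:

    s = 's = %r\nprint(s %% s)'
    print(s % s)

Running it prints exactly those two lines: the %r substitutes the string's own repr, quotes and escape sequences included, and the %% collapses to a single %.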
Jordan Kuehn Posted August 15, 2012

"I made some progress last night re-reading the material and reviewing some of the lectures, but I may always feel like this technician."

Don't lose hope. In my experience, much of schooling consists of pounding the information into your brain until it becomes rote, and at some point along the way that 'aha!' moment happens.
Wouter Posted August 15, 2012

For computer science I would like to recommend the book:

T.H. Cormen, C.E. Leiserson, R.L. Rivest and C. Stein. Introduction to Algorithms (3rd edition). MIT Press, 2009. ISBN 0-262-53305-7 (paperback).

It was written by professors at MIT, and you can also watch the lectures:

ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-046j-introduction-to-algorithms-sma-5503-fall-2005/
ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-006-introduction-to-algorithms-spring-2008/

You can also download the video lectures as a torrent: torrentz.eu/search?f=introduction+to+algorithms

Further, the following lecture notes may help you get familiar with the material. The first set is about automata theory; it contains a few errors because it is not the most recent version, as the author has decided to make a book from it: http://www.win.tue.n.../apbook2010.pdf. The second set is about system validation and is a follow-up to automata theory: http://www.win.tue.n...ted-version.pdf.
asbo Posted August 15, 2012

"Another project, also quite small but quite revealing: in any text-based language, write a program that emits its own source code."

Ah, quines. If you check the bottom of that article, you'll see there are some people who have way too much time on their hands. Tangentially related, Tupper's self-referential formula is a little bit of the same idea, though there's a bit of cheating in the definition of a constant.
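For reference, the formula in question is usually quoted as the inequality below (reproduced from memory, so worth double-checking against the article):

    \frac{1}{2} < \left\lfloor \mathrm{mod}\!\left( \left\lfloor \frac{y}{17} \right\rfloor 2^{\,-17\lfloor x \rfloor - \mathrm{mod}(\lfloor y \rfloor,\, 17)},\ 2 \right) \right\rfloor

The "cheating" is the constant: plotting the points that satisfy the inequality over 0 <= x < 106 and k <= y < k + 17 reproduces an image of the formula itself, but only for one specific 543-digit integer k that encodes the bitmap.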