Just someone who needs a more graphical explanation. :-)

Big-O notation is all about asking the question, "As my data set gets bigger, how much longer will this operation take?" Consider two decks of cards: a standard 52-card playing deck and an Uno deck with 108 cards.
Both of them are in boxes. I can take each deck out of its box in about the same amount of time -- I just dump all the cards out into my hand at once. Since it takes the same amount of time no matter how big the deck is, we call this constant time: as the deck gets bigger, the operation still takes the same amount of time. Obviously there's an upper bound in this example, since at some point the deck will be too big for my hand, but we're talking theory here. :-) The time is some constant k, and since the shape of the graph is the real point of interest, it doesn't matter what number we pick, so this is called O(1).
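Here's a minimal Python sketch of that idea (take_out_of_box is a made-up function, just for illustration): "dumping the deck out" is a single step no matter how many cards are in it.

```python
# Hypothetical illustration: "taking the deck out of the box" is one
# step, regardless of deck size -- O(1).
def take_out_of_box(boxed_deck):
    return boxed_deck  # a single constant-time step

standard = list(range(52))
uno = list(range(108))

hand = take_out_of_box(standard)  # same cost for 52 cards...
hand = take_out_of_box(uno)       # ...as for 108
```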
If I sit at my desk and flip each card into the trash can, one per second, it'll take me 52 seconds for the standard deck and 108 seconds for the Uno deck. That's an operation that grows linearly -- as the size of the deck gets bigger, the time to do the operation goes up as a pure multiplier. In this case, the multiplier is "1 second". If we say that the size of the deck is N, then the function to find the time is k*N. Or, since the shape of the graph is all we care about, we can ignore the constant, and this is just O(N).
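As a sketch (again with a made-up function name), the flipping loop looks like this -- one fixed-cost step per card:

```python
# One step per card: total time is k * N, which is O(N) once the
# constant k is dropped.
def flip_into_trash(deck):
    trash = []
    for card in deck:        # runs N times, where N = len(deck)
        trash.append(card)   # each flip costs the same amount
    return trash
```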
Ok. I tossed all the cards into a trash can one by one. Now I need them back in sorted order. Oh dear. Now, I could get lucky -- I could scoop them off the floor and discover that by sheer chance I picked them up in order. That's the best case. But big-O is all about the *worst* case. In the worst case, I have to pick up the cards and sort them. Suppose I were to scoop up all the cards and then leaf through them to find the 2 of Clubs and put that on top of the deck, then look through for the 3 of Clubs and put that on top, and repeat for every card. I could be very unlucky -- when I search for each card, it might be the very last card in the deck. If the cards are exactly in reverse order, worst case, then for each card I have to look through all the remaining cards. Ugh. The early searches are slow (there are lots of cards left to look through), and the later ones get faster; they average out, so each of the N searches looks through about N/2 cards, and N * N/2 = N^2/2. Again, we don't care about the constant, so this is just O(N^2).
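That find-the-next-card procedure is basically selection sort. A minimal sketch (assuming the cards compare with <):

```python
# Selection-sort sketch of the search above: for each position, scan
# the remaining cards for the smallest one. Worst case is roughly
# N * N/2 comparisons -- O(N^2).
def selection_sort(deck):
    deck = list(deck)  # don't mutate the caller's deck
    for i in range(len(deck)):
        smallest = i
        for j in range(i + 1, len(deck)):  # leaf through the rest
            if deck[j] < deck[smallest]:
                smallest = j
        deck[i], deck[smallest] = deck[smallest], deck[i]
    return deck
```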
Searching functions are generally O(N) for unsorted data and O(log N) for sorted data. Sorting functions are generally O(N^2) for dumb-but-easy-to-implement algorithms and O(N log N) for intelligent-but-often-have-off-by-one-errors algorithms.
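The classic O(log N) search on sorted data is binary search -- each look throws away half of the remaining cards, so a sorted deck of N cards takes about log2(N) looks instead of N. A sketch (and yes, this is exactly the kind of code that attracts off-by-one errors):

```python
# Binary search: halve the remaining range each step -- O(log N).
def binary_search(sorted_deck, target):
    lo, hi = 0, len(sorted_deck) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_deck[mid] == target:
            return mid
        elif sorted_deck[mid] < target:
            lo = mid + 1   # target must be in the upper half
        else:
            hi = mid - 1   # target must be in the lower half
    return -1  # not found
```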
We call any function that is N^2, or N^3, or N to any power a "polynomial time" function. These are generally tractable, useful algorithms. We call any algorithm that is 2^N or 3^N or any other constant raised to the N "exponential". These often require more time than we have available to solve the problem. Then there are the N^N algorithms, or the N-star algorithms, which generally cannot be solved in the lifetime of the universe. N-star means that if there is 1 item, the algorithm takes 1 second; 2 items takes 2^2 seconds; 3 items takes 3^3^3 seconds; 4 items takes 4^4^4^4 seconds; and so on. Ugh. :-)
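To see why those last classes are hopeless, here's a quick sketch that prints the step counts for a few small N (assuming one step per second, as above):

```python
# Compare growth rates for a few small values of N. Exponential and
# N^N blow up long before the polynomials do.
for n in [1, 5, 10, 20]:
    print(f"N={n:>2}  N^2={n**2:>4}  N^3={n**3:>5}  2^N={2**n:>8}  N^N={n**n:.2e}")
```

Already at N=20 the N^N column is past 10^26 steps -- and that's before we even get to N-star.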
Luckily, we don't have to guess. It's called math. :-)