Wednesday, December 3, 2014

Week 10: This will be brief.

We learned big-Oh proofs for general functions this week, so that we can say things like "f is in O(g)" for arbitrary functions f and g (I've jotted down the general definition at the end of this entry) and proceed to amaze our friends. Then we moved on to halting functions...
...
...
... ???????????????????????
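
For my own reference, the general definition we were working with (as best I can reconstruct it; the exact quantifier conventions in the course notes may differ) was roughly: for non-negative functions f and g defined on the natural numbers,

f ∈ O(g) ⟺ ∃ c ∈ ℝ⁺, ∃ B ∈ ℕ, ∀ n ∈ ℕ, n >= B ⇒ f(n) <= c·g(n)

i.e., past some breakpoint B, f is upper-bounded by a constant multiple of g.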

Week 9: It wasn't me, it was Netflix

9-10pm on November 3 was not my finest hour. I really should have gone over the hints given in class; I had completely forgotten about them! To give you the general idea, the hour before the assignment 2 deadline involved a lot of frantic scribbling, angry muttering directed at inanimate objects, and various large gestures of exasperation and defeat. Good thing I found a cubicle at the library, or I would have been exposed as a crazy person.

What led to this madness? Well as predicted, numerous Walking Dead episodes were involved (don't blame me, my roommate was the enabler!). Zombies are indeed gross.

So, you might say...
assume I did badly on an assignment
   then I either was subscribed to Netflix, or I was not subscribed to Netflix #A or not A
   case 1: I was subscribed to Netflix
      then I watched too many Netflix shows #this naturally follows because I have no self-restraint
      then Netflix distracted me from the assignment
      then it was Netflix's fault
   case 2: I was not subscribed to Netflix
      then I was suffering from Netflix withdrawal #that happens, right?
      then Netflix-lessness distracted me from the assignment
      then it was Netflix's fault
   then in either case, it was Netflix's fault
then I did badly on an assignment implies that it was Netflix's fault #sounds reasonable to me!
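
In course-speak, that's just proof by cases: let N be "I was subscribed to Netflix" and F be "it was Netflix's fault". N ∨ ¬N is a tautology, I argue N ⇒ F and ¬N ⇒ F, so F follows in either case, and since all of this happens under the assumption that I did badly on an assignment, I get the implication on the last line.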


This week in lecture, we learned big-Oh proofs... calculus was involved. I felt comfortable with most of the things taught in class, but sometimes the examples were hard to follow when I was just seeing (many, many, many) symbols on a screen and hadn't worked out the problem myself. Generally, I find that the lectures are a good introduction to the material, the tutorials provide very useful practice for consolidating what I've learned and for asking questions, and the course notes are the reason why I have to purchase a new pair of glasses this year (but they're still very helpful!).

Week 8: Let's be formal

This week we learned the formal definition of big O.

f(n) ∈ O(n^2) ⟺ ∃ c ∈ ℝ⁺, ∃ B ∈ ℕ, ∀ n ∈ ℕ, n >= B ⇒ f(n) <= cn^2

This means that f(n) is in O(n^2) if and only if after a certain breakpoint B (that is, for all values of n >= B), the graph of f(n) is upper-bounded by the graph of cn^2 (where n^2 is multiplied by a big enough constant factor c). In other words, f(n) eventually grows at most as fast as n^2, if not slower.

Similarly, we can define the lower-bound, when f(n) eventually grows at least as fast as n^2:

f(n) ∈ Ω(n^2) ⟺ ∃ c ∈ ℝ⁺, ∃ B ∈ ℕ, ∀ n ∈ ℕ, n >= B ⇒ f(n) >= cn^2

and the tight bound, when f(n) grows at the same rate as n^2:

f(n) ∈ Θ(n^2) ⟺ ∃ c1 ∈ ℝ⁺, ∃ c2 ∈ ℝ⁺, ∃ B ∈ ℕ, ∀ n ∈ ℕ, n >= B ⇒ c1·n^2 <= f(n) <= c2·n^2

 
Thus, we can say that 7n^2 is in Θ(n^2). From the graph below, we can see that 7n^2 is upper-bounded by 10n^2 and lower-bounded by n^2.

[Graph: y = n^2, y = 7n^2, and y = 10n^2 plotted together, with 7n^2 lying between n^2 and 10n^2.]
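
Sketching the proof with constants I picked myself (the ones from lecture may differ): take c2 = 10 and B = 0, so for all n >= 0 we have 7n^2 <= 10n^2, which gives 7n^2 ∈ O(n^2); take c1 = 1, so 7n^2 >= n^2 for all n >= 0, which gives 7n^2 ∈ Ω(n^2). Together, 7n^2 ∈ Θ(n^2).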
We also calculated the number of steps an algorithm takes depending on the input size n in the worst case scenario, which turned out to be not as difficult as I had imagined. Of course, we were only looking at simple algorithms. I'm curious now as to what we will be learning in CSC236.
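
As a sanity check on the step counting, here is a toy example of my own (not one from the course) written in Python: linear search through a list of n items.

def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent."""
    for i in range(len(items)):    # the loop runs at most len(items) times
        if items[i] == target:     # one comparison per iteration
            return i               # best case: found right away
    return -1                      # worst case: target absent, all n iterations run

print(linear_search([5, 3, 9, 1], 9))   # 2  (found on the 3rd iteration)
print(linear_search([5, 3, 9, 1], 7))   # -1 (worst case: 4 iterations)

In the worst case the loop body runs n times, plus a constant amount of work outside the loop, so the number of steps is in O(n).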

Tuesday, December 2, 2014

Week 7: :O :O :O

This week we finished up proof techniques and moved on to program complexity, mainly big O and the number of steps of an algorithm.

This is the first real thing that relates mathematical reasoning to computer science, which makes it interesting for me. Since my old brick of a laptop was rather slow and outdated, I understand well the importance of running time analysis. I still find it strange that big O ignores constant factors and lower-order terms, so that 100000n^3 + n is considered the same as 0.1n^3, even though in "real running time" this makes a difference (at least it did on my brick).

I guess the key thing to be aware of is not to confuse the number of steps a program takes with the time it takes to run on some hardware. The same program will be slow on a slow computer and fast on a fast computer, but its running time in this sense does not differ between computers: it is not measured in units of time, but in units related to the number of steps, determined solely by the algorithm. Specifically, running time analysis looks at how quickly the number of steps carried out by the algorithm grows as the input size grows. Relating this to real life, we often want to look through a large number of items in a short time, such as when you google "computer science" and Google proudly reports that it found about 272,000,000 results in 0.52 seconds. When we are counting steps for large input sizes, only the highest-order term matters, and the rest contribute relatively little. Therefore, it is more useful to say that 100000n^3 + n grows no faster than n^3, asymptotically.
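
To convince myself with the formal definition (using constants I chose myself): for all n >= 1, n <= n^3, so 100000n^3 + n <= 100000n^3 + n^3 = 100001n^3. Taking c = 100001 and B = 1 therefore shows that 100000n^3 + n is in O(n^3).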

I guess now we're going to have to prove that some algorithm is O(n)? Nooooooooooooooooooooooo..............

Week 6: So many proofs

This week we learned proofs involving non-Boolean functions, such as floor and ceiling, and proofs about limits, which I found a little challenging. (I am also deeply distraught that proof by illegibility is not valid. Fine... you caught me.)

If I thought the definition of a limit was confusing, the proof of it is something else entirely. Last week, we learned that

lim (x → 2) x^2 = 4

can be defined as

∀ ε > 0, ∃ δ > 0, ∀ x ∈ ℝ, | x - 2 | < δ ⇒ | x^2 - 4 | < ε

so that as δ gets smaller, x approaches 2, and as ε gets smaller, f(x) = x^2 approaches 4. To prove this, we must show that for all positive values of ε, there is a positive value δ such that if | x - 2 | < δ then | x^2 - 4 | < ε. The key to this proof is to choose an appropriate δ, in terms of ε, that allows us to transition from | x - 2 | < δ to | x^2 - 4 | < ε. This is a complicated step, and it involves some clever manipulations. It took several minutes of stupefied staring at the lecture notes to fully understand the choice of δ for this proof. After further stupefied staring at the mass of confusion that is the course notes, I began to see that writing proofs is not just about thinking logically; it involves searching outside the box for a creative solution, then presenting it logically.
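
For my own notes, the usual choice (the exact constants in the lecture version may differ) is δ = min(1, ε/5): if | x - 2 | < δ <= 1, then 1 < x < 3, so | x + 2 | < 5, and therefore | x^2 - 4 | = | x - 2 |·| x + 2 | < 5δ <= ε.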





Sunday, November 30, 2014

Week 5: More proofs

I'm glad to have assignment 1 and term test 1 behind me. I had felt very unsure about everything in assignment 1, but one trip to Larry's office hours turned out to be very helpful and productive. Question 5 had made no sense to me, but I walked out of there feeling confident about my answers. I felt that I did decently on the test as well. Thank goodness for the practice tests; they prepared me well, so there were no surprises on the real test.

This week, we learned more proof techniques. In general, the material seemed straightforward, as it always does once the answer is already in front of you. Proofs are meant to be read easily, but I'm starting to realize that actually coming up with a proof requires a substantial amount of creativity. I will need to do many practice problems if I hope to do well on the next test. For the next assignment, I anticipate a lot of staring blankly at the questions, some useless scribbling, then giving up and binge-watching The Walking Dead (zombies are really gross).

Week 4: You can't prove that I didn't write this in October

This week we talked about transitivity, limits, and structured proofs.

The definition of a limit was a little hard for me to understand. For example, take the function we saw in lecture, f(x) = x^2:

lim (x → 2) x^2 = 4, which is defined to mean that for every e > 0 there is a d > 0 such that, for all x, if | x - 2 | < d then | x^2 - 4 | < e.

When I had first seen something similar to this in a calculus textbook, it made no sense to me. But in the context of this lecture, one way that I think of it is: "If my enemy chooses any e, then I can choose a d such that for all x-values that differ from 2 by less than d, the corresponding f(x) values will differ from 4 by less than e." So for some function g(x):

[Graph: a function g(x) passing through the point (xo, yo), with an interval of width d around xo on the x-axis corresponding to an interval of width e around yo on the y-axis.]

From this, I can see how as d gets smaller (x → xo), e can also get smaller (y → yo), so that:

lim (x → xo) g(x) = yo

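
Going back to f(x) = x^2, to make the enemy game concrete (with numbers I made up): if my enemy picks e = 0.1, I can pick d = 0.02. Then for any x with | x - 2 | < 0.02, we have | x + 2 | < 4.02, so | x^2 - 4 | = | x - 2 |·| x + 2 | < 0.02 × 4.02 = 0.0804 < 0.1.
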
We also started learning about proofs, and looked at some examples of direct proofs of universally quantified implications. I appreciate how rigorous proofs are, and how clearly and thoroughly one step must lead to the next, so that ultimately P leads to Q, and the implication is undeniably true.
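
Here is the kind of thing I mean, on a classic example of my own choosing (not necessarily one from lecture): every odd natural number has an odd square.

assume n is a natural number
   assume n is odd
      then n = 2k + 1 for some natural number k # definition of odd
      then n^2 = 4k^2 + 4k + 1 = 2(2k^2 + 2k) + 1 # algebra
      then n^2 is odd # it is 2m + 1 with m = 2k^2 + 2k
   then n odd implies n^2 odd # introduce the implication
then for every natural number n, if n is odd then n^2 is odd # introduce the universal quantifier
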
Then we came across the "unexpected hanging paradox", and for a second I felt like I had just unlearned everything from the last twenty minutes!