Friday, October 17, 2008

Week 6: The Master Theorem

Thanks to Thanksgiving, we had one fewer lecture this week. The topics we ended up covering were the time complexity of merge sort, monotonicity, and the master theorem. The proof of the time complexity of merge sort was a little denser than I expected it to be but, looking back at it, it was pretty straightforward. First you prove a conjecture at a sample of points (in our case, n = 2^k for all natural numbers k) and then use monotonicity to extend the result to everything between those points.
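
As a note to myself, here's roughly how the first half of that argument goes. This is just my sketch with a generic constant c; the exact recurrence and constants from lecture may have looked a bit different:

    T(1) = c, \qquad T(n) = 2\,T(n/2) + c\,n \quad (n = 2^k,\ k \ge 1)

Unrolling it (or inducting on k) gives

    T(2^k) = c\,(k+1)\,2^k \in \Theta(2^k \cdot k),

which is \Theta(n \log n) at the sample points n = 2^k.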

To me, proving monotonicity doesn't seem to be any different from proving that a given function is nondecreasing. Before Danny gave us the definition of monotonicity that we were going to use in a proof, I expected that we were just going to show that T(n) exists and is well defined for every value of n. I guess proving that T(n) is nondecreasing shows this too, while giving us the extra fact we actually need to extend the bound beyond the sample points.
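
Concretely, I think the monotonicity step is what lets the result at powers of 2 carry over to every other n. Again, just my own sketch of the idea:

    \text{For any } n \ge 2, \text{ pick } k \text{ with } 2^k \le n \le 2^{k+1}.
    \text{If } T \text{ is nondecreasing, then } T(2^k) \le T(n) \le T(2^{k+1}).

Since 2^k > n/2, 2^{k+1} \le 2n, and k is within 1 of \log_2 n, both ends of that sandwich are within a constant factor of n \log n, so T(n) \in \Theta(n \log n) for all n, not just the powers of 2.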

The master theorem was revealed to us on Friday morning and it was interesting to say the least. I wouldn't mind seeing a proof for it (though I doubt I'd understand the whole thing, and I expect it's one of the longest proofs I've ever seen) because I wouldn't know where to start proving something like that. It seems to be very useful, and I'm sure we'll see just how useful it is in the coming weeks.
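
For the record, the version of the master theorem I've seen written down elsewhere (the simplified form; the statement Danny gave may differ in its details) says that for T(n) = a\,T(n/b) + f(n) with constants a \ge 1, b > 1, and f(n) \in \Theta(n^d),

    T(n) \in \begin{cases}
      \Theta(n^d) & \text{if } a < b^d \\
      \Theta(n^d \log n) & \text{if } a = b^d \\
      \Theta(n^{\log_b a}) & \text{if } a > b^d
    \end{cases}

Plugging in merge sort (a = 2, b = 2, d = 1) lands in the middle case, which matches the \Theta(n \log n) bound from earlier in the week.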

1 comment:

Danny Heap said...

The course notes have a proof of the master theorem. It only runs 2--3 pages, and it follows the same strategy as I used for mergeSort. You need to remember geometric series and a few facts about logs.