Friday, December 5, 2008

Problem Solving

I decided to work on the piles of pennies problem this year. Here is my approach to the problem.

UNDERSTANDING THE PROBLEM
We have two drawers with pennies in them -- a left drawer and a right drawer. At the beginning, the left drawer contains 64 pennies while the right drawer contains none. We have two possible operations. If the left drawer contains an even number of pennies, we may transfer half of them to the right drawer. Likewise, if the right drawer contains an even number of pennies, we may transfer half of them to the left drawer. Neither operation is available when the corresponding drawer contains an odd number of pennies.

The problem requires us to find a way to get 48 pennies in one of the drawers (left or right).

DEVISING A PLAN
The first thing I notice when I look at the problem is that there is nothing we can do if either drawer contains an odd number of coins. Since the total number of coins in the drawers is even (64), it is possible to end up in a situation where each drawer contains an odd number of pennies (odd + odd = even). In that situation there is essentially nothing we can do and we are in a deadlock.

For this problem, there are a number of paths we can take to come up with a solution. We can start at the solution and work our way backwards to the initial conditions of the problem (i.e., go from having 48 coins in one drawer and 16 coins in the other to having all 64 coins in one drawer and none in the other), or we can do a simple guess and check. My preferred plan is guess and check with educated guesses (guesses that make sense given the situation we end up in), as it's gotten me quite far in similar problems in the past.

CARRYING OUT THE PLAN
We start with 64 coins in the left drawer and none in the right.

L R
64 0 (only one possible move since halving 0 is pointless)
32 32 (doesn't matter what move we make since we always get the same result)
16 48 (done!)
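
The trace above can be double-checked with a few lines of Python (a quick sketch; the helper name is mine, not part of the problem):

```python
def halve_left(left, right):
    """Move half of the left drawer's pennies to the right drawer."""
    assert left % 2 == 0, "can only halve an even pile"
    return left // 2, right + left // 2

state = (64, 0)
state = halve_left(*state)   # (32, 32)
state = halve_left(*state)   # (16, 48) -- 48 pennies in the right drawer
```

A mirror-image halve_right helper would give the other legal move, but two applications of halve_left already get us to 48.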

LOOKING BACK
Overall, this was a simple problem to solve, but much more complex problems can be derived from it. We can vary the only real constants in the problem -- the total number of coins in the drawers and the number of coins we are trying to get into a single drawer. Say, for example, we had 100 coins in the left drawer at the beginning. Since 100 cannot be written in the form 2^n for some natural number n, it will definitely not be possible to reach every number of coins from 0 to 100 in a drawer (starting from 100 and 0, halving quickly runs into the deadlock of 25 and 75). We can also extend the original problem to ask whether it is in fact possible to come up with piles containing every possible number of pennies (i.e., every integer from 0 to 64).
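
The extension question can be answered empirically with a breadth-first search over the states (a sketch; the function names are my own):

```python
from collections import deque

def neighbours(left, right):
    """States reachable in one legal move."""
    if left % 2 == 0 and left > 0:
        yield left // 2, right + left // 2
    if right % 2 == 0 and right > 0:
        yield left + right // 2, right // 2

def reachable(total):
    """All (left, right) states reachable from (total, 0)."""
    start = (total, 0)
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in neighbours(*queue.popleft()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

counts = {c for state in reachable(64) for c in state}
```

Running this for 64 shows that every count from 0 to 64 appears in some reachable state, while for 100 only the counts 100, 0, 50, 25, and 75 ever show up before deadlock.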

CSC236 - Weeks in review

It's been a while since I last posted (my workload caught up with me) so I'm just going to recap the weeks I've missed in one long post. Without further ado, here they are:

Week 8: Iterative Correctness
Loop invariants, loop invariants, loop invariants. That's the name of the game. If you can find the loop invariant, you're pretty much done. At first, the process of finding a loop invariant didn't feel like second nature, but I've become a little better since then. I was almost able to come up with the one we used in class, but I didn't come up with the decreasing sequence of natural numbers, which happened to be a crucial part of the proof. Without that decreasing sequence, it would be impossible (or nearly impossible) to prove that a loop terminates.
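
To make the pattern concrete, here's a toy program of my own (not one from lecture): the invariant is stated as a comment, and the decreasing natural number is what guarantees the loop terminates.

```python
def divide(m, n):
    """Return (q, r) with m == q*n + r and 0 <= r < n, given m >= 0, n > 0."""
    q, r = 0, m
    # Loop invariant: m == q*n + r and r >= 0.
    # Termination: r is a natural number that strictly decreases each pass,
    # so the loop guard r >= n must eventually fail.
    while r >= n:
        q, r = q + 1, r - n
    return q, r
```

When the loop exits, the invariant plus the negated guard (r < n) gives exactly the postcondition.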

I missed the second lecture of the week so I missed seeing how Danny handled the tricky while loop. Maybe it's just me but, if I miss a class and look at the notes that Danny posts afterwards, it just looks like a lot of mumbo-jumbo. After a while of staring at it, it starts to look like some artist tried to create some abstract art using green paint on a black canvas. If I actually go to the lecture and look back at the notes to study for a test or something, even though the notes are just as unorganized, everything seems to flow more easily and I understand the material. Again, maybe it's just me but, after that experience, I tried my darnedest to get to every lecture for the rest of the year (and I think I succeeded; though I was late a couple of times).

And now onto nested loops. Looking back, I can't believe I thought proving the correctness of a simple program was long. The length of those proofs pales in comparison to the length of a correctness proof for a program with nested loops! It's almost double the length because you have to prove the correctness of the inner loop before you even attempt to prove the correctness of the outer loop. It's tedious but it helps.

Week 9: Formal Languages
Coming into this week, regex was a foreign language to me. I had always planned to learn it sometime but I kept putting it off for a rainy day. So, it shouldn't come as much of a surprise that I was almost completely lost after this week. I eventually got back on track, but what got me there didn't have anything to do with CSC236. It was in fact a CSC207 exercise on regular expressions that forced me to learn how exactly they work. Then, once I had completed the simple exercise, I tried to compare what we were doing in 236 to what I learned from the 207 exercise, and something just clicked. I still don't have as good an understanding of it as I would like, but it'll have to do. Kleene stars were what kept throwing me off. For some reason, I thought that 0*1* meant that you could have any number of 0's and 1's in any order. In other words, I thought it was the regex representing the language of all binary strings. In fact, it only matches some number of 0's followed by some number of 1's. Once I got that sorted out, everything started to come naturally.
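
Python's re module makes the misunderstanding easy to test (a quick sketch):

```python
import re

def in_lang(s):
    """Does s belong to the language denoted by 0*1*?"""
    return re.fullmatch("0*1*", s) is not None

# Any number of 0's followed by any number of 1's:
assert in_lang("0011") and in_lang("000") and in_lang("")
# ...but NOT all binary strings -- a 0 after a 1 is rejected:
assert not in_lang("10")
```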

Week 10: DFSAs
When Danny first introduced us to DFSAs, I knew right away that this was something I was going to like. It was almost like I was already using FSAs subconsciously to understand the regular expressions we were using in class. All I had to do was formally define all of the states and the transitions and I had my DFSA. We also covered algebra on regexes which was also pretty simple. All in all, an easy week in 236 was just what the doctor ordered considering all of the other work I had to deal with from other courses.
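
A DFSA really is just a transition table. Here's a sketch of one (my own encoding) for the 0*1* language from last week, with each state's invariant noted in a comment:

```python
# States and their invariants (the invariant says what is true of every
# string that drives the machine into that state):
#   "zeros": input so far matches 0*            (accepting)
#   "ones":  input so far matches 0*1 then 1*   (accepting)
#   "dead":  input so far has a 0 after a 1     (rejecting)
DELTA = {
    ("zeros", "0"): "zeros", ("zeros", "1"): "ones",
    ("ones", "0"): "dead",   ("ones", "1"): "ones",
    ("dead", "0"): "dead",   ("dead", "1"): "dead",
}
ACCEPTING = {"zeros", "ones"}

def accepts(s):
    """Run the DFSA on s and report whether it ends in an accepting state."""
    state = "zeros"
    for ch in s:
        state = DELTA[(state, ch)]
    return state in ACCEPTING
```

Formally defining the states and transitions is exactly filling in this table.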

Week 11: NFSAs
We started out this week by learning how to prove the correctness of a DFSA. It didn't seem to be too difficult: just use structural induction to prove that the state invariant holds. All we were asked to do in A3 and on test 3 was to come up with the state invariant (but not prove it). Not sure why that is, because the proofs seem to be pretty straightforward. Definitely not something that should be out of our "comfort zone".

I wasn't really expecting that there would be any kind of link between NFSAs, DFSAs, and regular expressions, so the fact that there exists an equivalent NFSA for every regular expression and an equivalent DFSA for every NFSA caught me off guard a bit. Looking over the logic, it does seem evident that it is true, and it is a helpful fact -- especially since I try to visualize regular expressions as DFSAs. Now I can be certain that it is always possible to find a DFSA for any regular expression.

The Cartesian product came off as straightforward to me. My original guess for how to design a machine that evaluates R1 ∩ R2 was to create a new start state with an epsilon transition into both machines (somehow) so they are both executed at the same time. Then, the machine only accepts strings that lead both the sub-machines of R1 and R2 to accepting states. This is still how I view the Cartesian product: a machine that runs through both R1 and R2 at the same time, trying to reach their respective accepting states.
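
The standard product construction (no epsilon transitions needed) really does run both machines in lockstep: the product machine's state is just a pair. A sketch, with two made-up example machines:

```python
def product_accepts(delta1, start1, acc1, delta2, start2, acc2, s):
    """Cartesian-product machine: run both DFSAs in lockstep and
    accept iff BOTH components end in accepting states (intersection)."""
    p, q = start1, start2
    for ch in s:
        p, q = delta1[(p, ch)], delta2[(q, ch)]
    return p in acc1 and q in acc2

# Machine 1: even number of 0's.  Machine 2: ends in a 1.
D1 = {("E", "0"): "O", ("E", "1"): "E", ("O", "0"): "E", ("O", "1"): "O"}
D2 = {("n", "0"): "n", ("n", "1"): "y", ("y", "0"): "n", ("y", "1"): "y"}

def both(s):
    """Strings with an even number of 0's that also end in a 1."""
    return product_accepts(D1, "E", {"E"}, D2, "n", {"y"}, s)
```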

The last material we covered this week was the fact that every DFSA represents a regular expression. We determine this regular expression by removing the states of a DFSA one by one until none are left (other than a new start and end state added before the process begins). What I learned from doing this in class is that, after a while, things start to get complex. I was doing the process on paper along with Danny, trying to stay one step ahead of him all the way (so I could check what I was doing as I did it). It proved difficult to keep track of every single transition in the original DFSA and what happened to each one as states were removed. I found myself double-checking what I had written down multiple times before being able to move on.

Week 12: Pumping Lemma
This week, we started out by learning the pumping lemma. I still haven't really gotten the hang of this lemma, so I'm hoping to go over it at least a couple more times before the exam next week. What I can't seem to understand is how uv^kw is supposed to keep producing strings belonging to a given regular language -- especially if the language is finite. For example, take the language denoted by the regular expression (0 + 1). This language contains only two strings, "0" and "1". So, if v is not the empty string, how can uv^kw continuously produce strings in a language that only contains 2 strings? Hmmm... Looks like I've got a bit of work ahead of me trying to figure this one out...
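
Part of the puzzle is that the lemma only promises a pumpable decomposition for strings at least as long as the pumping length p -- and for a finite language like (0 + 1), p can be larger than every string in the language, so there is nothing to pump. For an infinite regular language it really does work; a sketch on 0*1* (decomposition chosen by me for illustration):

```python
import re

def in_lang(s):
    """Membership in the language denoted by 0*1*."""
    return re.fullmatch("0*1*", s) is not None

# Take x = "0001" in 0*1* and the decomposition u = "", v = "0", w = "001"
# (v is nonempty, as the lemma requires).  Pumping v any number of times
# keeps us inside the language:
u, v, w = "", "0", "001"
for k in range(6):
    assert in_lang(u + v * k + w)
```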

Week 13: CFGs
We finished off the term in style, learning about context-free grammars (CFGs). I really liked how Danny introduced us to CFGs with his sandwich-generating grammar and his sentence-generating grammar. It made it easy to figure out how everything was related and how things fit together in a grammar. Even though they aren't really similar, the formal definition of a CFG reminded me of FSAs. Looking at the actual machines involved with the two, there is a bit of a relation: FSAs are used to recognize strings in a language, whereas CFGs are used to generate strings in a language. Used in combination (or separately), they are powerful tools when trying to understand exactly what a "language" is.
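
A CFG is easy to play with in code. Here's a sketch with a toy grammar of my own (balanced parentheses -- not the one from class), expanding nonterminals at random with a depth cap so the recursion terminates:

```python
import random

# Toy grammar:  S -> "(" S ")"  |  S S  |  "" (empty string)
GRAMMAR = {
    "S": [["(", "S", ")"], ["S", "S"], []],
}

def generate(symbol="S", depth=0, max_depth=6, rng=random):
    if symbol not in GRAMMAR:          # terminal symbol: emit it as-is
        return symbol
    rules = GRAMMAR[symbol]
    if depth >= max_depth:             # force the shortest rule to terminate
        rules = [min(rules, key=len)]
    rule = rng.choice(rules)
    return "".join(generate(sym, depth + 1, max_depth, rng) for sym in rule)
```

Every string this produces is balanced, because each rule preserves balance -- which is exactly the kind of invariant-on-the-grammar argument the course's proofs use.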

Saturday, October 25, 2008

Week 7: Program Correctness

What I've learned this week is that proving the correctness of a program can be a long and arduous task. I did not expect either of the correctness proofs we did in class to be anywhere near as long as they were. The content of the proofs was not complicated in nature, but having to describe what happens at every line of the code, in each of the different cases, seems like it could become tiresome. What happens if we start having to prove the correctness of a program that's 40 or 50 lines long instead of the 5 or 10 lines we do in class? I'm pretty sure Danny won't make us experience that unless the program was loaded with if/else statements so that it'd be, in essence, a 5 or 10 line program. Well, here's hoping.
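
The flavour of those line-by-line, case-by-case proofs can be mimicked with comments: annotate each branch with what's true there and check that every path establishes the postcondition. A toy example of my own:

```python
def max_of(a, b):
    # Precondition: a and b are numbers.
    if a >= b:
        m = a          # Case 1: here m == a and a >= b, so m >= a and m >= b.
    else:
        m = b          # Case 2: here m == b and b > a, so m >= a and m >= b.
    # Postcondition (holds on both paths): m >= a, m >= b, and m is a or b.
    return m
```

Even for three lines of real logic, the proof has to walk both cases -- which is why a 40-line program sounds so daunting.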

I know I usually write a little bit more than I'm writing this week but, after having written 3 midterms last week and having not started 2 or 3 assignments that are due next week, I need all the time I can get.

Friday, October 17, 2008

Week 6: The Master Theorem

Thanks to Thanksgiving, we had one less lecture this week. The topics we ended up covering were the time complexity of merge sort, monotonicity, and the master theorem. The proof of the time complexity of merge sort was a little denser than I expected it to be but, looking back at it, it was pretty straightforward. First you prove a conjecture at certain points (in our case, 2^k for all natural numbers k) and then use monotonicity to handle everything between those points.
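
The conjecture at the points 2^k is easy to sanity-check. A sketch using the simplified merge sort recurrence T(1) = 1, T(n) = 2T(n/2) + n, whose closed form at powers of two is T(2^k) = 2^k (k + 1):

```python
def T(n):
    """Simplified merge sort step-count recurrence (n a power of 2)."""
    return 1 if n == 1 else 2 * T(n // 2) + n

# At powers of two the closed form T(2^k) = 2^k * (k + 1) holds:
for k in range(10):
    n = 2 ** k
    assert T(n) == n * (k + 1)
```

Monotonicity is what lets the n log n bound carry over to the n's strictly between consecutive powers of two.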

To me, proving monotonicity doesn't seem to be any different than proving that a given function is always increasing. Before Danny gave us the definition of monotonicity that we were going to use in a proof, I expected that we were just going to strive to show that T(n) always existed and was well defined for any value of n. I guess proving that T(n) is always increasing also does this while giving us another helpful fact (that T(n) is an increasing function) at the same time.

The master theorem was revealed to us on Friday morning and it was interesting to say the least. I wouldn't mind seeing a proof for it (though I doubt I'll be able to understand the whole thing and I expect it's probably one of the longest proofs I've ever seen) because I wouldn't know where to start proving something like that. It seems to be very useful but I'm sure we'll see just how useful it is in the coming weeks.
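
For recurrences of the simple form T(n) = a·T(n/b) + n^d, the theorem boils down to comparing a with b^d. A sketch (my own helper, using the simplified three-case statement):

```python
import math

def master(a, b, d):
    """Classify T(n) = a*T(n/b) + n**d by the simplified master theorem."""
    if a < b ** d:
        return f"Theta(n^{d})"            # work at the root dominates
    if a == b ** d:
        return f"Theta(n^{d} log n)"      # work is balanced across levels
    return f"Theta(n^{math.log(a, b):.3g})"  # work at the leaves dominates
```

For merge sort (a = 2, b = 2, d = 1) the balanced case applies, giving the familiar Theta(n log n).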

Saturday, October 11, 2008

Week 5: The Test

The term is rolling right along and we are now entering the midterm zone of the schedule. Over the next 3 weeks, I will be writing midterms in every single one of my courses. Some will be easy, others won't. But I just have to face them one by one and hope for the best.

The first of the midterms was the CSC236 midterm. In my opinion, it wasn't too bad. It was very fair and I thought it did a good job of testing the content of the first four weeks. So kudos to Danny for being able to merge 4 weeks of material into 3 questions. The one thing I didn't see there was the principle of well-ordering. Of the three "flavours" of induction, the PWO was the one that I was the least comfortable with. So, I guess that may have been a blessing in disguise.

The questions weren't very difficult in nature. I did not see all of the proofs right away (I actually don't think I saw any of the proofs right away) but after a little scratch work, the answer just appeared on my page. Hopefully that translates into success in terms of a good grade.

Friday, October 3, 2008

Week 4: Recursion

This week was all about the return of recursion. I honestly don't mind recursion. It's just another way to solve different types of problems. When I'm working on problems involving recursion, I like to try to trace the entire sequence in my head or on paper (well at least the beginning of the sequence) before I even start the problem. Occasionally, for complex situations, the recursion is hard to follow but I find that, most of the time, a simple (almost trivial) solution exists.

However, when we were working on the first exercise involving the Fibonacci sequence (where we had to determine the sum of all Fibonacci numbers from F(0) to F(n)), I'll admit that I really did not see the solution. In my scratch work, I just tried to make a list of the sums for a few small values of n (e.g., the sum for n = 1 is F(0) + F(1) = 0 + 1 = 1). That did not seem to lead me anywhere. Even after Prof. Heap drew up a table to record the values on the tablet, I really could not see any relation between F(n) and the sum of F(0) through F(n). When the relationship was finally revealed (the sum of F(0) through F(n) equals F(n+2) - 1), I felt like slapping my forehead because I was looking for a complicated solution where there was none. Sometimes, it hurts to overthink things.
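
The identity F(0) + ... + F(n) = F(n+2) - 1 is easy to verify for small n (a quick sketch):

```python
def fib(n):
    """Iterative Fibonacci with F(0) = 0, F(1) = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Check the identity  F(0) + ... + F(n) == F(n+2) - 1  for small n:
for n in range(20):
    assert sum(fib(i) for i in range(n + 1)) == fib(n + 2) - 1
```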

Assignment 1 was due last Monday and I don't think I did too badly on it. The first question, involving the sum of the angles in a polygon of n vertices, was pretty straightforward: divide a polygon of n+1 vertices into a triangle and a polygon of n vertices and apply the induction hypothesis. That took all of 5 minutes. Questions 2 and 3 took me a bit more time, though -- especially trying to determine some type of recursive structure for the menus in question 2. Once again, I was overthinking the problem and the simple answer just passed me by. Hopefully I'll be able to kick the habit before the test. Question 3 didn't take as long. What confused me in the beginning was how to apply the principle of well-ordering to a question that involved no sets. I actually figured out the answer while lying awake in bed early Saturday morning. It was one of those "Eureka!" moments. I'm hoping I'll be able to conjure up another one of those if I get stuck on the midterm next week.
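
The induction step for question 1 translates directly into a recursion (a sketch in degrees, assuming a simple polygon): peeling off a triangle adds 180 to the angle sum.

```python
def angle_sum(n):
    """Sum of interior angles (degrees) of a simple polygon with n vertices,
    computed by the induction step: peel off one triangle per vertex."""
    return 180 if n == 3 else angle_sum(n - 1) + 180

# The recursion agrees with the closed form (n - 2) * 180:
for n in range(3, 12):
    assert angle_sum(n) == (n - 2) * 180
```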

Friday, September 26, 2008

Week 3: The First Entry

So I finally got around to making my SLOG for this course. I usually take a while to get started for some reason. Last year in 165, I didn't write up my first SLOG entry until just before the second term test (I could be wrong; it may have been even later!). I have to admit that, when I first heard of the idea of a course blog in 165, I wasn't really excited to get it going. I didn't think that I'd have enough material to cover or experiences to describe for a weekly blog. By the end of the year, I'd warmed up to the idea but it still hasn't completely grown on me.

Let's get back to the present now. We're 3 weeks into the year and this course (CSC236) seems to be very similar to CSC165 in terms of the types of proofs we are required to do. The simplified proof structure seems weird to me, though. It's not any harder than the 165 style of proofs but, in my opinion, the 165 proofs looked more visually pleasing than simple prose. I'd much rather read a bunch of small paragraphs or lines than one long and tedious paragraph. But maybe that's just me.

The first two problem sets were easy enough. The only problem I had with problem set 1 was submitting it. My worst TTC experience to date caused me to be 40 minutes late for class. It all started when I watched the bus I had planned to take leave me in the dust when I was literally 5 steps from the bus stop. Then I decided to just wait for the next bus which was scheduled to come in 15 minutes. 30 minutes later, no bus in sight. I'll skip the rest of the details mostly because it all seems like a blur now. Oh well, time to go finish off assignment 1.