Wednesday 3 December 2014

Extra Entry: Comments

Slogs I have commented on:


  • http://vyytancsc165slog.blogspot.ca/2014/11/functions-are-funny.html?showComment=1417660608374#c4065434265191836145
  • http://vladcslog.blogspot.ca/2014/12/week-12final-finallyit-is-all-over.html?showComment=1417660785839#c3217143780510388678
  • http://holakittyxu.blogspot.ca/2014/11/week-9.html?showComment=1417660979023#c3040742420868360153
  • http://hopalongaslog.blogspot.ca/2014/11/week-9.html?showComment=1417661143458#c649603759792557716
  • http://abhinavchawlacsc165.blogspot.ca/2014/11/week-9.html#comment-form
  • http://99bugsbutaglitchaintone.blogspot.ca/2014/11/week-12-one-last-slog.html?showComment=1417661326279#c4269938467694243043

Math Problem Solving Entry: Penny Piles

Well guys, it's a little late, but here I am bringing you the problem-solving part of my blog, covering the problem we addressed during the last lecture of week 7. You guys know it by the name Penny Piles. Let me give you the breakdown of how I solved this problem.

Summary
You are sitting in front of two drawers. The left drawer contains 64 pennies; the right drawer contains nothing. Can you arrange things so that one of the drawers has 48 pennies, using the following two operations:
L
If the left pile has an even number of pennies, transfer half of them to the right pile. If the left pile has an odd number of pennies, operation L is disallowed.
R
If the right pile has an even number of pennies, transfer half of them to the left pile. If the right pile has an odd number of pennies, operation R is disallowed.

Choose another number in the range [0, 64]. Starting from the same initial position, can you arrange things so that one of the drawers has that number of pennies? Are there any numbers in that range that are impossible to achieve?

Understanding the Problem

  • What is given: the left drawer contains 64 pennies and the right drawer contains none. We are also given two operations for transferring pennies, L (left to right) and R (right to left), each with a rule about when it may be used.
  • What is needed: we need to find the correct combination of Ls and Rs so that one drawer ends up with 48 pennies.
Devising a Plan
  • Plan 1: keep track of the potential penny transfers in a list/tree, starting with the ordered pair (64, 0) representing (pennies in left drawer, pennies in right drawer), then apply Ls and Rs and record each result until one drawer contains 48 pennies.
  • Plan 2: look at the hint page and proceed from there.
Carrying Out the Plan
After carrying out plan 1, I discovered that there are two ways to get 48 pennies into one drawer: applying L twice leaves 48 pennies in the right drawer, while applying L and then R leaves 48 pennies in the left drawer.
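
Written out as (left, right) pairs, those two sequences look like this:

  • L then L: (64, 0) → (32, 32) → (16, 48), leaving 48 pennies in the right drawer.
  • L then R: (64, 0) → (32, 32) → (48, 16), leaving 48 pennies in the left drawer.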

Looking Back
Looking back, the main problem did not take long to solve, seeing as only two steps were needed to reach the desired result. One could technically solve it just by eyeballing the situation and working out the answer in your head rather than writing things down. The result can also be checked by undoing the operations in reverse order to get back to the original position.

Additional Problem
As for the additional problem of finding a number in the range [0, 64] that is impossible to reach in one of the drawers using the two operations, I believe there is no such number: L and R can be used to reach any count in that range (0 and 64 are already there at the start). All even counts can be reached; for example, the combination [62, 2] can be reached using L and R, and similar sequences of halvings produce the other even counts. As for the odd counts, since the even counts are reachable, a drawer holding twice the odd number we want can be halved with L or R, leaving exactly that odd count behind. For example, a pile of 34 can be halved to reach 17, and a pile of 54 can be halved to reach 27. Even an awkward-looking pair like [23, 41] can be reached by going through [64, 0], [32, 32], [48, 16], [56, 8], [28, 36], [46, 18], and finally [23, 41].
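
Since I had Python on the brain anyway, here is a quick brute-force check of this claim. It is just my own sketch, not part of the original solution: a breadth-first search over every (left, right) state reachable from (64, 0) using the two operations, collecting every pile size that shows up along the way.

    from collections import deque

    def reachable_counts(start=(64, 0)):
        """Return the set of pile sizes that appear in some reachable state."""
        seen = {start}
        queue = deque([start])
        while queue:
            left, right = queue.popleft()
            moves = []
            if left % 2 == 0 and left > 0:    # operation L: move half of the left pile to the right
                moves.append((left // 2, right + left // 2))
            if right % 2 == 0 and right > 0:  # operation R: move half of the right pile to the left
                moves.append((left + right // 2, right // 2))
            for state in moves:
                if state not in seen:
                    seen.add(state)
                    queue.append(state)
        return {n for state in seen for n in state}

    print(sorted(reachable_counts()) == list(range(65)))  # True: every count from 0 to 64 shows up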

Week 12: Computability and Countability Continued

Well, the time is here guys: the last week of official classes. This also means that this is the last weekly report I will be making. It has been an adventure writing all these posts, and even though the SLOG is a required part of the CSC 165 curriculum, I will admit that I had fun making some of these entries. So, let's make this last weekly roundup a good one. This is Kyle Mendoza checking in one last time to bring you guys a weekly report for week 12 of CSC165. This week we continued our study of computability and countability, or the lack thereof in our case. We focused on identifying non-countable sets and non-computable functions more than anything else, starting with why the set of natural numbers and the set of rational numbers are countable while the set of real numbers is not. We also briefly went over proof by induction to wrap up the week and the term. So, with all that said, let's get into it.

During the first part of the week, we continued learning about countability, now addressing why certain sets are not countable. We saw that for a set to be countable it also has to be listable, meaning each element of the set can be given a position corresponding to some natural number. Cantor, a mathematician, proved that the real numbers cannot be listed and thus are not countable. Cantor essentially showed that no matter how you try to list the real numbers between 0 and 1, there is always a number your list omits: take '0.', traverse the diagonal of the listed decimal expansions, and change each digit, adding 1 to it if it is less than 5 and subtracting 1 if it is 5 or greater. The number built this way differs from every entry in the list at some digit, so it cannot appear anywhere in the list. This means the set of real numbers between 0 and 1 is larger than the set of natural numbers, which is mind-blowing. It took me a lot of re-reading, and answers to questions from my peers, to fully understand this idea. Next we explored diagonalization as a way to identify functions that are not computable: essentially, list the countably infinite Python functions along with whether they halt or loop when passed to each other as arguments, traverse the diagonal of the resulting table, toggle the values from halts to loops and loops to halts, and you get a hypothetical function that cannot exist anywhere on the list. Finally, we wrapped up by exploring proof by induction, another method of proof we can use in our proof structures. I'm not sure if I will be using it on the final exam, simply because I don't completely understand it, but who knows, maybe after studying it before the exam I'll find that it makes a lot of proofs easier. With that being said, there's not much I can say about induction other than that it's just another weapon I can add to my arsenal for tackling proofs once I learn how to use it effectively and efficiently. To end the week, we finished with a problem-solving session about finding the number of squares that a diagonal passes through in a rectangular grid.
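
To convince myself the diagonal trick really produces something missing from the list, I wrote a tiny toy version in Python. It only works on a finite list of digit strings (the real argument is about an infinite list), and the function name and example digits are just my own invention, but it follows the same rule from lecture: walk the diagonal and change every digit.

    def diagonal_miss(listing):
        """listing: list of digit strings, e.g. '141592' standing for 0.141592...
        Build a number that differs from the i-th entry in its i-th digit."""
        new_digits = []
        for i, digits in enumerate(listing):
            d = int(digits[i])                                 # i-th digit of the i-th number
            new_digits.append(str(d + 1 if d < 5 else d - 1))  # the digit-changing rule from lecture
        return '0.' + ''.join(new_digits)

    listing = ['500000', '141592', '333333', '718281', '999999', '010101']
    print(diagonal_miss(listing))  # 0.454382, which differs from entry i in digit i, so it is not in the list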

So, this is it guys: the end of this blog entry, and the end of this term. I personally found this last week's concepts to be more on the challenging side, but a lot more manageable than week 11. Comment below on how you guys found this week's concepts, and let me know your solutions for the problem-solving session; let's compare and contrast our answers. As always, words are hard, comp-sci is and always will be awesome, and later days. Good luck on your exams, everybody. This is Kyle Mendoza signing out.

Week 11: Halting Problem and Computability

Hey guys, how are you all doing? Hope you did better than I did during week 11, because I was confused on many occasions. But there's no use in telling you this week was hard without telling you why, is there? So, as always, this is Kyle checking in to bring you guys a roundup of week 11 of CSC165. With only a week remaining before the end of classes, I thought I had all the concepts of this course figured out and that I was ready to do well on the final. Then I got introduced to halting and computability. I think this might have been the hardest week for me personally, just because I've never felt as lost in this course as I did this week. With that being said, let's get into the details.

This week didn't start off that badly, so it wasn't completely overwhelming. We started off by simply identifying that not all problems have an algorithm that can be used to solve them, and we looked at examples of such problems. This was simple enough because we looked at Python code that, when executed, would never return control to the user, which was relatively easy to understand because I am familiar with Python. We then found that there is no way to implement a halt function that returns True when a function passed in as a parameter would halt and False otherwise. If such a halt function existed, we could write another function that takes a function f as a parameter, loops forever if halt reports that f (run on itself) halts, and returns 42 otherwise; feeding this new function to itself gives a function that halts only when it doesn't halt, which is a contradiction. To prove the non-computability of a function, a reduction is used, essentially the contrapositive: show that if the function were computable, then halt would be computable too, which is why these proofs almost always involve the halt function. Unfortunately, to this day I still don't completely understand how to fully prove the non-computability of a function. I got the structure down after looking at the course notes, but more studying is needed, possibly by looking at sample non-computability proofs. We wrapped up the week by talking about countability, and how a set can only be properly counted if the counting is 1-1 and onto, to ensure there is no over-counting and no under-counting, respectively. We also found that two infinite sets are not always the same size, meaning, strangely, that infinity does not always equal infinity: the sizes of two sets can differ even if both contain more elements than any number. I was able to grasp this concept after closely listening to Danny and rereading the annotated slides once they were posted, though it took quite a while. We also found that some infinite sets, like the natural numbers, the even natural numbers, and the rational numbers, are countable because they can be matched up 1-1 and onto with the natural numbers. Finally, we ended the last lecture with an intriguing statement: not all sets are countable, and a mathematician named Cantor would show this using diagonalization. But that would be for next week.
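
Writing the contradiction out as (pretend) Python helped it click for me. The names here are my own, since the lecture version may have used different ones, and halt(f, i) is the hypothetical function that supposedly returns True when f(i) halts and False otherwise; the whole point is that no such halt can actually be written.

    def clever(f):
        """Relies on a hypothetical halt(f, i); this function is why halt cannot exist."""
        if halt(f, f):       # if f would halt when run on its own code...
            while True:      # ...then loop forever
                pass
        else:
            return 42        # ...otherwise halt right away

    # Now ask whether clever(clever) halts. If it halts, then halt(clever, clever)
    # was True, so it entered the infinite loop and never halts. If it never halts,
    # then halt(clever, clever) was False, so it returned 42 and halted. Either way
    # we contradict ourselves, so no such halt function can be implemented.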

Well, I think that might have been one of the longest wrap-ups I have ever done for this course. As you can see, it took a lot for me to even mildly understand these concepts, and there is still much work to be done. But with only 2 weeks till the final exam, it definitely is crunch time. Of course, just because I personally found this week difficult doesn't mean you all did. Comment below how you felt about this week, and whether I am over-complicating these concepts and should just relax when studying computability and countability. Also, let me know how all of you are feeling now that the term is almost over. It's been a crazy week, but that's to be expected. As always, words are hard, comp-sci is awesome, and later days. This is Kyle signing out.

Week 10: Big Omega, Big Theta, General Properties about Bound Proofs, and a Little about Limits

What's up guys, this is Kyle here checking in for a weekly wrap-up of week 10 of CSC 165. Finals are drawing near, as there are only 2 more weeks of class left. Hope everyone is following along so far. This week we covered quite a few new concepts, and it was definitely a little overwhelming to wrap my head around them all. We started off covering how to incorporate limit laws and L'Hopital's rule when proving Big O. We then moved on to learning how to prove Big Omega, which in all honesty wasn't that difficult because of its similarity to the proof structure of Big O. This was followed by learning how to write a new kind of bound proof, Big Theta, which essentially combines a Big O statement and a Big Omega statement into one. Finally, to round up the week, we went over some general properties of upper and lower bounds. So, now that I've set up a stimulating intro, let's get into the juicy details.

We started the week off by going over some week 9 material that we did not get to during the actual week 9, which focused on using limits as an option for solving Big O proofs about certain functions. This technique is mainly used to prove Big O statements involving exponential functions, because the statement about the limit of the ratio of the two functions as x (or n) approaches infinity closely mirrors the Big O statement in structure, so it can be used to help write the proof. This was hard to understand at first, but because the concept was needed to complete assignment 3, I looked into the course notes as well as the annotated slides in order to fully grasp the idea, and it worked; I now feel completely comfortable using limits to solve Big O proofs. Next up, we tackled writing proofs about Big Omega and Big Theta, which basically have the same proof structure as Big O except for one difference: in the case of Big Omega, instead of requiring f(n) ≤ c·g(n), we require f(n) ≥ c·g(n). Big Theta essentially combines these two statements, saying there are constants c1 and c2 with c1·g(n) ≤ f(n) ≤ c2·g(n), so that f(n) is bounded both above and below by constant multiples of g(n). Like I stated before, because this concept was very similar to Big O, it wasn't at all hard for me to understand right away. Finally, we ended the week by going over general properties of Big O and Big Omega involving multiple functions, and how to tell whether such a statement is true without looking at the actual functions. An example of this is that we were able to prove that for all functions f that can be bounded above by g, the square of f can also be bounded above by the square of g. Once again, the proofs for these concepts were very straightforward and didn't cause any headaches for me.
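
Here is a tiny numeric sanity check of the limit idea, using an example I picked myself (f(n) = n^2 and g(n) = 2^n, which may not be the exact pair from lecture). The ratio n^2 / 2^n eventually shrinks toward 0, which matches the intuition that an exponential dominates a polynomial, and the little helper below spot-checks that c = 1 and B = 4 work as witnesses for n^2 being in O(2^n).

    def check_big_oh(f, g, c, B, up_to=60):
        """Spot-check f(n) <= c * g(n) for every integer n from B to up_to."""
        return all(f(n) <= c * g(n) for n in range(B, up_to + 1))

    f = lambda n: n ** 2
    g = lambda n: 2 ** n

    print([round(f(n) / g(n), 4) for n in range(1, 11)])  # ratios eventually shrinking toward 0
    print(check_big_oh(f, g, c=1, B=4))                   # True: the witnesses hold up to n = 60

Of course a finite check is not a proof, but it is a handy way to guess a c and a B before writing out the real thing.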

To conclude, this week was packed full of information, but thankfully all of it was easy for me to understand. If even one of the concepts had proved difficult to grasp, it would definitely have been an excruciating week. As it was, the week was still pretty challenging, not because of the difficulty of the concepts but simply because of the amount of information we were taking in. But what do you guys think? Did you think we learned a lot this week? Were you all able to follow along, or did you lag behind at some parts? Let me know in the comments below, along with anything else you want to say. Let's start a discussion. Anyways, that about wraps up this report. As always, words are hard, comp-sci is awesome, and later days. This is Kyle signing out.

Week 9: Addressing Maximum Slice and Proving Big O of Polynomials

What's up guys, this is Kyle checking in to give you guys a weekly report for week 9 of CSC 165. This week we focused on two main concepts: one we had previously learned about, now applied in a very challenging context, and one brand new concept that will help us in later weeks. So, without further ado, let's dive in.

The first concept we addressed was finding the number of steps taken by a function called max_sum, which returns the maximum sum over all slices of a list L; keep in mind L can be any list. For this problem we had to take a little more time to find the number of steps executed in the worst case, simply because there were nested loops, and step counting gets increasingly difficult when nested loops are introduced: the inner loop's count has to be summed over every iteration of the outer loop. To solve this, Danny separated the problem into three different parts in order to address each loop one by one before producing a summation over all of them. This process was, and still is, hard for me to understand, and I think I definitely need to read it over a couple more times and ask Danny or my peers to help me understand the solution if I can't follow it after multiple reads. The second concept we learned about was proving whether certain polynomials are in big O of other polynomials. We learned which functions can be bounded above by which, and what the proof structure looks like. What I found was that in order to prove a function is in big O of another function, you typically need to overestimate the original function, replacing its smaller terms with larger ones, until it can be compared to a constant multiple of the bounding function. We also learned an important rule: a polynomial function cannot be bounded above by a polynomial of lower degree, so such a statement immediately calls for a disproof. This concept I found I understood immediately, and I'm glad I did, because I would hate to have added another idea to the list of things I need to look back on and study more closely.
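
For reference, here is roughly what a max_sum function looks like. This is my own sketch; if I remember right, the lecture version added up each slice with a third explicit loop instead of Python's built-in sum, which is part of what made the step counting tricky.

    def max_sum(L):
        """Return the maximum sum over all slices L[i:j] of the list L."""
        best = 0
        for i in range(len(L)):                # choose where the slice starts
            for j in range(i, len(L) + 1):     # choose where the slice ends
                best = max(best, sum(L[i:j]))  # add up the current slice
        return best

    print(max_sum([2, -5, 8, -6, 10]))  # 12, from the slice [8, -6, 10]

And for the big O half of the week, the overestimation trick looks like this (my own example, not necessarily the one from lecture): 3n² + 2n ≤ 3n² + 2n² = 5n² for all n ≥ 1, so c = 5 and B = 1 witness that 3n² + 2n is in O(n²).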

Overall this week was pretty balanced. It started off pretty intense for me, but it mellowed out into something I felt comfortable doing, namely big O proofs with polynomials. Anyways, let me know what all of you thought of week 9 of CSC165; we're definitely getting closer to the finish line, and time is flying. Be sure to comment your thoughts on this week's topics and anything else you want to discuss. My apologies for this going up super late; I honestly just got behind on doing these SLOGs, so I'm trying to pump out all the ones I have left now. Thanks for understanding. As always, words are hard, comp-sci is awesome, and later days. This is Kyle signing out.

Tuesday 11 November 2014

Week 8: Counting Steps, the Worst Case, and the Big Oh proof

What's up everyone, this is Kyle checking in for a week 8 roundup and discussion. This week, we dived fully into the world of searching algorithms and their running times, and how they connect to the logical and mathematical reasoning of CSC165.

We started out by looking at the number of steps it takes to execute a linear search on any list of length n. Calculating the number of steps it takes to execute code was relatively simple at first, but counting steps for nested loops was a bit more complicated, and it took me a while to actually understand how it is done. Even now I am not completely comfortable with the method, but reading through the annotated slides and trying to find my own way of expressing how to calculate the number of steps for different searches has been helping. I did find, however, that the easiest search to count the steps of is linear search, which, compared to many other searches such as binary search, has a rather long running time. After that we learned about the worst case of these searches, which is the greatest possible number of steps it takes to execute the search on a list of length n. We explored the upper bound and the lower bound of the worst case, also known as big Oh and big Omega, respectively. Finally, to round off the week, we learned how to do proofs involving big Oh, the upper bound. At first I was confused about how we would prove something like that, but after seeing the equivalent symbolic statement of a big Oh claim, it became much clearer what we had to do. It took me a while to understand how we were to find a B and a c satisfying the big Oh definition for different searching methods, but after a bit of practice with simpler examples of code I was able to understand it.
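
To make the counting concrete, here is a rough sketch of a linear search with my own step counts in the comments; the lecture code and its exact accounting conventions may have been slightly different.

    def linear_search(x, L):
        """Return the index of the first occurrence of x in L, or -1 if x is absent."""
        i = 0                 # 1 step
        while i < len(L):     # checked up to len(L) + 1 times
            if L[i] == x:     # 1 step per iteration
                return i      # 1 step, only if x is found
            i = i + 1         # 1 step per iteration
        return -1             # 1 step, only if x is never found

    # In the worst case (x is not in L at all), the loop runs n = len(L) times, so the
    # total is about 3n + 3 steps with this counting: linear in n, which is why the
    # worst case of linear search grows with the length of the list.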

Well, that's a wrap for this week 8 roundup. Let me know how you guys found this week's lectures, and whether they prepared you for the term test that came around in week 9. Also, comment below how you all did on assignment 2: were you able to solve those difficult ε, δ proofs (1.1 and 1.2)? As always guys, words are hard, computer science is awesome, and later days. This is Kyle signing out.