Sunday 30 March 2014

Week 11: Sorting and Efficiency

Here is this week's prompt for a slog entry: "You should explain what you've learned from readings and lectures about how to express the time complexity of some common sorting algorithms. Without regurgitating other sources, you should demonstrate some understanding of the characteristics of the different algorithms we've looked at."

Efficiency, which I briefly discussed in my last entry, is a crucial aspect of code. Generally, programmers consider an algorithm's worst case, best case and average case when thinking about efficiency. This helps improve the algorithm and prepare for every extreme. If the time it takes to run an algorithm is constant and independent of "n" (the size of the input), it is noted as O(1). If the running time increases proportionally with respect to n, it is noted as O(n). If the algorithm repeatedly cuts the problem in half, it is usually O(log n), which is very quick and efficient. Algorithms that are O(n^2) or O(n^3) are considered quite inefficient. There are many classes of running time, some of which I semi-explained: constant, logarithmic, linear, quadratic, cubic, exponential and even worse. We generally consider the worst case when trying to assess what the big-Oh of an algorithm is.
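To make these categories concrete for myself, I sketched out some tiny functions (my own examples, not from lecture) whose running times fall into different classes:

def first_item(lst):
    # Constant time, O(1): one step no matter how long lst is.
    return lst[0]

def total(lst):
    # Linear time, O(n): one addition per item.
    result = 0
    for item in lst:
        result += item
    return result

def has_duplicates(lst):
    # Quadratic time, O(n^2): compares every pair of items.
    for i in range(len(lst)):
        for j in range(i + 1, len(lst)):
            if lst[i] == lst[j]:
                return True
    return False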

We learned about sorting functions in lecture as well. In CSC108, I had been introduced to some sorting methods, including bubble sort, selection sort and insertion sort. In lecture I was introduced to merge sort, quick sort, count sort and the built-in sort method. All of these sorting techniques get the same job done but with different efficiencies, which becomes more and more important as the size of what needs to be sorted grows.

Selection sort repeatedly finds the smallest element in the unsorted part of the list and swaps it into the next position, continuing until the list is fully sorted. Since it scans all the remaining items for every position, it is O(n^2) no matter what the input looks like.
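Here's roughly how I picture selection sort in Python (a sketch of my own, not the lecture code):

def selection_sort(lst):
    """Sort lst in place by repeatedly selecting the smallest remaining item."""
    for i in range(len(lst)):
        # Find the index of the smallest item in lst[i:].
        smallest = i
        for j in range(i + 1, len(lst)):
            if lst[j] < lst[smallest]:
                smallest = j
        # Swap it into position i.
        lst[i], lst[smallest] = lst[smallest], lst[i]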

Insertion sort iterates through a list and compares each item to the ones before it until it finds the right place for that item, then inserts it there. This continues until the list is sorted. In the worst case it is O(n^2), but on a list that is already nearly sorted it only takes about O(n).
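And my own rough sketch of insertion sort:

def insertion_sort(lst):
    """Sort lst in place by inserting each item into the sorted part before it."""
    for i in range(1, len(lst)):
        item = lst[i]
        j = i
        # Shift larger items to the right until item's spot is found.
        while j > 0 and lst[j - 1] > item:
            lst[j] = lst[j - 1]
            j -= 1
        lst[j] = item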

Merge sort is a recursive algorithm that breaks a list down into sub-lists and recursively sorts them, using a base case of a list with one item. It then merges these sub-lists back together to make one sorted list. Since the list is halved O(log n) times and each level of merging takes O(n) work, merge sort is O(n log n).
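Here's how I'd sketch merge sort, with the recursive split and then the merge (again, my own attempt, not the official version):

def merge_sort(lst):
    """Return a new sorted list containing the items of lst."""
    if len(lst) <= 1:  # base case: zero or one items
        return lst[:]
    mid = len(lst) // 2
    left = merge_sort(lst[:mid])
    right = merge_sort(lst[mid:])
    # Merge the two sorted halves into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged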

Quick sort is another recursive algorithm. It chooses a pivot, puts the items less than the pivot on the left and the items greater than the pivot on the right, and then recursively sorts the left and right parts with new pivots. This continues until it reaches a list with at most one item, which is the base case. At that point the whole list is sorted. On average quick sort is O(n log n), but with unlucky pivot choices (say, always picking the first item of an already-sorted list) it degrades to O(n^2).
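And a quick sketch of quick sort, using the first item as the pivot (just one way to pick a pivot; this is my own simplified version that builds new lists instead of sorting in place):

def quick_sort(lst):
    """Return a new sorted list of the items in lst."""
    if len(lst) <= 1:  # base case: nothing left to sort
        return lst[:]
    pivot = lst[0]
    smaller = [x for x in lst[1:] if x < pivot]   # goes on the left
    larger = [x for x in lst[1:] if x >= pivot]   # goes on the right
    return quick_sort(smaller) + [pivot] + quick_sort(larger)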

Saturday 22 March 2014

Week 10: big-Oh

The efficiency of an algorithm you write is obviously something important to consider, especially when it comes to dealing with large amounts of data. It is therefore important to be able to assess what the efficiency will be, and in class we learned how to do this.

We consider the input size to be "n", and the efficiency is how many steps the algorithm must go through with respect to n. If a function takes 2 steps for each item in the input, then it takes 2n steps. It can get much more complex than this, though. Computer scientists primarily focus on the big-Oh, which keeps only the part of the expression that affects the growth the most (so 2n steps is just O(n)).
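For example, a function like this one I made up takes about 2 steps per item, so roughly 2n steps total, which is just O(n):

def double_all(lst):
    result = []
    for item in lst:            # runs n times...
        doubled = item * 2      # ...step 1
        result.append(doubled)  # ...step 2
    return result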

I feel as though it's impossible to stay up to date with this class as well as all my other courses. With the lectures, lab, assignments, exercises, this SLOG and term tests, I always have some compsci to do, but I never feel like I can sit down and really take a while to take in a concept. Hopefully some time will come my way sooner rather than later.



Sunday 16 March 2014

Week 9: BSTs and the big-What?

I have to say CompSci labs are sort of saving me, since they are the only time where I can actually see how to apply what I'm learning, in a setting with support from someone who actually knows what they're doing as well as helpful collaboration with my partner. I thought this week's lab was good because it forced me to think through different ways to write code that does the same thing.


In class we have been learning about binary search trees. These are basically the same as the trees we learned about before, but with a more organized structure. There is a root with a value; to the left are all the values that are less than the root, and on the right are all the values greater than the root. The same holds for each node in the tree.
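Here's how I picture a BST node in Python (the class name and such are my own, not necessarily what lecture used):

class BSTNode:
    """A node in a binary search tree."""
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left    # subtree with values less than self.value
        self.right = right  # subtree with values greater than self.value

# A tiny tree:  5 at the root, 3 on the left, 8 on the right.
root = BSTNode(5, BSTNode(3), BSTNode(8))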


BSTs are most helpful when it comes to sorting, searching or traversing the tree in order. Since there is a set structure for where values will be, these operations are more efficient. It helps me to think that there is a yes-or-no question to be asked at each node: is the value I want larger than this one? Thinking this way makes the code more understandable and easier to write. These "yes or no" questions allow virtually half of the remaining tree to be eliminated at each step when trying to find an item. It is clearly an important structure, and powerful when used well.
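The "yes or no" idea turns into code pretty directly. A sketch of searching, assuming the BSTNode class from above:

def contains(node, value):
    """Return True if value is in the BST rooted at node."""
    if node is None:          # fell off the tree: not found
        return False
    if value == node.value:
        return True
    elif value < node.value:  # "yes or no" question: go left...
        return contains(node.left, value)
    else:                     # ...or right, ignoring the other half entirely
        return contains(node.right, value)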

I didn't understand the big-Oh very well this week, but my plan is to figure that out the best I can on my end. I might have some questions to pose here later if I can't.

Until next time...


Saturday 8 March 2014

Week 8: LinkedLists and such

Well, CompSci isn't going as well as I would like it to. It isn't that I don't understand the material presented in class; when I read code, I understand it. I also understand the concepts presented to us, such as the way data structures like trees and LinkedLists are set up and used. However, when it comes to actually writing code myself and visualizing these concepts, I am definitely struggling. Sometimes I chalk it up to "not being a CompSci person", but other times I feel as though I'm just not putting enough effort in. Either way, something's got to change before my grade dips any lower.

In lecture, we've learned about LinkedLists. These are data structures that are recursive in nature and seem to be quite similar to trees. I do not really understand the difference between the two ways to represent them, though. The "wrapper approach" uses self.size as well as self.front, but I can't really see a benefit to this. Maybe I've missed something. If anyone wants to comment further explaining the difference between these two representations, it would be appreciated.
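Here's my best guess at the difference, sketched out with made-up names: without the wrapper, a list is just a chain of nodes, while the wrapper adds an object that remembers the front node and the size so they never have to be recomputed.

class LLNode:
    """A single node: some data (head) plus a reference to the next node (tail)."""
    def __init__(self, head, tail=None):
        self.head = head
        self.tail = tail

class LinkedList:
    """The "wrapper approach": an object that tracks the front node and the size."""
    def __init__(self):
        self.front = None
        self.size = 0

    def prepend(self, value):
        # The wrapper can update size in O(1) instead of re-counting the nodes.
        self.front = LLNode(value, self.front)
        self.size += 1

If that's right, the benefit would be that asking for the size is O(1) instead of walking the whole chain. But again, I may have missed something.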

LinkedLists have a head, which holds the data for the node, and a pointer to another node, the tail. Almost all the functions we wrote for this type of structure were recursive, because LinkedLists are recursive on their own. From what I can tell, this is a useful datatype because it can grow in different ways and can be as large as you want it to be.
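For example, a recursive length function on the node chain from my sketch above (my own code):

def length(node):
    """Return the number of nodes in the chain starting at node."""
    if node is None:              # base case: an empty list
        return 0
    return 1 + length(node.tail)  # one node plus the length of the rest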


Even though I am getting almost completely comfortable with recursion, sometimes I find it hard to think about how recursion would work when trying to write a function. I found a site that helped me practice recursion, from the basics up to some more complicated examples. It is called CodeAcademy, and it might come in handy for any fellow programmers-in-training.

That's all for now, except to say assignment 2 part 1 wasn't really all that bad. I was surprised at how much I could do.

Sunday 2 March 2014

Week 7: Recursion

Learning the technique of recursion has made up a hefty amount of CSC148 so far. Upon hearing what it was for the first time, a function that calls itself within its own code, I thought it seemed relatively simple and I did not see much strength in it as a programming technique. Essentially, a recursive function defines itself in terms of itself, something I'd always thought was not allowed, and valueless. Yet after many, many examples in lecture and labs, I can now see both its complexity and its importance. Although I say "complex", code is actually simplified by the use of recursion, which breaks a problem up into sub-problems and handles them all within one function. The concept of recursion, however, is more complex to understand than I first realized.

At first, I thought recursion was basically a while loop, in that it keeps calling the function over and over. Just like a while loop, if not implemented correctly, a recursive function will run forever. However, when it is controlled using a base case, recursion works very well. The base case is virtually the function's "way out": it prevents the function from just being circular by returning a result directly instead of calling itself again.
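The classic example (my own sketch of it) shows the "way out" pretty clearly:

def factorial(n):
    """Return n! for a non-negative integer n."""
    if n == 0:                   # base case: the "way out"
        return 1
    return n * factorial(n - 1)  # recursive case: a smaller sub-problem

factorial(5)  # -> 120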

Recursion, from what I can tell, is never completely necessary. There are ways to program without it, mostly through for and while loops, yet sometimes it seems natural to use a recursive function because it can break a problem up into smaller pieces. Basically, the solution to the problem is built from the solutions to smaller sub-parts of that problem as the function moves closer to the base case.
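A case where recursion felt natural to me is nested lists, since a plain loop can't easily reach arbitrarily deep (my own sketch):

def nested_sum(obj):
    """Return the sum of the numbers in obj, a number or a (possibly nested) list."""
    if isinstance(obj, list):
        # Solve the smaller sub-problems and combine their answers.
        return sum(nested_sum(x) for x in obj)
    return obj  # base case: a plain number

nested_sum([1, [2, [3, 4]], 5])  # -> 15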


Recursion is a good way to write code because it is more compact and can work through a large problem without very complex code. It divides and conquers a problem and then combines the solutions to all of those parts. I can definitely see now how important recursion is and how powerful a technique it is in Python.

Saturday 15 February 2014

Week 6: Trees and Recursion

Assignment 1 actually went surprisingly well. After reading the sheet for the first time, I definitely considered dropping this class because I felt like I'd fallen behind too much. I was feeling like compsci and I hadn't really gotten close enough for me to complete something like that. However, after talking out the assignment with my partner, I realized what the overall idea for the program was. Once I had a clear thought process in my head and had thought about how the data structures would work for the stools and such, I could see what code was needed. Step 5 did take a bit of time, sweat and tears, but I think the assignment went quite well. Seeing recursion work in a program that I partially designed was fulfilling as well as educational.

This week in lecture we went over trees, which use recursion. I found learning about recursion in this way quite helpful. I don't really understand yet what the value of learning about trees is, or how they can be used, but I'll definitely be finding that out in the future.
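To write down what I did pick up: a tree is basically a value plus a list of subtrees, and functions on trees recurse into the children. A rough sketch, with names of my own choosing:

class Tree:
    """A tree node with any number of children."""
    def __init__(self, value, children=None):
        self.value = value
        self.children = children if children is not None else []

def count_nodes(t):
    # One for this node, plus the counts from all the subtrees.
    return 1 + sum(count_nodes(c) for c in t.children)

count_nodes(Tree(1, [Tree(2), Tree(3, [Tree(4)])]))  # -> 4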

The lab went well. My partner and I went through the steps quite quickly, since we both seem to understand how to apply recursion much more soundly than before. My one problem is that I don't always think to use recursion if I'm not explicitly told (such as in lab) to "solve the problem by writing recursive code". I hope to eventually get to the point where I can see on my own that I need to use recursion. I am able to trace recursive code in my head and I understand how it works; I just need to work on realizing when it is useful.

Well, that's all my thoughts on compsci for this week. Comments are welcome.

Sunday 9 February 2014

Week 5

I thought the lab went quite well this week. I struggled a bit with the initial concept, but once I could visualize what the function was meant to do, I could figure out the code with my partner. 

The assignment is definitely challenging. I should have started it earlier, especially since I feel as though I'm not quite as good at coding as I should be for such an assignment. I definitely have some practicing and catching up to do before the midterm later this month.

I'm finally getting comfortable with classes and subclasses though, and I get the overall idea of inheritance and how it makes code simpler and less repetitive.
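To jot down a toy example of what I mean (made-up classes, not from the course), the subclass reuses the superclass's code instead of repeating it:

class Shape:
    def __init__(self, name):
        self.name = name

    def describe(self):
        return "I am a " + self.name

class Circle(Shape):
    def __init__(self, radius):
        super().__init__("circle")  # reuse Shape's initializer
        self.radius = radius

print(Circle(2.0).describe())  # the inherited method: "I am a circle"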