Wednesday 2 April 2014

sorting and efficiency

Sorting is an algorithm that puts the elements of a list in a certain order (Wikipedia). Different sorting algorithms have different efficiency, and the efficiency of a given sorting algorithm can vary among its best case, average case, and worst case. Efficiency is important when choosing a sorting algorithm, so in this post I will list the efficiency of some common sorts.

For bubble sort, the best, average, and worst case running times are O(n), O(n^2), and O(n^2); insertion sort has the same complexities as bubble sort. For merge sort, the best, average, and worst cases are all O(n log n). For binary tree sort, the average case is O(n log n), while the worst case is O(n^2) if the tree becomes unbalanced (it stays O(n log n) with a balanced tree). For quick sort, the best and average cases are both O(n log n), while its worst case is pretty horrible: O(n^2).

From this we can see that bubble sort is a simple but highly inefficient sorting algorithm, so it is rarely used unless the data is small or already nearly sorted. Compared with bubble sort, insertion sort is more efficient when applied to small and mostly sorted lists. For large lists, merge sort can be a good choice since its worst case is O(n log n), and it is easy to apply to lists because it only requires sequential access, not random access. Choosing an appropriate sorting method can be difficult since there is a lot to think about, such as the data size and whether the data is already partly sorted, so I expect to become more familiar with these trade-offs as we do more coding in the days ahead.
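To make the complexity claims above a bit more concrete, here is a small Python sketch of insertion sort and merge sort that I wrote for illustration (the function names insertion_sort and merge_sort are my own choices, not code from lecture):

def insertion_sort(lst):
    # O(n^2) in the worst case, but close to O(n) on an almost sorted list.
    for i in range(1, len(lst)):
        key = lst[i]
        j = i - 1
        # Shift larger elements one position to the right.
        while j >= 0 and lst[j] > key:
            lst[j + 1] = lst[j]
            j -= 1
        lst[j + 1] = key
    return lst

def merge_sort(lst):
    # O(n log n) in the best, average, and worst case.
    if len(lst) <= 1:
        return lst
    mid = len(lst) // 2
    left = merge_sort(lst[:mid])
    right = merge_sort(lst[mid:])
    # Merge the two sorted halves using only sequential access.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(insertion_sort([5, 2, 4, 6, 1, 3]))        # [1, 2, 3, 4, 5, 6]
print(merge_sort([38, 27, 43, 3, 9, 82, 10]))    # [3, 9, 10, 27, 38, 43, 82]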



Sunday 2 March 2014

week 7 recursion

The midterm storm has finally passed, and I now have some time to write something about recursion in Python programming at last. Recursion, a way of writing code in which a function calls itself one or more times, is said to be one of the most magical and important concepts in Python. From my perspective, recursion is a method that solves a problem by working on smaller instances of the same problem. It is an easy and elegant way of coding in my eyes, since a problem can sometimes be solved with only a few lines of code, or even one, which is really attractive to me because I am always too lazy to write loops. However, according to a friend of mine who studies ECE, recursion may not be as efficient as a loop in some cases, which made me pretty curious, so I did some Google research. It turns out that what my friend said is true. The website I read uses the Fibonacci sequence as an example. The recursive version they write is as follows:
def fib(n):
    # Plain recursive Fibonacci: recomputes the same subproblems many times.
    if n == 0:
        return 0
    elif n == 1:
        return 1
    else:
        return fib(n - 1) + fib(n - 2)

They also give an iterative version:
def fibi(n):
    # Iterative Fibonacci: a single loop, no repeated work.
    a, b = 0, 1
    for i in range(n):
        a, b = b, a + b
    return a


After that they write a script that uses the timeit module to measure the speed of both versions. The result is pretty surprising: fibi(40) is about 13 million times faster than fib(40), and according to their analysis (drawn as a tree of calls), that is because the recursive version keeps no memory of previously calculated values. This pitfall can be fixed by using a dictionary to save previously calculated values. The website I browsed is www.python-course.eu/python3_recursive_functions.php, where you can find the code and analysis above. If you are interested, you can also check their timeit data and call-tree analysis, which even helps a little with our tree knowledge.
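To see what the dictionary fix looks like, here is a small sketch I wrote myself rather than code taken from that site (the names fib_memo and memo are my own): the dictionary saves each previously calculated value, so each fib_memo(k) is computed only once, and timeit can confirm it stays fast even for fib_memo(40).

import timeit

memo = {0: 0, 1: 1}

def fib_memo(n):
    # Look up previously calculated values instead of recomputing them.
    if n not in memo:
        memo[n] = fib_memo(n - 1) + fib_memo(n - 2)
    return memo[n]

print(fib_memo(40))  # 102334155, returned almost instantly
print(timeit.timeit("fib_memo(40)",
                    setup="from __main__ import fib_memo",
                    number=1000))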