Base Case:
T(n) ≤ c for all sufficiently small n, where c is a constant.

General Case:
T(n) ≤ aT(n/b) + O(n^d)
- a is the number of recursive calls
- b is the factor by which the problem size shrinks. It must be greater than one or the problem will never get any smaller
- d is the exponent on the combining (non-recursive) work. It can be as small as 0, which means the combining step takes constant time; d = 1 means linear time.
a, b, d must all be constants and independent of the input size.
So the master method is: given T(n) ≤ aT(n/b) + O(n^d),
- Case 1: if a = b^d, then T(n) = O(n^d log n)
- Case 2: if a < b^d, then T(n) = O(n^d)
- Case 3: if a > b^d, then T(n) = O(n^(log_b a))

In the first case, the base of the logarithm doesn't matter, because changing the base only changes the value by a constant factor. In the third case, however, the base matters a lot: the logarithm sits in the exponent, so it can make the difference between, say, linear and quadratic time.
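As a quick sanity check, here is a minimal Python sketch of the classification itself. The function name master_method_bound and the output strings are illustrative inventions for this post, not part of any library:

```python
import math

def master_method_bound(a, b, d):
    """Classify T(n) <= a*T(n/b) + O(n^d) and return the asymptotic bound."""
    if a == b ** d:
        # Case 1: every level of the recursion tree does about the same work,
        # so the n^d per-level work picks up an extra log factor.
        return f"O(n^{d} log n)"
    elif a < b ** d:
        # Case 2: work shrinks geometrically with depth, so the root dominates.
        return f"O(n^{d})"
    else:
        # Case 3: work grows geometrically with depth, so the leaves dominate.
        return f"O(n^{math.log(a, b):.2f})"

print(master_method_bound(2, 2, 1))  # MergeSort:     O(n^1 log n) = O(n log n)
print(master_method_bound(1, 2, 0))  # Binary search: O(n^0 log n) = O(log n)
```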
Let's look at MergeSort as an example. We know T(n) = 2T(n/2) + O(n). We have a = 2, b = 2, and d = 1. Since a = b^d, we are in case 1, so the running time is O(n^d log n), which is just O(n log n).
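For concreteness, here is a standard top-down MergeSort sketch (ordinary textbook code, not code from the course): the two recursive calls give a = 2, the halving gives b = 2, and the merge loop is the O(n) combine step, so d = 1.

```python
def merge_sort(arr):
    """Top-down MergeSort: a = 2 recursive calls on halves (b = 2),
    plus a linear-time merge (d = 1)."""
    if len(arr) <= 1:                 # base case: constant work
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])      # first recursive call on n/2 elements
    right = merge_sort(arr[mid:])     # second recursive call on n/2 elements
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # linear-time merge
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```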
For binary search, we split the problem into two halves of size n/2 each, but we recurse on only one of them, and comparing the element we're searching for with the middle element takes constant time. Thus we have a = 1, b = 2, d = 0, and a = b^d again, so the overall runtime is O(n^0 log n) = O(log n).
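And a corresponding binary search sketch (again illustrative code, with names of my choosing): there is only a = 1 recursive call on half the input (b = 2), and the work outside that call is a single comparison (d = 0).

```python
def binary_search(arr, target):
    """Recursive binary search on a sorted list; returns an index or -1."""
    def search(lo, hi):
        if lo > hi:                      # empty range: not found
            return -1
        mid = (lo + hi) // 2
        if arr[mid] == target:           # constant-time comparison (d = 0)
            return mid
        elif arr[mid] < target:
            return search(mid + 1, hi)   # single recursive call, right half
        else:
            return search(lo, mid - 1)   # single recursive call, left half
    return search(0, len(arr) - 1)
```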