Logarithms
We review some properties of logarithms and then see why they are useful in CSE 331.
If we do not specify the base of a logarithm, it is always to base 2.
Definition
We begin with the definition of logarithm:
Logarithm
We say that a number b equals log_a c (where a, c > 0 and a ≠ 1) if and only if a^b = c.
In this note, we briefly present some properties of logarithms. For a more comprehensive tutorial on logarithms (and their properties), take a look at this Khan Academy tutorial.
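As a quick sanity check of the definition, here is a small Python sketch (the specific numbers are just an illustration): if c = a^b, then log_a c should come back out as b.

```python
import math

# Pick a = 2 and b = 5, so c = a^b = 32; then log_2(32) should be 5.
a, b = 2, 5
c = a ** b  # c = 32

# math.log(c, a) computes log base a of c; compare with isclose
# since floating-point logs are not exact in general.
assert math.isclose(math.log(c, a), b)
```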
The first implication
We now consider one implication of logarithms, which will be used a few times in CSE 331:
Lemma 1
Let n ≥ 1 be an integer and let x ≥ 0 be the smallest integer such that n/2^x < 1. Then x ≤ log_2 n + 1; in particular, x = O(log_2 n).
Proof Idea
We will use the definition of log_2 n and the fact that 2^x is increasing in x.
Proof Details
By definition we have 2^(log_2 n) = n. Since 2^x is increasing in x, we have n/2^x < 1 if and only if 2^x > 2^(log_2 n), i.e. if and only if x > log_2 n. The smallest integer with this property is ⌊log_2 n⌋ + 1, which is at most log_2 n + 1, as desired.
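Lemma 1 can also be checked computationally. The following Python sketch finds the smallest such x by brute force and compares it against the bound:

```python
import math

def smallest_x(n):
    """Return the smallest integer x >= 0 with n / 2**x < 1."""
    x = 0
    while n / 2 ** x >= 1:
        x += 1
    return x

# Lemma 1 predicts x = floor(log_2(n)) + 1 <= log_2(n) + 1.
for n in [1, 2, 7, 16, 1000]:
    x = smallest_x(n)
    assert x == math.floor(math.log2(n)) + 1
    assert x <= math.log2(n) + 1
```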
But why should I care?
It turns out that Lemma 1 is useful in proving asymptotic bounds in runtime analysis. For example, consider the well-known binary search algorithm:
Binary Search
// Input: A[0],...,A[n-1]; v (A is sorted)
left = 0;
right = n - 1;
while (left <= right)
{
    mid = ceil((right + left) / 2);
    if (A[mid] == v)
        return mid;          // v is present
    else if (A[mid] < v)
        left = mid + 1;      // Search for v in right half
    else
        right = mid - 1;     // Search for v in left half
}
return -1;                   // v is not present
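Here is a runnable Python sketch of the pseudocode above (mid is taken as the ceiling of the midpoint, matching the pseudocode):

```python
import math

def binary_search(A, v):
    """Return an index of v in sorted list A, or -1 if v is absent."""
    left, right = 0, len(A) - 1
    while left <= right:
        mid = math.ceil((left + right) / 2)
        if A[mid] == v:
            return mid          # v is present
        elif A[mid] < v:
            left = mid + 1      # search for v in the right half
        else:
            right = mid - 1     # search for v in the left half
    return -1                   # v is not present
```

For example, `binary_search([1, 3, 5, 7, 9], 7)` returns 3, and searching for an absent value such as 4 returns -1.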
It turns out that Lemma 1 is crucial in proving the following well-known bound on the runtime of binary search:
Theorem
Binary search runs in time O(log_2 n).
Proof Idea
The runtime of binary search is dominated by the number of iterations of the while loop. We bound the latter by O(log_2 n) with the help of Lemma 1, by observing that the quantity right - left is at least halved in each iteration.
Proof Details
We first note that all steps outside of the while loop take O(1) time. Further, each iteration also takes O(1) time, since each step in an iteration is a basic operation such as a comparison, an assignment, or a simple arithmetic operation. Thus, if we can argue that the while loop runs O(log_2 n) times, we are done.
To bound the number of iterations of the while loop, let ℓ_i and r_i denote the values of left and right at the start of the i-th iteration. Further, we keep track of p_i = r_i − ℓ_i. Note that p_0 = n − 1 and that, for the smallest integer x such that p_x < 1, the algorithm terminates by the end of the x-th iteration. This is because at the start of that iteration left <= right, but if the statement return mid is not executed, then one of the updates left = mid + 1 or right = mid - 1 ensures that at the start of the next iteration left > right, and hence the algorithm stops. (Note that if return mid is executed, the algorithm also terminates.)
Finally, we argue that x = O(log_2 n). Towards this end, consider the i-th iteration for some i ≤ ⌈log_2 n⌉. Either return mid is executed (in which case the algorithm terminates and we are done), or one of the updates left = mid + 1 or right = mid - 1 is executed. In the latter case we claim that p_{i+1} ≤ p_i/2. Assuming the claim, a simple induction shows that p_i ≤ (n − 1)/2^i, and so by Lemma 1 the while loop runs at most ⌊log_2(n − 1)⌋ + 1 = O(log_2 n) times, as desired.
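As an informal check of the Theorem (an experiment, not a proof), one can count loop iterations directly and compare against the logarithmic bound; the slack constant 2 below is just a convenient safe margin:

```python
import math

def binary_search_iterations(n, v):
    """Count while-loop iterations when searching for v in [0, 1, ..., n-1]."""
    A = list(range(n))
    left, right, iters = 0, n - 1, 0
    while left <= right:
        iters += 1
        mid = math.ceil((left + right) / 2)
        if A[mid] == v:
            break
        elif A[mid] < v:
            left = mid + 1
        else:
            right = mid - 1
    return iters

# The iteration count stays within log_2(n) + 2 on these inputs,
# including searches for values that are not present.
for n in [1, 2, 10, 100, 1024]:
    for v in [-1, 0, n // 2, n - 1, n]:
        assert binary_search_iterations(n, v) <= math.log2(n) + 2
```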
The base change formula
The following well-known result also turns out to be useful:
Base Change Lemma
For a, b, c > 0 (with b, c ≠ 1), we have log_b a = (log_c a) / (log_c b).
See e.g. this Khan Academy video for a proof.
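A quick numeric check of the Base Change Lemma (the particular values are arbitrary):

```python
import math

# log_3(81) should equal log_2(81) / log_2(3); both are 4.
a, b, c = 81, 3, 2
lhs = math.log(a, b)                   # log_3(81)
rhs = math.log(a, c) / math.log(b, c)  # log_2(81) / log_2(3)
assert math.isclose(lhs, rhs)
```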
The base change lemma implies the following result (which is left as an exercise):
Exercise 1
Argue that for any constant base b > 1, log_b n = O(log_2 n).
The above allows us to use the following convention in CSE 331 (which you have followed so far in, say, CSE 250 too):
Convention
Since for every constant b, O(log_b n) is the same as O(log_2 n), we will denote O(log_b n) for any constant b by O(log n); i.e. we will not explicitly state the base, since asymptotically they are all the same.
But why do we need another base?
It turns out that sometimes we do need to consider logarithms to bases other than 2. For example, consider the following variant of the binary search algorithm, where c is a constant strictly greater than 1:
Modified Binary Search
// Input: A[0],...,A[n-1]; v (A is sorted)
left = 0;
right = n - 1;
while (left <= right)
{
    mid = ceil((right + left) / c);  // Note that c has replaced 2
    if (A[mid] == v)
        return mid;
    else if (A[mid] < v)
        left = mid + 1;
    else
        right = mid - 1;
}
return -1;  // v is not present
It turns out that using c instead of 2 to define mid does not change the asymptotic runtime:
Exercise 2
Prove that the modified binary search runs in O(log n) time.
Hint: Modify the arguments for the Theorem and Lemma 1, and use Exercise 1.
Another use of logarithms
One can also use logarithms to compare running times. Towards this end, the following simple observation is useful:
Lemma 3
For any a, b > 0 (with b ≠ 1), we have a = b^(log_b a).
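A quick numeric check of Lemma 3 on a few arbitrary pairs:

```python
import math

# Lemma 3: a = b ** log_b(a). Compare with isclose since
# floating-point logs and powers are not exact in general.
for a, b in [(81, 3), (10, 2), (7, 5)]:
    assert math.isclose(a, b ** math.log(a, b))
```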
We illustrate the usefulness of Lemma 3 with an example.
Which runtime is better?
Say I tell you there are two algorithms that solve the same problem, but on inputs of size n, one runs in time T_1(n) = n^(log n) while the other runs in time T_2(n) = (log n)^n. Which one should you choose?
Before we proceed to answer the question above, we state another property of logarithms that is useful:
Lemma 4
For any a, b > 0 and any c, we have log_b(a^c) = c · log_b a.
See e.g. this Khan Academy video for a proof.
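A quick numeric check of Lemma 4 (again with arbitrary values):

```python
import math

# Lemma 4: log_b(a^c) = c * log_b(a).
# With a = 8, b = 2, c = 3: log_2(8^3) = log_2(512) = 9 = 3 * log_2(8).
a, b, c = 8, 2, 3
assert math.isclose(math.log(a ** c, b), c * math.log(a, b))
```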
Now let us return to our problem. One might think that, since the base n is larger than the base log n, T_1(n) grows faster than T_2(n). And one would be wrong. To see this, use Lemmas 3 and 4 and note that T_1(n) = 2^((log n)^2) while T_2(n) = 2^(n · log log n). Since (log n)^2 grows much more slowly than n · log log n, T_2(n) is asymptotically (much) larger, and so you should choose the first algorithm.
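To make the comparison concrete, one can compare the two exponents numerically for a few sample values of n; the exponent of T_2 quickly dwarfs that of T_1:

```python
import math

# Compare the exponents of T_1(n) = 2^((log n)^2) and
# T_2(n) = 2^(n * log log n) for growing n.
for n in [2 ** 10, 2 ** 20, 2 ** 30]:
    e1 = math.log2(n) ** 2            # exponent of T_1(n) = n^(log n)
    e2 = n * math.log2(math.log2(n))  # exponent of T_2(n) = (log n)^n
    assert e1 < e2                    # T_2 is (much) larger
```

For instance, at n = 2^10 the first exponent is 100 while the second is already over 3000.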