# Recurrence relation - when division strategy differs based on whether n is even or odd

- Author: Ajay Karthick Senthil Kumar (@i_ajaykarthick)

The idea of divide and conquer is quite appealing. We divide the problem into subproblems that are smaller instances of the original problem, solve them recursively, and finally combine the solutions to the subproblems into a solution to the original problem.

When an algorithm contains a recursive call to itself, we can describe the running time by a recurrence equation.

Often, divide and conquer algorithms form subproblems by shrinking the input size, either multiplicatively (for example, $n \times \frac{1}{2}$, $n \times \frac{2}{3}$) or by subtraction (for example, $n - 1$, $n - 2$). Here, we are going to see an interesting case in which an algorithm changes its strategy for breaking the problem into subproblems based on whether the size of the input is odd or even.

Pseudo code of this kind of algorithm may look like:

```
function f(n) {
  if (n == 0) return 1
  if (n % 2 == 0) return f(n / 2) + 1
  else return 2 * f(n - 1)
}
```
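As a concrete reference point, here is a direct Python translation of the pseudocode, with an optional call counter added for instrumentation (the counter is an addition for illustration, not part of the original function):

```python
# Python translation of the pseudocode above, with an optional counter
# so we can measure how many recursive calls are made.

def f(n, counter=None):
    if counter is not None:
        counter[0] += 1
    if n == 0:
        return 1
    if n % 2 == 0:
        return f(n // 2, counter) + 1  # even: recurse on n / 2
    return 2 * f(n - 1, counter)       # odd: recurse on n - 1

calls = [0]
f(100, calls)
print(calls[0])  # prints 10: the recursion visits 100, 50, 25, 24, 12, 6, 3, 2, 1, 0
```

The sequence of visited values already hints at the analysis below: halving and subtracting by one alternate depending on parity.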

Here, we are not interested in the output of the above function but in its time complexity, and especially in the approach to solving the recurrence equation for algorithms that follow this kind of pattern.

The recurrence equation for the above function can be written as follows:

$$T(n) = \begin{cases} c & \text{if } n = 0 \\ T\!\left(\dfrac{n}{2}\right) + c & \text{if } n > 0 \text{ and } n \text{ is even} \\ T(n - 1) + c & \text{if } n > 0 \text{ and } n \text{ is odd} \end{cases}$$

It means that whenever $n$ is greater than 0 and divisible by 2 (when $n$ is an even number), the function recursively calls itself with half the value of $n$, and whenever $n$ is greater than 0 and not divisible by 2 (when $n$ is an odd number), the function recursively calls itself with $n - 1$. Here, $c$ denotes the constant amount of work done in each call.

$n$ can possibly be any integer greater than or equal to zero.

- When $n = 0$, it takes a constant amount of operations.
- When $n$ is odd, the algorithm reduces the value of $n$ and calls recursively with $n - 1$. As $n$ is an odd number, it is guaranteed that $n - 1$ is an even number.
- When $n$ is even, the algorithm reduces the value of $n$ by halving it and calls recursively with $\displaystyle\frac{n}{2}$. Knowing only that $n$ is even, we cannot conclude whether $\displaystyle\frac{n}{2}$ is odd or even: it is even exactly when $n$ is divisible by 4. Whether repeated halving keeps producing even numbers depends on whether $n$ is a power of 2.
- When $n$ is a power of 2, i.e., $n = 2^k$, it is obviously an even number and $\displaystyle\frac{n}{2}$ is definitely an even number for all $n > 2$.

So, we can conclude that when $n$ is a power of 2 and greater than 2, i.e., $n = 2^k$ and $k > 1$, $\displaystyle\frac{n}{2}$ is even; otherwise, $\displaystyle\frac{n}{2}$ may be odd.

So, there are four cases for $n$:

- $n = 0$
- $n$ is even and a power of 2, i.e., $n = 2^k$ for some $k \geq 1$
- $n$ is even but not a power of 2
- $n$ is odd

### $n$ is even and a power of 2

Case 2: let $n = 2^k$ for some $k \geq 1$.

We have:

$$T(n) = T\!\left(\frac{n}{2}\right) + c$$

By applying this recursively:

$$T(2^k) = T(2^{k-1}) + c = T(2^{k-2}) + 2c = \cdots$$

Continuing this way until we reach $T(2^0)$:

$$T(2^k) = T(2^0) + kc = T(1) + kc$$

Assuming $T(1) = T(0) + 1$ and $T(0) = 1$ (that is, taking $c = 1$):

$$T(2^k) = T(0) + 1 + k$$

Therefore:

$$T(2^k) = k + 2$$

Since $k = \log_2 n$, we have:

$$T(n) = \log_2 n + 2$$

Thus, the time complexity for this case is $O(\log n)$.
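The closed form for this case can be checked empirically by counting recursive calls, taking each call as one unit of work (the counting helper below is illustrative, not part of the original algorithm):

```python
# Count the recursive calls made by f (from the pseudocode) and check
# the closed form T(2^k) = k + 2 derived above, with one unit per call.

def count_calls(n):
    if n == 0:
        return 1                        # base case costs one call
    if n % 2 == 0:
        return count_calls(n // 2) + 1  # even: halve
    return count_calls(n - 1) + 1       # odd: subtract 1

for k in range(1, 11):
    assert count_calls(2 ** k) == k + 2  # matches T(2^k) = k + 2
print("closed form verified for k = 1 .. 10")
```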

### $n$ is even but not a power of 2, or $n$ is odd

Cases 3 and 4:

#### $n$ is odd

When $n$ is odd:

$$T(n) = T(n - 1) + c$$

Since $n - 1$ is even, we can then apply:

$$T(n - 1) = T\!\left(\frac{n - 1}{2}\right) + c$$

Therefore:

$$T(n) = T\!\left(\frac{n - 1}{2}\right) + 2c$$

#### $n$ is even but not a power of 2

When $n$ is even but not a power of 2, we repeatedly apply the recurrence:

$$T(n) = T\!\left(\frac{n}{2}\right) + c$$

However, $\displaystyle\frac{n}{2}$ may be odd or even. We continue this process, reducing $n$ until it becomes either 1 or 0.
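To see how the two rules interleave, we can trace the sequence of subproblem sizes (the helper below is a sketch for illustration; the starting value 12 is just an example of an even non-power-of-2):

```python
# Trace the subproblem sizes produced by the recurrence, starting from
# an n that is even but not a power of 2.

def trace(n):
    seq = [n]
    while n > 0:
        n = n // 2 if n % 2 == 0 else n - 1  # even: halve; odd: subtract 1
        seq.append(n)
    return seq

print(trace(12))  # [12, 6, 3, 2, 1, 0]: halving 6 lands on the odd number 3
```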

### Generalization Using Binary Representation

An effective way to analyze the recurrence is by considering the binary representation of $n$.

- Each time we halve $n$ (when $n$ is even), it corresponds to a right shift in binary.
- Each time we subtract 1 (when $n$ is odd), it flips the least significant bit from 1 to 0.
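These two observations are easy to confirm with Python's bit operators (the specific numbers are just examples):

```python
# Halving an even number is a right shift; subtracting 1 from an odd
# number clears its least significant bit.

n = 100                     # binary 1100100 (even)
assert n >> 1 == n // 2 == 50

m = 25                      # binary 11001 (odd)
assert m - 1 == 0b11000     # 24: the lowest 1 bit is flipped to 0
print("bit-level view confirmed")
```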

We can model the total time $T(n)$ as the sum of:

- The number of times we halve $n$: at most the number of bits in the binary representation of $n$, since each halving removes one bit.
- The number of times we subtract 1: equal to the number of ones in the binary representation of $n$.

Let:

- $\text{BitLength}(n) = \lfloor \log_2 n \rfloor + 1$
- $\text{HammingWeight}(n) = \text{number of ones in binary representation of } n$
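In fact, with the convention that each call costs one unit of work, the call count equals $\text{BitLength}(n) + \text{HammingWeight}(n)$ exactly for every $n \geq 1$, which we can verify by brute force (the counting helper below is illustrative, not from the original):

```python
# Verify that the number of recursive calls of f equals
# BitLength(n) + HammingWeight(n) for n >= 1 (one unit per call).

def count_calls(n):
    if n == 0:
        return 1
    if n % 2 == 0:
        return count_calls(n // 2) + 1
    return count_calls(n - 1) + 1

for n in range(1, 10_000):
    bit_length = n.bit_length()          # floor(log2 n) + 1
    hamming_weight = bin(n).count("1")   # ones in n's binary form
    assert count_calls(n) == bit_length + hamming_weight
print("identity holds for n = 1 .. 9999")
```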

Then, the total running time satisfies:

$$T(n) \le c_1 \cdot \text{BitLength}(n) + c_2 \cdot \text{HammingWeight}(n)$$

for some constants $c_1$ and $c_2$.

Since both $\text{BitLength}(n)$ and $\text{HammingWeight}(n)$ are $O(\log n)$, we have:

$$T(n) = O(\log n)$$

### Conclusion

The time complexity of the algorithm is $O(\log n)$.

**Key Takeaways:**

- The recurrence alters its strategy based on whether $n$ is even or odd.
- By analyzing the binary representation of $n$, we can determine the time complexity.
- The total time complexity depends on both the length of $n$ in binary and the number of ones in its binary representation.
- The algorithm runs in logarithmic time relative to the input size $n$.

**Final Result:**

$$T(n) = O(\log n)$$