Recursion in algorithms occurs when a function or algorithm ‘calls’ itself. It can be considered an alternative approach to iteration.
Recursive algorithms can sometimes be easier to understand, but a main drawback is the memory occupied on the call stack by the nested recursive function calls.
Every recursive function must include a base/terminating case, which stops it from calling itself forever.
Note: every recursive function can be converted into an iterative function, and vice versa.
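As a small sketch of this conversion (the function names and the choice of summing 1..n are my own, for illustration), the same computation written both ways:

```python
def sum_recursive(n):
    if n == 0:                       # base case
        return 0
    return n + sum_recursive(n - 1)  # recursive call

def sum_iterative(n):
    total = 0
    for i in range(1, n + 1):        # the loop replaces the recursive calls
        total += i
    return total
```

Both return the same result for any non-negative n; the iterative version avoids the stack of recursive calls.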
Structure
def factorial(n): #Definition
    if n == 0: #Base case, note there are no recursive calls
        return 1
    return n*factorial(n-1) #Recursive call
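Tracing a call shows how the recursion expands and then collapses back down to a value (repeating the definition above so the example is self-contained):

```python
def factorial(n):
    if n == 0:
        return 1
    return n * factorial(n - 1)

# factorial(3) expands as:
#   factorial(3) -> 3 * factorial(2)
#                -> 3 * (2 * factorial(1))
#                -> 3 * (2 * (1 * factorial(0)))
#                -> 3 * (2 * (1 * 1)) = 6
print(factorial(3))  # → 6
```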
Call Stack
In programming languages, the call stack stores the currently active function calls. In a recursive function, each recursive call pushes a new frame onto the stack, and frames are only popped once the base case is reached and results propagate back up through the callers.
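One way to visualise this push/pop behaviour is to print the call depth; the `depth` parameter below is my own addition for illustration, not part of the standard factorial definition:

```python
def factorial(n, depth=0):
    indent = "  " * depth
    print(f"{indent}push factorial({n})")   # a new frame goes on the stack
    if n == 0:
        result = 1                           # base case: stack stops growing
    else:
        result = n * factorial(n - 1, depth + 1)
    print(f"{indent}pop  factorial({n}) -> {result}")  # frame is removed
    return result

factorial(3)
```

The output shows all the `push` lines first (the stack growing) and then the matching `pop` lines in reverse order (the stack unwinding).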
Recursion Tree
A recursion tree is a useful way of visualising recursive algorithms. Each node in the tree represents a subproblem, with its size given relative to the original input. Each level of the tree represents the work done across the recursive calls at that depth.
For example, the recursion tree of merge sort:
The size of the recursion tree (its depth and the number of nodes per level) is used when analysing a recursive algorithm's running time, for example when solving recurrence relations.
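A sketch of merge sort makes the tree structure concrete: each call splits into two child calls of half the size, and the merge step does linear work at each level (this is a standard textbook version, not taken from the notes above):

```python
def merge_sort(a):
    if len(a) <= 1:                  # leaf of the recursion tree
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])       # left child: size n/2
    right = merge_sort(a[mid:])      # right child: size n/2
    # merge step: O(n) work in total across each level of the tree
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

With O(n) work per level and about log n levels, the tree gives the familiar O(n log n) total.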
- Factorial Algorithm
- Fibonacci Algorithm
- Master Theorem
- Merge Sort
- Page Rank Algorithm
- Recurrence Relations