For Big O Notation, On Another Page 👉 👉

Big O Notation (With Examples) 🙂

Unlike Big-O notation, which represents only an upper bound on the running time of an algorithm, Big-Theta is a tight bound: both an upper and a lower bound. A tight bound is more precise, but also more difficult to compute.

The **Big-Theta notation is symmetric**: f(x) = Θ(g(x)) <=> g(x) = Θ(f(x))

An intuitive way to grasp it is that f(x) = Θ(g(x)) means that the graphs of f(x) and g(x) grow at the same rate, or that the graphs ‘behave’ similarly for big enough values of x.

The full mathematical expression of the Big-Theta notation is as follows:

Θ(f(x)) = { g: N0 -> R | there exist c1, c2, n0 > 0 such that c1 <= abs(g(n) / f(n)) <= c2 for every n > n0 }, where abs is the absolute value.

**An example**

If an algorithm takes 42n^2 + 25n + 4 operations to finish for input n, we say that it is O(n^2), but it is also O(n^3) and O(n^100). However, it is Θ(n^2), and it is not Θ(n^3), Θ(n^4), etc. An algorithm that is Θ(f(n)) is also O(f(n)), but not vice versa!
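A quick numerical sanity check of this example, sketched in Python (the function f below is the operation count from the example above):

```python
def f(n):
    # Operation count from the example above.
    return 42 * n ** 2 + 25 * n + 4

# If f(n) = Theta(n^2), the ratio f(n) / n^2 must stay between two
# positive constants for all large n (here it approaches 42).
for n in [10, 100, 1000, 10000]:
    print(n, f(n) / n ** 2)

# For n >= 10 the ratio always lies between c1 = 42 and c2 = 45,
# which witnesses f(n) = Theta(n^2).
```

Against n^3 the same ratio goes to 0, which is why f(n) is O(n^3) but not Θ(n^3).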

**Formal mathematical definition**

Θ(g(x)) is a set of functions.

Θ(g(x)) = { f(x) such that there exist positive constants c1, c2, N such that 0 <= c1*g(x) <= f(x) <= c2*g(x) for all x > N }

Because Θ(g(x)) is a set, we could write f(x) ∈ Θ(g(x)) to indicate that f(x) is a member of Θ(g(x)). Instead, we will usually write f(x) = Θ(g(x)) to express the same notion – that’s the common way.

Whenever Θ(g(x)) appears in a formula, we interpret it as standing for some anonymous function that we do not care to name. For example, the equation T(n) = T(n/2) + Θ(n) means T(n) = T(n/2) + f(n), where f(n) is some function in the set Θ(n).
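To see the anonymous-function reading in action, here is a small sketch that takes the Θ(n) term to be literally n (one concrete member of the set): the recurrence then solves to Θ(n), since the total work is n + n/2 + n/4 + … < 2n.

```python
def T(n):
    """T(n) = T(n // 2) + n with T(0) = 0 -- one concrete instance
    of the family T(n) = T(n/2) + Theta(n)."""
    if n == 0:
        return 0
    return T(n // 2) + n

# T(n) is sandwiched between n and 2n, so T(n) = Theta(n).
for n in [16, 1024, 10 ** 6]:
    assert n <= T(n) < 2 * n
```

Any other member of Θ(n) in place of the bare n would change the constants but not the Θ(n) conclusion.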

Let f and g be two functions defined on some subset of the real numbers. We write f(x) = Θ(g(x)) as x -> infinity if and only if there are positive constants K and L and a real number x0 such that:

```
K|g(x)| <= f(x) <= L|g(x)| for all x >= x0
```

The definition is equivalent to:

```
f(x) = O(g(x)) and f(x) = Ω(g(x))
```

**A method that uses limits**

If limit(x->infinity) f(x)/g(x) = c ∈ (0, ∞), i.e. the limit exists and is positive, then f(x) = Θ(g(x)).
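The limit test can be approximated numerically; a sketch (using a large x as a crude stand-in for x -> infinity):

```python
import math

def ratio_at(f, g, x=10.0 ** 8):
    """Crude numerical stand-in for lim f(x)/g(x) as x -> infinity."""
    return f(x) / g(x)

# Case 1: the limit is a positive constant  =>  Theta relationship holds.
r1 = ratio_at(lambda x: 3 * x ** 2 + 5 * x, lambda x: x ** 2)
print(r1)  # close to 3, so 3x^2 + 5x = Theta(x^2)

# Case 2: the limit is 0  =>  f grows strictly slower, so no Theta.
r2 = ratio_at(lambda x: x * math.log(x), lambda x: x ** 2)
print(r2)  # close to 0, so x*log(x) is not Theta(x^2)
```

Note the converse direction needs care: the limit may fail to exist even when f(x) = Θ(g(x)), e.g. for oscillating functions, so this test is sufficient but not necessary.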

**Common Complexity Classes**

| Name | Notation | n = 10 | n = 100 |
|---|---|---|---|
| Constant | Θ(1) | 1 | 1 |
| Logarithmic | Θ(log(n)) | 3 | 7 |
| Linear | Θ(n) | 10 | 100 |
| Linearithmic | Θ(n*log(n)) | 30 | 700 |
| Quadratic | Θ(n^2) | 100 | 10 000 |
| Exponential | Θ(2^n) | 1 024 | 1.27E+30 |
| Factorial | Θ(n!) | 3 628 800 | 9.33E+157 |
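The table entries can be reproduced with a short sketch (assuming base-2 logarithms rounded to the nearest integer, which is what the table appears to use):

```python
import math

def ops(n):
    """Approximate operation counts for each complexity class at input size n."""
    log_n = round(math.log2(n))
    return {
        "constant":     1,
        "logarithmic":  log_n,
        "linear":       n,
        "linearithmic": n * log_n,
        "quadratic":    n ** 2,
        "exponential":  2 ** n,
        "factorial":    math.factorial(n),
    }

print(ops(10))   # log2(10) ~ 3.3 -> 3, 10*3 = 30, 2^10 = 1024, 10! = 3628800
print(ops(100))  # log2(100) ~ 6.6 -> 7, 100*7 = 700, 2^100 ~ 1.27E+30
```

The jump between the quadratic and exponential rows at n = 100 is the practical reason the lower classes are preferred.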

## Comparison of the asymptotic notations

Let f(n) and g(n) be two functions defined on the set of the positive real numbers, and let c, c1, c2, n0 be positive real constants.

The asymptotic notations can be represented on a Venn diagram as follows:

## Big-Omega Notation

Ω-notation is used for an asymptotic lower bound.

**Formal definition**

Let f(n) and g(n) be two functions defined on the set of the positive real numbers. We write f(n) = Ω(g(n)) if

there are positive constants c and n0 such that: 0 ≤ c*g(n) ≤ f(n) for all n ≥ n0.

**Notes**

f(n) = Ω(g(n)) means that f(n) grows asymptotically no slower than g(n). We also use Ω(g(n)) when the analysis of an algorithm is not precise enough to support a statement about Θ(g(n)) and/or O(g(n)).

From the definitions of the notations follows this theorem:

For any two functions f(n) and g(n), we have f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

Graphically, Ω-notation may be represented as follows:

For example, let f(n) = 3n^2 + 5n - 4. Then f(n) = Ω(n^2). It is also correct that f(n) = Ω(n), or even f(n) = Ω(1).
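A sketch of concrete witnesses for the first claim: with c = 3 and n0 = 1 we get 0 ≤ 3n^2 ≤ 3n^2 + 5n - 4 for all n ≥ 1, because 5n - 4 ≥ 1 there.

```python
def f(n):
    return 3 * n ** 2 + 5 * n - 4

# Witnesses for f(n) = Omega(n^2): c = 3, n0 = 1.
# For n >= 1 we have 5n - 4 >= 1 > 0, hence f(n) >= 3 * n^2.
for n in range(1, 1000):
    assert 0 <= 3 * n ** 2 <= f(n)

# The weaker claims follow from the same witnesses:
# f(n) >= 3n^2 >= 3n >= 3, so f(n) = Omega(n) and f(n) = Omega(1) as well.
```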

Another example: consider this algorithm for perfect matching: if the number of vertices is odd, output “No Perfect Matching”; otherwise, try all possible matchings.

We would like to say that the algorithm requires exponential time, but in fact you cannot prove an Ω(n^2) lower bound using the usual definition of Ω, since the algorithm runs in linear time for odd n. We should instead define f(n) = Ω(g(n)) by saying that for some constant c > 0, f(n) ≥ c*g(n) for infinitely many n. This gives a nice correspondence between upper and lower bounds: f(n) = Ω(g(n)) iff f(n) != o(g(n)).

The formal definition and the theorem are taken from the book “Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, Clifford Stein. Introduction to Algorithms”.

