Welcome! In my previous post, I wrote about the use of different notations for Asymptotic (Algorithmic) Analysis. Today, let’s discuss how the analysis is done.

## Why is it called Asymptotic Analysis?

In every case, for a given function, f(n), we define a function g(n) which approximates f(n) at higher values of n. Thus, g(n) is a curve, approximating f(n) at higher values of n. Such a curve is called an Asymptotic Curve. For this reason, the Algorithm Analysis is called Asymptotic Analysis.
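To see what "approximating f(n) at higher values of n" means in practice, here is a quick numeric check in Python. The specific functions f(n) = 3n² + 5n + 7 and g(n) = n² are my own illustrative choice, not taken from the analysis below; the point is only that their ratio settles toward a constant as n grows:

```python
# Hypothetical example: g(n) = n^2 is an asymptotic curve for
# f(n) = 3n^2 + 5n + 7, since f(n)/g(n) approaches the constant 3
# as n gets large.

def f(n):
    return 3 * n**2 + 5 * n + 7

def g(n):
    return n**2

for n in [10, 100, 1000, 10000]:
    print(n, f(n) / g(n))
```

For n = 10 the ratio is still 3.57, but by n = 10000 it is within a fraction of a percent of 3: the lower-order terms have stopped mattering.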

## Rates of Growth

The rate at which the running time increases as a function of the input is called the rate of growth. For example, given a total time of execution

C₀n² + C₁ + (C₂ + C₃)n

the smaller terms and the constants are insignificant for large values of n, so the actual rate of growth is considered to be n².

Thus, the rate of growth is O(n²).

Here is a chart of commonly used rates of growth, from slowest to fastest growing:

| Rate of growth | Name |
| --- | --- |
| 1 | Constant |
| log n | Logarithmic |
| n | Linear |
| n log n | Linear-logarithmic |
| n² | Quadratic |
| n³ | Cubic |
| 2ⁿ | Exponential |
| n! | Factorial |

## Performing Asymptotic Analysis

So, with the theory out of the way, how do we actually perform the analysis?

There are a few general rules I read and have compiled to the best of my abilities. Read on to know more!

#### 1. Consecutive Statements

The running times of consecutive statements are added.

```
// Constant time
a = a + 1
// Executes n times
for(i = 0; i < n; i++){
    b = b + 2 // Constant time
}
```

Total time of execution: c * n + c = cn + c = O(n)
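To make this concrete, here is a small Python sketch of the snippet above (the `count_ops` counter is mine, added just for illustration). It counts the "unit" operations: one assignment before the loop, then one per iteration, giving n + 1 steps, which is O(n) once constants are dropped:

```python
# Count the unit operations of: one constant statement, then an n-step loop.

def count_ops(n):
    ops = 0
    a = 0
    a = a + 1            # constant-time statement
    ops += 1
    b = 0
    for i in range(n):   # executes n times
        b = b + 2        # constant-time body
        ops += 1
    return ops

print(count_ops(10))   # 11 = n + 1
```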

#### 2. Loops

The running time of a loop is the running time of all the statements inside the loop, multiplied by the number of iterations of the loop.

```
// executing n times
for(i = 0; i < n; i++){
    m = m + i; // constant time
}
```

Total time of execution: c * n = cn = O(n)

#### 3. Nested Loops

Nested loops are generally the most expensive part of an algorithm. Analysis is done from the inside out: the total running time is the product of the sizes of all the loops.

```
// outer loop executing n times
for(i = 0; i < n; i++){
    // inner loop executing n times
    for(j = 0; j < n; j++){
        m = m + i; // constant time
    }
}
```

Total time of execution: c * n * n = cn² = O(n²)
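Counting operations makes the quadratic growth visible. In the Python sketch below (the `nested_ops` counter is mine, for illustration), the inner body executes n * n times, so doubling n quadruples the work:

```python
# Count how many times the inner body of two nested n-step loops runs.

def nested_ops(n):
    ops = 0
    for i in range(n):       # outer loop: n iterations
        for j in range(n):   # inner loop: n iterations each
            ops += 1         # constant-time body
    return ops

print(nested_ops(10))   # 100
print(nested_ops(20))   # 400 -- doubling n quadruples the count
```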

#### 4. Logarithmic Complexity

An algorithm has logarithmic complexity if it takes constant time to cut the problem size by a fraction (usually 1/2). As an example, consider the following:

```
for(i = 1; i <= n; i *= 2){
    print i
}
```

The value of i is doubled after each step. So, after the k-th step, 2ᵏ = n, and at the (k + 1)-th step we come out of the loop.

log(2ᵏ) = log(n)

k * log(2) = log(n)

// Assuming base 2

k = log(n)
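The iteration count can also be checked empirically. The Python sketch below (the `doubling_steps` helper is mine, for illustration) reproduces the doubling loop above and compares its step count against ⌊log₂ n⌋ + 1:

```python
import math

# Reproduce the doubling loop and count its iterations: i takes the
# values 1, 2, 4, ..., so the loop runs floor(log2(n)) + 1 times.

def doubling_steps(n):
    steps = 0
    i = 1
    while i <= n:   # same loop as above: i doubles on each pass
        steps += 1
        i *= 2
    return steps

for n in [8, 1024, 10**6]:
    print(n, doubling_steps(n), math.floor(math.log2(n)) + 1)
```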

#### 5. If-else

Worst-case running time is calculated as:

Time(test) + Time of if block or else block, whichever is greater

```
// test: constant (C0)
if(length() == 0){
    return false; // if: constant (C1)
} else {
    // else: (C2 + C3) * n
    for(n = 0; n < length() - 1; n++){ // for loop: n iterations
        // nested test: C2
        if(list[n] == list[n + 1]){
            return false; // if: C3
        }
    }
}
```

Total time of execution: C₀ + C₁ + (C₂ + C₃) * n = O(n)
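The pseudocode above can be turned into a runnable sketch. The Python version below is my adaptation: the function name and the `return True` fall-through are assumptions, since the original fragment doesn't show what happens when no duplicate is found. The worst case is a list with no adjacent duplicates, which forces the full O(n) scan:

```python
# Adaptation of the adjacent-duplicate check above. The loop stops at
# len(lst) - 1 so that lst[n + 1] stays in bounds.

def no_adjacent_duplicates(lst):
    if len(lst) == 0:                  # test: constant time (C0)
        return False                   # if-branch: constant time (C1)
    else:
        # else-branch: up to (C2 + C3) * n work
        for n in range(len(lst) - 1):  # for loop: n iterations
            if lst[n] == lst[n + 1]:   # nested test: C2
                return False           # early exit: C3
    return True

print(no_adjacent_duplicates([1, 2, 3]))   # True  -- worst case, full scan
print(no_adjacent_duplicates([1, 1, 3]))   # False -- early exit
```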

We generally use these guidelines to calculate the time or space complexity of an algorithm.

So, that’s it for now. Stay tuned for more awesome Algorithms!
