## In this article, we explain the analysis of algorithms, which plays a very important role in learning data structures.

### What is an algorithm?

An algorithm is a sequence of unambiguous instructions for solving a problem, i.e., for obtaining a required output for any legitimate input in a finite amount of time.

- Can be represented in various forms
- Unambiguity/clearness
- Effectiveness
- Finiteness/termination
- Correctness
- Recipe, process, method, technique, procedure, routine with the following requirements:
  - Finiteness: terminates after a finite number of steps
  - Definiteness: rigorously and unambiguously specified
  - Clearly specified input: valid inputs are clearly specified
  - Clearly specified/expected output: can be proved to produce the correct output given a valid input
  - Effectiveness: steps are sufficiently simple and basic

### Why study algorithms?

Algorithms are of theoretical importance as the core of computer science, and of practical importance as part of a practitioner’s toolkit of known techniques. They also provide a framework for designing and analyzing algorithms for new problems. Google’s PageRank technology is a well-known example of an algorithm in practice.

#### Computational Problems

- Minimum spanning tree
- Primality testing
- Traveling salesman problem
- Knapsack problem
- Chess
- Towers of Hanoi
- Program termination
- Shortest paths in a graph
- Sorting
- Searching

#### Basic Issues Related to Algorithms

- How to design algorithms
- How to express algorithms
- Proving correctness
- Efficiency (or complexity) analysis
- Theoretical analysis
- Empirical analysis
- Optimality

#### Euclid’s Algorithm
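As a minimal sketch: Euclid’s algorithm computes the greatest common divisor by repeatedly replacing the pair (m, n) with (n, m mod n) until the second number reaches 0.

```python
def gcd(m, n):
    """Euclid's algorithm: repeatedly apply gcd(m, n) = gcd(n, m mod n)."""
    while n != 0:
        m, n = n, m % n
    return m

print(gcd(80, 44))  # → 4
```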

#### Algorithm design techniques

- Brute force
- Divide and conquer
- Decrease and conquer
- Transform and conquer
- Space and time tradeoffs
- Greedy approach
- Dynamic programming
- Iterative improvement
- Backtracking
- Branch and bound

#### Analysis of algorithms

An algorithm is considered good when both its time efficiency and its space efficiency are good. Lower bounds establish the best efficiency any algorithm for a problem can achieve; an algorithm is optimal when no better algorithm exists, i.e., when its efficiency matches the lower bound.

#### Selection Sort
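A sketch of selection sort: on each pass, select the minimum of the unsorted suffix and swap it into place, making Θ(n²) comparisons for every input of size n.

```python
def selection_sort(a):
    """In-place selection sort: repeatedly pick the minimum of a[i..n-1]
    and swap it into position i. Always Theta(n^2) comparisons."""
    n = len(a)
    for i in range(n - 1):
        smallest = i
        for j in range(i + 1, n):
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]
    return a

print(selection_sort([29, 10, 14, 37, 13]))  # → [10, 13, 14, 29, 37]
```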

#### What are Graph Problems and how to solve them?

- Modeling real-life problems
- Modeling WWW
- Communication networks
- Project scheduling …
- Examples of graph algorithms
- Graph traversal algorithms
- Shortest-path algorithms
- Topological sorting

#### Linear Data Structures

**Array**

- fixed length (needs a preliminary reservation of memory)
- contiguous memory locations
- direct access by index
- insert/delete require shifting subsequent elements

#### Linked List

- Singly-linked list (next pointer)
- Doubly linked list (next + previous pointers)
- dynamic length
- arbitrary memory locations
- access by following links
- insert/delete require only pointer updates (no shifting)
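A minimal singly-linked list sketch showing the properties above: nodes live at arbitrary memory locations, insertion at the head is O(1), and access works by following `next` links.

```python
class Node:
    """A node of a singly-linked list: a value plus a next pointer."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

class SinglyLinkedList:
    def __init__(self):
        self.head = None

    def push_front(self, value):
        """Insert at the head: O(1), only a pointer update."""
        self.head = Node(value, self.head)

    def to_list(self):
        """Traverse by following next links: O(n) sequential access."""
        out, node = [], self.head
        while node is not None:
            out.append(node.value)
            node = node.next
        return out

lst = SinglyLinkedList()
for v in [3, 2, 1]:
    lst.push_front(v)
print(lst.to_list())  # → [1, 2, 3]
```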

#### What are Stacks and Queues?

**Priority Queue and Heap**

Priority queues are commonly implemented using heaps. A priority queue is a data structure for maintaining a set of elements, each associated with a key/priority, with the following operations:

- finding the element with the highest priority,
- deleting the element with the highest priority,
- inserting a new element.

A classic application is scheduling programming jobs on a shared computer.
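The three operations can be sketched with Python’s `heapq` module, which maintains a min-heap; here the smallest key is treated as the highest priority. The job names are made-up example data.

```python
import heapq

# Jobs on a shared computer, keyed by priority
# (smaller key = served first, since heapq is a min-heap).
heap = []
heapq.heappush(heap, (2, "backup"))    # insert a new element
heapq.heappush(heap, (1, "compile"))
heapq.heappush(heap, (3, "report"))

print(heap[0])              # find the best-priority element: (1, 'compile')
print(heapq.heappop(heap))  # delete it: (1, 'compile')
print(heapq.heappop(heap))  # next best: (2, 'backup')
```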

**Adjacency matrix**

An n x n boolean matrix, where n = |V|.

The element in the ith row and jth column is 1 if there is an edge from the ith vertex to the jth vertex; otherwise it is 0. The adjacency matrix of an undirected graph is symmetric.

**Adjacency linked lists** are a collection of linked lists, one for each vertex, that contain all the vertices adjacent to the list’s vertex.
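Both representations can be sketched for a small directed graph; the vertices and edges below are made-up example data.

```python
# A small directed graph on vertices 0..3, given as edge pairs (example data).
n = 4
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]

# Adjacency matrix: n x n, matrix[i][j] = 1 iff there is an edge i -> j.
matrix = [[0] * n for _ in range(n)]
for i, j in edges:
    matrix[i][j] = 1

# Adjacency lists: one list per vertex with all adjacent vertices.
adj = [[] for _ in range(n)]
for i, j in edges:
    adj[i].append(j)

print(matrix[0])  # → [0, 1, 1, 0]
print(adj[0])     # → [1, 2]
```

The matrix answers "is there an edge i → j?" in O(1) but uses Θ(n²) space; the lists use space proportional to the number of edges.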

**What are Trees** - A tree is a connected acyclic graph. A forest is a graph that has no cycles but is not necessarily connected.

For every two vertices in a tree, there always exists exactly one simple path from one of these vertices to the other. Why? Connectivity guarantees at least one such path, and if two distinct simple paths joined the same pair of vertices, together they would form a cycle, contradicting acyclicity.

Rooted trees: The above property makes it possible to select an arbitrary vertex in a free tree and consider it as the root of the so-called rooted tree.

#### What are Rooted Trees

**Ancestors**

For any vertex v in a tree T, all the vertices on the simple path from the root to that vertex are called ancestors.

**Descendants**

All the vertices for which a vertex v is an ancestor are said to be descendants of v.

**Parent, child, and siblings**

If (u, v) is the last edge of the simple path from the root to vertex v, u is said to be the parent of v, and v is called a child of u.

Vertices that have the same parent are called siblings.

**Leaves**

A vertex without children is called a leaf.

**Subtree**

A vertex v with all its descendants is called the subtree of T rooted at v.

**Depth of a vertex**

The length of the simple path from the root to the vertex.

**Height of a tree**

The length of the longest simple path from the root to a leaf.

#### What are Ordered trees?

An ordered tree is a rooted tree in which all the children of each vertex are ordered.

**Binary trees**

A binary tree is an ordered tree in which every vertex has no more than two children and each child is designated as either a left child or a right child of its parent.

**Binary search trees**

Each vertex is assigned a number.

A number assigned to each parental vertex is larger than all the numbers in its left subtree and smaller than all the numbers in its right subtree.

⌊log₂ n⌋ ≤ h ≤ n – 1, where h is the height of a binary tree and n its size.
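A minimal binary search tree sketch: keys smaller than a vertex go into its left subtree and larger keys into its right subtree, so a search follows one root-to-leaf path and costs O(h).

```python
class BSTNode:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    """Smaller keys go left, larger keys go right; duplicates are ignored."""
    if root is None:
        return BSTNode(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def search(root, key):
    """Follow one root-to-leaf path, so cost is proportional to the height h."""
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root is not None

root = None
for k in [8, 3, 10, 1, 6]:
    root = insert(root, k)
print(search(root, 6), search(root, 7))  # → True False
```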

#### What are the Issues and Approaches of Analysis of algorithms?

| Issues: | Approaches: |
|---|---|
| correctness | theoretical analysis |
| time efficiency | empirical analysis |
| space efficiency | - |
| optimality | - |

#### What is asymptotic notation?

#### Asymptotic order of growth

- O(g(n)): class of functions f(n) that grow no faster than g(n)
- Θ(g(n)): class of functions f(n) that grow at same rate as g(n)
- Ω(g(n)): class of functions f(n) that grow at least as fast as g(n)
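Assuming the standard textbook definitions, the three classes can be written formally as:

```latex
f(n) \in O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 \ge 0 :\ f(n) \le c\, g(n) \text{ for all } n \ge n_0
f(n) \in \Omega(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 \ge 0 :\ f(n) \ge c\, g(n) \text{ for all } n \ge n_0
f(n) \in \Theta(g(n)) \iff f(n) \in O(g(n)) \text{ and } f(n) \in \Omega(g(n))
```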

#### Plan for Analysis of Recursive Algorithms

- Decide on a parameter indicating an input’s size.
- Identify the algorithm’s basic operation.
- Check whether the number of times the basic op. is executed may vary on different inputs of the same size. (If it may, the worst, average, and best cases must be investigated separately.)
- Set up a recurrence relation with an appropriate initial condition expressing the number of times the basic op. is executed.
- Solve the recurrence (or, at the very least, establish its solution’s order of growth) by backward substitutions or another method.
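The plan above can be sketched on recursive factorial, with multiplication as the basic operation: the recurrence is M(n) = M(n−1) + 1 with M(0) = 0, which backward substitution solves to M(n) = n.

```python
def factorial_with_count(n):
    """Recursive factorial that also counts the basic operation
    (multiplication). Recurrence: M(n) = M(n-1) + 1, M(0) = 0 => M(n) = n."""
    if n == 0:
        return 1, 0                      # base case: no multiplications
    value, ops = factorial_with_count(n - 1)
    return n * value, ops + 1            # one multiplication per step

print(factorial_with_count(5))  # → (120, 5)
```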

#### Exhaustive Search

**Method:**

- generate a list of all potential solutions to the problem in a systematic manner.
- evaluate potential solutions one by one, disqualifying infeasible ones, and, for an optimization problem, keeping track of the best one found so far.
- when the search ends, announce the solution(s) found.

#### TSP by Exhaustive Search

**Tour**

**Cost**
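The exhaustive-search method applied to TSP can be sketched as follows: generate every tour as a permutation of cities, compute each tour’s cost from a distance matrix, and keep the best. The 4-city distance matrix is made-up example data; the search examines (n−1)! tours.

```python
from itertools import permutations

# Symmetric distance matrix for 4 cities (made-up example data).
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]

def tsp_brute_force(dist):
    """Exhaustive search: try every tour, return the cheapest one found."""
    n = len(dist)
    best_cost, best_tour = float("inf"), None
    # Fix city 0 as the start to avoid counting rotations of the same tour.
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        cost = sum(dist[tour[i]][tour[i + 1]] for i in range(n))
        if cost < best_cost:
            best_cost, best_tour = cost, tour
    return best_tour, best_cost

print(tsp_brute_force(dist))  # → ((0, 1, 3, 2, 0), 18)
```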

### Decrease-and-Conquer

- Reduce problem instance to smaller instance of the same problem
- Solve smaller instance
- Extend solution of smaller instance to obtain a solution to the original instance
- Can be implemented either top-down or bottom-up
- Also referred to as the inductive or incremental approach

**3 Types of Decrease and Conquer**

- Decrease by a constant:
  - insertion sort
  - graph traversal algorithms (DFS and BFS)
  - topological sorting
  - algorithms for generating permutations, subsets
- Decrease by a constant factor (usually by half):
  - binary search and bisection method
  - exponentiation by squaring
  - multiplication à la russe
- Variable-size decrease:
  - Euclid’s algorithm
  - selection by partition
  - Nim-like games

**Insertion Sort**

#### Analysis of Insertion Sort
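A sketch of insertion sort as a decrease-by-one algorithm: insert a[i] into the already-sorted prefix a[0..i−1]. The worst case (decreasing input) makes Θ(n²) comparisons, the best case (sorted input) Θ(n).

```python
def insertion_sort(a):
    """Decrease-by-one sorting: insert a[i] into the sorted prefix a[0..i-1].
    Worst case Theta(n^2) comparisons, best case Theta(n)."""
    for i in range(1, len(a)):
        v, j = a[i], i - 1
        while j >= 0 and a[j] > v:
            a[j + 1] = a[j]   # shift larger elements one position right
            j -= 1
        a[j + 1] = v
    return a

print(insertion_sort([89, 45, 68, 90, 29, 34, 17]))
# → [17, 29, 34, 45, 68, 89, 90]
```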

#### Graph Traversal

- Depth-first search (DFS)
- Breadth-first search (BFS)

#### Depth-First Search (DFS)

**Implemented recursively, or iteratively with an explicit stack**

- a vertex is pushed onto the stack when it’s reached for the first time
- a vertex is popped off the stack when it becomes a dead-end, i.e., when there is no adjacent unvisited vertex
- “Redraws” graph in tree-like fashion (with tree edges and back edges for undirected graph)

Applications:

- checking connectivity, finding connected components
- checking acyclicity (if no back edges)
- finding articulation points and biconnected components
- searching the state-space of problems for solutions (in AI)
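The stack-based description above can be sketched as an iterative DFS; the small undirected graph is made-up example data.

```python
def dfs(adj, start):
    """Iterative DFS with an explicit stack; returns vertices in visit order.
    adj maps each vertex to its adjacency list."""
    visited, order, stack = set(), [], [start]
    while stack:
        v = stack.pop()
        if v in visited:          # already reached via another path
            continue
        visited.add(v)
        order.append(v)
        # Push neighbors in reverse so the first-listed neighbor is visited first.
        for w in reversed(adj[v]):
            if w not in visited:
                stack.append(w)
    return order

# A small undirected graph (example data), given as adjacency lists.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
print(dfs(adj, 0))  # → [0, 1, 3, 2]
```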

#### Breadth-First Search (BFS)

#### Fake-Coin Puzzle

Decrease by factor 2 algorithm

Decrease by factor 3 algorithm (Q3 on page 187 of Levitin)
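The decrease-by-factor-2 idea can be sketched as follows: split the coins into two halves (setting one coin aside when n is odd), weigh the halves, and recurse into the lighter half; if the pans balance, the set-aside coin is the fake. This takes roughly log₂ n weighings. The code assumes the fake coin is lighter, and the coin weights below are made-up example data.

```python
def find_fake(coins):
    """Locate the single lighter (fake) coin by repeated halving.
    coins: list of (index, weight) pairs; exactly one coin is lighter."""
    group = coins
    while len(group) > 1:
        half = len(group) // 2
        left, right = group[:half], group[half:2 * half]
        spare = group[2 * half:]              # one coin set aside when n is odd
        weight_l = sum(w for _, w in left)    # one "weighing" per loop pass
        weight_r = sum(w for _, w in right)
        if weight_l < weight_r:
            group = left
        elif weight_r < weight_l:
            group = right
        else:
            group = spare                     # pans balance: spare coin is fake
    return group[0][0]

coins = [(i, 10) for i in range(12)]
coins[7] = (7, 9)                             # coin 7 is the lighter fake
print(find_fake(coins))  # → 7
```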

#### Variable-Size-Decrease Algorithms

In the variable-size-decrease variation of decrease-and-conquer, the instance size reduction varies from one iteration to another.

Examples:

- Euclid’s algorithm for greatest common divisor
- Partition-based algorithm for the selection problem
- Interpolation search
- Some algorithms on binary search trees
- Nim and Nim-like games
**Euclid’s Algorithm**

Euclid’s algorithm is based on repeated application of the equality gcd(m, n) = gcd(n, m mod n).

Ex.: gcd(80, 44) = gcd(44, 36) = gcd(36, 8) = gcd(8, 4) = gcd(4, 0) = 4.

One can prove that the size, measured by the first number, decreases at least by half after two consecutive iterations. Hence, T(n) ∈ O(log n).

**Algorithms for the Selection Problem**

The sorting-based algorithm: sort and return the kth element. Efficiency (if sorted by mergesort): Θ(n log n).

A faster algorithm is based on the quicksort-like partition of the list. Let s be the split position obtained by a partition (using some pivot), and assume the list is indexed from 1 to n:

- If s = k, the problem is solved;
- if s > k, look for the kth smallest element in the left part;
- if s < k, look for the (k−s)-th smallest element in the right part.

Note: The algorithm can simply continue until s = k.
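The partition-based selection algorithm can be sketched as quickselect. This version uses the Lomuto partition with the last element as pivot and works in place, so it keeps the absolute rank k (counted from 1) instead of re-indexing to k−s.

```python
def quickselect(a, k):
    """Return the kth smallest element of a (k counted from 1).
    Repeatedly partitions and continues into one side only;
    linear time on average."""
    lo, hi = 0, len(a) - 1
    while True:
        pivot = a[hi]                 # Lomuto partition: pivot = last element
        s = lo
        for i in range(lo, hi):
            if a[i] < pivot:
                a[s], a[i] = a[i], a[s]
                s += 1
        a[s], a[hi] = a[hi], a[s]     # pivot lands at split position s
        if s == k - 1:                # s is 0-indexed, k counts from 1
            return a[s]
        elif s > k - 1:
            hi = s - 1                # kth smallest is in the left part
        else:
            lo = s + 1                # kth smallest is in the right part

print(quickselect([4, 1, 10, 8, 7, 12, 9, 2, 15], 5))  # → 8
```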