algorithms [Docs]

Algorithms (Types)

An algorithm is a step-by-step procedure that defines a set of instructions to be executed in a certain order to obtain the desired output.


Recursive: calls itself repeatedly on smaller inputs until the problem is solved.

Greedy: makes the locally optimal choice at each step. Greedy algorithms are quite successful in some problems but problematic in others: for example, when searching a tree for the path with the largest sum, always taking the largest child can miss a better path hidden behind a smaller node.
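A small sketch of the greedy idea in Python (coin change is my own illustrative example, not from these notes): take the largest coin that still fits at each step. It works for "canonical" coin systems but can fail for others, which shows why locally optimal choices are not always globally optimal.

```python
def greedy_change(amount, coins):
    """Greedy coin change: always take the biggest coin that fits."""
    result = []
    for coin in sorted(coins, reverse=True):
        while amount >= coin:
            amount -= coin
            result.append(coin)
    return result

print(greedy_change(63, [25, 10, 5, 1]))  # [25, 25, 10, 1, 1, 1] - optimal here
# Failure case: for 6 with coins [4, 3, 1], greedy returns [4, 1, 1],
# but the optimal answer is [3, 3].
print(greedy_change(6, [4, 3, 1]))
```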

Divide and conquer: breaks a complex problem into smaller problems that are easier to solve, then combines their answers to solve the original problem. Ex: mergesort, quicksort, calculating Fibonacci numbers...

Dynamic programming: these algorithms work by remembering the results of past runs and using them to find new results, making the algorithm more efficient by storing intermediate results (e.g. Fibonacci: 1, 1, 2, 3, 5, 8...). Dynamic programming is a widely used concept, often applied to optimization. It simplifies a complicated problem by breaking it down into simpler sub-problems in a recursive manner, usually with a bottom-up approach. A problem must have two key attributes for dynamic programming to be applicable: optimal substructure and overlapping sub-problems. To achieve its optimization, dynamic programming uses a concept called memoization.
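The Fibonacci example above can be sketched both ways in Python: top-down with memoization (each sub-problem cached and computed once) and bottom-up (building from the base cases), turning the naive O(2^n) recursion into O(n).

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # memoization: cache each fib(k) result
def fib(n):
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

def fib_bottom_up(n):
    """Bottom-up variant: iterate from the base cases upward."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib(10))            # 55
print(fib_bottom_up(10))  # 55
```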

Brute force: a brute-force algorithm blindly iterates over all possible solutions to find one or more that solve the problem.
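A minimal brute-force sketch in Python (the pair-sum problem and function name are my own, for illustration): try every pair of elements until one sums to the target, which is O(n²).

```python
def find_pair_with_sum(numbers, target):
    """Check every pair of indices: brute force, O(n^2)."""
    for i in range(len(numbers)):
        for j in range(i + 1, len(numbers)):
            if numbers[i] + numbers[j] == target:
                return numbers[i], numbers[j]
    return None  # no pair found

print(find_pair_with_sum([3, 8, 5, 1], 9))  # (8, 1)
```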

Backtracking: solves a sub-problem, and if that fails to lead to a solution, it undoes the last step and tries the next alternative.
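A backtracking sketch in Python (subset-sum is my own illustrative example): extend a partial solution one choice at a time, and when it cannot lead to an answer, undo the last choice and try the other branch.

```python
def subset_sum(numbers, target, partial=None):
    """Find a subset of `numbers` summing to `target`, or None."""
    partial = partial or []
    current = sum(partial)
    if current == target:
        return partial            # solved
    if current > target or not numbers:
        return None               # dead end: backtrack
    head, rest = numbers[0], numbers[1:]
    # Choice 1: include head; if that branch fails, choice 2: skip it.
    return (subset_sum(rest, target, partial + [head])
            or subset_sum(rest, target, partial))

print(subset_sum([3, 9, 8, 4], 12))  # [3, 9]
```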

Big O

Notation that allows us to represent the complexity of an algorithm as n -> infinity.


O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2^n) < O(n!)


log2 n = x means 2^x = n (x is the number of times you must multiply by 2 to reach n)

  • O(1)
    public function isFirstElementNull(array $elements)
    {
        return $elements[0] === null;
    }
  • O(log2 n)

Binary search

n (length)   log2 n
1            0
2            1
4            2
8            3
16           4

How many times must we divide our original array size (n) in half until we get down to 1?

n * (1/2) * (1/2) * ... = 1 =>

n * (1/2)^x = 1 =>

n / 2^x = 1 =>

n = 2^x =>

x = log2 n
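The halving above is exactly what binary search does; a runnable Python sketch (the notes' other snippets are PHP-like, so this is just one possible rendering):

```python
def binary_search(sorted_items, target):
    """Each comparison halves the interval: at most log2(n)+1 steps."""
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            low = mid + 1   # discard the lower half
        else:
            high = mid - 1  # discard the upper half
    return -1               # not found

print(binary_search([1, 4, 8, 16, 23, 42], 16))  # 3
```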

  • O(n): a single for loop
  • O(n²): a for loop nested inside another for loop
  • O(2^n)

    function fibonacci(int $number)
    {
        if ($number <= 1) return $number;
        return fibonacci($number - 2) + fibonacci($number - 1);
    }
  • O(n!)

Data structure

A way to store data; a way of collecting and organizing data so that we can perform operations on it effectively. Types: array, hash table (key-value pairs with a hash function and a collision-handling method), linked list, graph, tree, queue, stack.
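A minimal Python sketch of the hash-table idea mentioned above, using chaining as the collision-handling method (class and method names are my own, not a standard API):

```python
class ChainedHashTable:
    """Hash table where each bucket is a list of (key, value) pairs."""

    def __init__(self, buckets=8):
        self.buckets = [[] for _ in range(buckets)]

    def _bucket(self, key):
        # Hash function maps the key to one of the buckets.
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:                 # existing key: overwrite
                bucket[i] = (key, value)
                return
        bucket.append((key, value))      # new key (or collision): chain it

    def get(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        raise KeyError(key)

table = ChainedHashTable()
table.put("answer", 42)
print(table.get("answer"))  # 42
```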

Searching techniques (Array)

  • Linear search: O(n)
  • Binary search: For a sorted list => O(log2 n)

Array sorting techniques (selection, insertion, merge, quick, bubble, radix)

  • Selection sort

    Select the smallest and put it at index 0.

    Select the next smallest and put it at index 1, and so on.


for (every element i) {
    for (find the smallest among the remaining) {
         if (smaller) => swap
    }
}
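The selection-sort pseudocode above as a runnable Python sketch: on pass i, find the smallest remaining element and swap it into position i (always O(n²) comparisons).

```python
def selection_sort(items):
    for i in range(len(items)):
        smallest = i
        for j in range(i + 1, len(items)):
            if items[j] < items[smallest]:
                smallest = j     # remember the smallest remaining
        items[i], items[smallest] = items[smallest], items[i]  # swap it into place
    return items

print(selection_sort([5, 2, 8, 1]))  # [1, 2, 5, 8]
```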
  • Insertion sort


for (all the elements)
   for (search where to insert) insert
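The insertion-sort pseudocode above as a runnable Python sketch: take each element and shift it left until it sits in the right place within the already-sorted prefix (O(n²) worst case, O(n) on nearly-sorted input).

```python
def insertion_sort(items):
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        while j >= 0 and items[j] > current:
            items[j + 1] = items[j]  # shift larger elements right
            j -= 1
        items[j + 1] = current       # insert into its slot
    return items

print(insertion_sort([5, 2, 8, 1]))  # [1, 2, 5, 8]
```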


  • Merge Sort

Split until only leaves remain, then merge while sorting.

function merge(left, right) { /* combine two sorted lists into one */ }
function mergeSort(list) {
	if (leaf) {
		return list
	} else {
		return merge(mergeSort(1st half), mergeSort(2nd half))
	}
}


Dividing into halves => log2 n levels

Merging at each level => c*n => O(n)

O(n log2 n)
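The merge-sort pseudocode above as a runnable Python sketch: log2(n) levels of splitting, O(n) merge work per level, hence O(n log n).

```python
def merge(left, right):
    """Combine two sorted lists into one sorted list."""
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]  # append whichever half remains

def merge_sort(items):
    if len(items) <= 1:       # leaf: already sorted
        return items
    mid = len(items) // 2
    return merge(merge_sort(items[:mid]), merge_sort(items[mid:]))

print(merge_sort([5, 2, 8, 1, 9]))  # [1, 2, 5, 8, 9]
```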

  • QuickSort


function quickSort(list) {
	if (list empty)
		return list
	else {
		choose pivot
		return concat(quickSort(list < pivot), pivot, quickSort(list > pivot));
	}
}

Best case: quicksort has its best-case running time when the pivot at each recursive call equals the median of the subarray. The problem size is then halved at each step, so there are O(log n) levels of recursive calls, and each level takes O(n) time for the partition step, giving a total best-case run time of O(n log n).

Worst case: T(n) = T(n-1)+O(n) => O(n²)
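The quicksort pseudocode above as a runnable Python sketch, using the first element as the pivot (simple, not in-place): average case O(n log n), worst case O(n²) when the pivot is always the smallest or largest element, e.g. on an already-sorted list.

```python
def quick_sort(items):
    if len(items) <= 1:
        return items
    pivot, rest = items[0], items[1:]
    smaller = [x for x in rest if x < pivot]   # partition: below the pivot
    larger = [x for x in rest if x >= pivot]   # partition: at or above it
    return quick_sort(smaller) + [pivot] + quick_sort(larger)

print(quick_sort([5, 2, 8, 1, 9]))  # [1, 2, 5, 8, 9]
```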



DS tables Big O


Trees (Order, types)

  • Orders


Pre-order: root-left-right

Post-order: left-right-root

In-order: left-root-right
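The three orders above as runnable Python sketches (the minimal Node class is my own, for illustration):

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def preorder(node):   # root, left, right
    return [] if node is None else [node.value] + preorder(node.left) + preorder(node.right)

def inorder(node):    # left, root, right
    return [] if node is None else inorder(node.left) + [node.value] + inorder(node.right)

def postorder(node):  # left, right, root
    return [] if node is None else postorder(node.left) + postorder(node.right) + [node.value]

#     2
#    / \
#   1   3
root = Node(2, Node(1), Node(3))
print(preorder(root))   # [2, 1, 3]
print(inorder(root))    # [1, 2, 3]
print(postorder(root))  # [1, 3, 2]
```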

Depth First Traversal and Breadth First Traversal



Binary search tree


  • Binary heap: a complete binary tree that satisfies the heap ordering property:
    • min heap: each node >= its parent
    • max heap: each node <= its parent
  • Height-balanced tree (self-balancing binary search tree)
  • AVL tree: self-balancing binary search tree where the heights of the two child subtrees of any node differ by at most 1

Recursion

All recursive algorithms can be implemented iteratively.
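One way to see the recursive-to-iterative claim in Python: a recursive preorder traversal rewritten as a loop with an explicit stack (the minimal Node class is my own, for illustration).

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def preorder_iterative(root):
    """Preorder (root-left-right) without recursion: manage the stack by hand."""
    result, stack = [], [root]
    while stack:
        node = stack.pop()
        if node is None:
            continue
        result.append(node.value)
        stack.append(node.right)  # pushed first so the left child is visited first
        stack.append(node.left)
    return result

root = Node(2, Node(1), Node(3))
print(preorder_iterative(root))  # [2, 1, 3]
```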
algorithms.txt · Last modified: 2020/07/02 16:06 (external edit)