algorithms

https://github.com/donnemartin/interactive-coding-challenges

An algorithm is a step-by-step procedure that defines a set of instructions to be executed in a certain order to get the desired output.

**Recursive** calls itself repeatedly until the problem is solved.
**Greedy algorithm**
The algorithm makes the optimal choice at each step.
Greedy algorithms are quite successful in some problems but problematic in others. For example, a greedy algorithm that seeks the path with the largest sum in a tree by always taking the larger child at each step can miss the path with the largest total sum.
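As a sketch in Python (function name and US coin denominations are illustrative): greedy coin change makes the locally best pick at every step. It is optimal for these coins, but the second call below shows how greedy can fail for other denominations.

```python
def greedy_change(amount, coins=(25, 10, 5, 1)):
    """Greedy: always take the largest coin that still fits."""
    result = []
    for coin in coins:  # coins assumed sorted in descending order
        while amount >= coin:
            amount -= coin
            result.append(coin)
    return result

# greedy_change(6, (4, 3, 1)) returns [4, 1, 1] (3 coins),
# missing the optimal [3, 3] (2 coins) — the greedy pitfall above.
```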

**Divide and conquer** is a way to break complex problems into smaller problems that are easier to solve, and then combine the answers to solve the original problem. Ex: mergesort, quicksort, calculating Fibonacci numbers...

**Dynamic programming:**
These algorithms work by remembering the results of past runs and using them to find new results.
They make an algorithm more efficient by storing intermediate results, e.g. the Fibonacci sequence: 1, 1, 2, 3, 5, 8...
Dynamic programming is a widely used concept, and it is often used for optimization. It refers to simplifying a complicated problem by breaking it down into simpler sub-problems in a recursive manner, usually with a bottom-up approach. A problem must have two key attributes for dynamic programming to be applicable: "optimal substructure" and "overlapping sub-problems". To achieve its optimization, dynamic programming uses a technique called memoization.
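A minimal Python sketch of memoization applied to Fibonacci (illustrative names): each sub-problem is computed once and cached, turning the exponential naive recursion into linear time.

```python
def fib(n, memo=None):
    """Fibonacci with memoization: each sub-problem is solved only once."""
    if memo is None:
        memo = {}
    if n <= 1:
        return n
    if n not in memo:                      # overlapping sub-problems: reuse cached result
        memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    return memo[n]
```

Without the cache, `fib(50)` would take minutes; with it, it is instant.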

**Brute force:** A brute force algorithm blindly iterates over all possible solutions, searching for one or more solutions to the problem.

**Backtracking:** solves one sub-problem at a time; if a partial solution fails, it undoes the last step and tries a different option until a solution to the whole problem is found.
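A small backtracking sketch in Python (illustrative: subset-sum): extend a partial solution one choice at a time, and pop the last choice off when it leads to a dead end.

```python
def subset_sum(nums, target, chosen=None, i=0):
    """Backtracking: extend a partial solution; undo the last step on failure."""
    if chosen is None:
        chosen = []
    if target == 0:
        return list(chosen)                 # partial solution is a full solution
    if i == len(nums) or target < 0:
        return None                         # dead end
    chosen.append(nums[i])                  # try taking nums[i]
    found = subset_sum(nums, target - nums[i], chosen, i + 1)
    if found is not None:
        return found
    chosen.pop()                            # undo the last step (backtrack)
    return subset_sum(nums, target, chosen, i + 1)  # try skipping nums[i]
```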

**Big O notation:** notation that allows us to represent the complexity of an algorithm as n -> infinity.

O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2^n) < O(n!)

Logarithms:

log2 n = x means 2^x = n (x is how many times you multiply by 2 to reach n).

- O(1)

```php
public function isFirstElementNull(array $elements) {
    return $elements[0] === null;
}
```

- O(log2 n)

Binary search

n (length) | log2 n
---|---
1 | 0
2 | 1
4 | 2
8 | 3
16 | 4

How many times must we divide our original array size (n) in half until we get down to 1?

n * 1/2 * 1/2 * ... = 1 =>

n * (1/2)^x = 1 =>

n / 2^x = 1 =>

n = 2^x =>

x = log2 n
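Binary search in Python (an illustrative sketch): each comparison halves the remaining range, giving the log2 n behaviour derived above.

```python
def binary_search(sorted_list, target):
    """Return the index of target in sorted_list, or -1. O(log2 n) comparisons."""
    low, high = 0, len(sorted_list) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_list[mid] == target:
            return mid
        if sorted_list[mid] < target:       # discard the left half
            low = mid + 1
        else:                               # discard the right half
            high = mid - 1
    return -1
```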

- O(n): a single for loop
- O(n²): a for loop nested inside another for loop
- O(2^n): e.g. the naive recursive Fibonacci

```php
function fibonacci(int $number) {
    if ($number <= 1) return $number;
    return fibonacci($number - 2) + fibonacci($number - 1);
}
```

- O(n!): e.g. generating every permutation of a list

**Data structures:** A way to store data; a way of collecting and organizing data so that we can perform operations on it efficiently. Types: array, hash table (key-value pairs with a hash function and a collision-handling method), linked list, graph, tree, queue, stack.

- Linear search: O(n)
- Binary search: For a sorted list => O(log2 n)

- Selection sort

Select the smallest element and put it at index 0.

Select the next smallest and put it at index 1, and so on.

*Pseudocode*

```
for (every element i) {
    for (find the smallest in the rest) {
        if (smaller) => swap
    }
}
```

O(n²)
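The pseudocode above can be sketched in Python (an illustrative translation):

```python
def selection_sort(items):
    """Repeatedly select the smallest remaining element and swap it into place."""
    items = list(items)
    for i in range(len(items)):
        smallest = i
        for j in range(i + 1, len(items)):  # find the smallest in the rest
            if items[j] < items[smallest]:
                smallest = j
        items[i], items[smallest] = items[smallest], items[i]
    return items
```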

- Insertion sort

```
for (all the elements) { for (search where to insert) { insert } }
```

O(n²)
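An illustrative Python version of the same idea: walk each element backwards to find where it belongs, shifting larger elements right.

```python
def insertion_sort(items):
    """For each element, search backwards for where to insert it."""
    items = list(items)
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        while j >= 0 and items[j] > current:
            items[j + 1] = items[j]   # shift the larger element right
            j -= 1
        items[j + 1] = current        # insert in its sorted position
    return items
```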

- Merge Sort

Split until only leaves remain, then merge the halves back in sorted order.

```
function merge(a, b) { ... }

function mergeSort(list) {
    if (leaf) { return list }
    else { return merge(mergeSort(1st half), mergeSort(2nd half)) }
}
```

Dividing into halves => log2 n levels

Merging at each level => c*n => O(n)

O(n log2 n)
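The pseudocode can be filled in as a Python sketch (illustrative names): `merge` combines two sorted halves in O(n), and `merge_sort` recurses down to single-element leaves.

```python
def merge(left, right):
    """Combine two already-sorted lists into one sorted list: O(n)."""
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]    # append whichever half remains

def merge_sort(items):
    if len(items) <= 1:                     # a leaf: already sorted
        return list(items)
    mid = len(items) // 2
    return merge(merge_sort(items[:mid]), merge_sort(items[mid:]))
```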

- QuickSort

```
function quickSort(list) {
    if (list is empty) { return list }
    else {
        choose pivot
        return concat(quickSort(list < pivot), [pivot], quickSort(list > pivot))
    }
}
```

Best case: quicksort has its best-case running time when the pivot at each recursive call is equal to the median element of the subarray. This means that at each step the problem size is halved, giving log n levels of nested calls. Each level takes O(n) time (from the partition step), so the total best-case run time is O(n log n).

Worst case: T(n) = T(n-1)+O(n) => O(n²)
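A Python sketch of the pseudocode (illustrative; picks the first element as pivot, which is exactly the choice that hits the T(n) = T(n-1) + O(n) worst case on already-sorted input):

```python
def quick_sort(items):
    """Partition around a pivot, then sort each side recursively."""
    if len(items) <= 1:
        return list(items)
    pivot, rest = items[0], items[1:]       # first-element pivot (simple, not robust)
    smaller = [x for x in rest if x < pivot]
    larger = [x for x in rest if x >= pivot]
    return quick_sort(smaller) + [pivot] + quick_sort(larger)
```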

- Radix Sort


O(kn), where k is the number of digits in the largest key.
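A least-significant-digit radix sort for non-negative integers, sketched in Python (illustrative; assumes base-10 keys): k stable bucket passes, one per digit.

```python
def radix_sort(nums):
    """LSD radix sort: k passes of a stable 10-bucket distribution => O(kn)."""
    if not nums:
        return []
    digits = len(str(max(nums)))            # k = digits in the largest key
    for d in range(digits):
        buckets = [[] for _ in range(10)]
        for n in nums:
            buckets[(n // 10 ** d) % 10].append(n)   # stable: keeps earlier order
        nums = [n for bucket in buckets for n in bucket]
    return nums
```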

- Traversal orders

Pre-order: root-left-right

Post-order: left-right-root

In-order: left-root-right
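The three orders can be sketched in Python (illustrative `Node` class):

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def pre_order(node):    # root-left-right
    if node is None:
        return []
    return [node.value] + pre_order(node.left) + pre_order(node.right)

def post_order(node):   # left-right-root
    if node is None:
        return []
    return post_order(node.left) + post_order(node.right) + [node.value]

def in_order(node):     # left-root-right (yields sorted order on a BST)
    if node is None:
        return []
    return in_order(node.left) + [node.value] + in_order(node.right)
```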

Depth-First Traversal and Breadth-First Traversal

Binary search tree

- AVL tree: Self-balancing binary search tree where the heights of the two child subtrees of any node differ by at most 1.

**Recursion**

Every recursive algorithm can be implemented iteratively.
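As a small illustration of that last point (illustrative Python): the same factorial, written recursively and as an equivalent loop.

```python
def factorial_recursive(n):
    """Recursive: the call stack carries the pending multiplications."""
    return 1 if n <= 1 else n * factorial_recursive(n - 1)

def factorial_iterative(n):
    """Iterative: an explicit accumulator replaces the call stack."""
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result
```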

algorithms.txt · Last modified: 2020/07/02 16:06 (external edit)

Except where otherwise noted, content on this wiki is licensed under the following license: CC Attribution-Share Alike 4.0 International