# Algorithms

The conventional wisdom about when to prefer a linked list over contiguous storage hasn't held for years: you have to measure. When the working set fits in cache, a vector can outperform a linked list even for insertion, because shifting contiguous elements is cheaper than chasing pointers.
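
A hedged sketch of the comparison (the function names here are illustrative, not from the source): inserting into a sorted `std::vector` pays an O(n) shift, but both the search and the shift are cache-friendly, while the `std::list` pays a cache miss per node just to find the insertion point.

```cpp
#include <algorithm>
#include <list>
#include <vector>

// Insert `value` in sorted position. The vector pays O(n) element moves but
// touches memory sequentially; the list pays O(n) pointer chases to find
// the spot, then a cheap O(1) splice.
void insert_sorted(std::vector<int>& v, int value) {
    v.insert(std::upper_bound(v.begin(), v.end(), value), value);
}

void insert_sorted(std::list<int>& l, int value) {
    auto it = l.begin();
    while (it != l.end() && *it <= value) ++it;
    l.insert(it, value);
}
```

Profile both with your real data sizes before choosing; the crossover point depends on element size and cache behaviour.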

# Sorting

See sorting

# CppCon 2018: Jonathan Boccara - 105 STL Algorithms in Less Than an Hour

https://www.youtube.com/watch?v=2olsGf6JIkU

## Secret runes

- `stable_` -- preserves the relative order of equivalent elements (e.g. `stable_sort`, `stable_partition`)
- `is_` -- turns an algorithm into a predicate check (e.g. `is_sorted`, `is_partitioned`)
- `is_*_until` -- returns the first position where the property stops holding (e.g. `is_sorted_until`)
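
A small illustration of the `is_*_until` rune (the helper name is mine): `std::is_sorted_until` returns an iterator to the first element that breaks the sorted order, so `std::is_sorted(b, e)` is equivalent to `std::is_sorted_until(b, e) == e`.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Length of the longest sorted prefix of `v`.
std::size_t sorted_prefix_length(const std::vector<int>& v) {
    return std::is_sorted_until(v.begin(), v.end()) - v.begin();
}
```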

# Algorithm patterns

## Brute Force

## Divide and Conquer

- Karatsuba's integer multiplication -- large numbers can be multiplied in asymptotically fewer digit operations than the O(n^2) of the usual brute-force "long multiplication", as discovered by Karatsuba (Karatsuba and Ofman, 1962).
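
A minimal sketch of the idea on machine integers (real implementations work on big-number digit arrays; this only shows the three-multiplication recurrence). Writing x = a*B + b and y = c*B + d, Karatsuba computes x*y = ac*B^2 + ((a+b)(c+d) - ac - bd)*B + bd, using three recursive multiplies instead of four.

```cpp
#include <cstdint>

// Karatsuba multiplication, splitting each operand around a power of ten.
std::uint64_t karatsuba(std::uint64_t x, std::uint64_t y) {
    if (x < 10 || y < 10) return x * y;            // base case: single digit
    // Count digits of the larger operand and split roughly in half.
    std::uint64_t big = x > y ? x : y;
    int digits = 0;
    for (std::uint64_t t = big; t > 0; t /= 10) ++digits;
    std::uint64_t B = 1;
    for (int i = 0; i < digits / 2; ++i) B *= 10;
    std::uint64_t a = x / B, b = x % B;
    std::uint64_t c = y / B, d = y % B;
    std::uint64_t ac = karatsuba(a, c);
    std::uint64_t bd = karatsuba(b, d);
    std::uint64_t mid = karatsuba(a + b, c + d) - ac - bd;  // only 3 multiplies
    return ac * B * B + mid * B + bd;
}
```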

## Decrease and Conquer

## The Greedy Method

## Dynamic Programming

## Backtracking

## Branch and Bound

"Branch and bound (BB, B&B, or BnB) is an algorithm design paradigm for discrete and combinatorial optimization problems."
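
As a concrete sketch (the problem choice and names are mine, not from the source): a depth-first branch and bound for the 0/1 knapsack, pruning any branch whose optimistic bound (the fractional relaxation) cannot beat the best solution found so far.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

struct Item { int weight; int value; };

// Optimistic bound: take remaining items greedily by value density,
// allowing the last one to be taken fractionally (LP relaxation).
static double bound(const std::vector<Item>& items, std::size_t i,
                    int capacity, int value) {
    double b = value;
    for (; i < items.size() && capacity > 0; ++i) {
        if (items[i].weight <= capacity) {
            capacity -= items[i].weight;
            b += items[i].value;
        } else {
            b += items[i].value * static_cast<double>(capacity) / items[i].weight;
            break;
        }
    }
    return b;
}

static void search(const std::vector<Item>& items, std::size_t i,
                   int capacity, int value, int& best) {
    if (value > best) best = value;
    if (i == items.size()) return;
    if (bound(items, i, capacity, value) <= best) return;   // prune: cannot improve
    if (items[i].weight <= capacity)                        // branch: take item i
        search(items, i + 1, capacity - items[i].weight,
               value + items[i].value, best);
    search(items, i + 1, capacity, value, best);            // branch: skip item i
}

// Sort by value/weight descending so the bound is tight, then search.
int knapsack(std::vector<Item> items, int capacity) {
    std::sort(items.begin(), items.end(), [](const Item& a, const Item& b) {
        return static_cast<long long>(a.value) * b.weight >
               static_cast<long long>(b.value) * a.weight;
    });
    int best = 0;
    search(items, 0, capacity, 0, best);
    return best;
}
```

The bound never underestimates what a branch can achieve, so pruning when it cannot beat the incumbent is safe.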

## Hill Climbing

## Particle Swarm Optimisation

## Las Vegas

## Monte Carlo

## Reduction (Transformation)

## Preprocessing

## Gradient descent

"Gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function."

See algorithm patterns.

# Big-O notation

Big-O notation describes how an algorithm's running time (or space) grows as the input size grows.

Be prepared to write code. Remember your merge sort, quick sort, binary search, etc, and be able to write them cold.
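
For instance, a from-scratch binary search, the kind of routine the note above suggests being able to write cold:

```cpp
#include <vector>

// Return the index of `target` in sorted `v`, or -1 if absent. O(log n).
int binary_search_index(const std::vector<int>& v, int target) {
    int lo = 0, hi = static_cast<int>(v.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   // avoids overflow vs (lo + hi) / 2
        if (v[mid] == target) return mid;
        if (v[mid] < target) lo = mid + 1;
        else hi = mid - 1;
    }
    return -1;
}
```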

| Complexity | Name |
|---|---|
| O(1) | constant |
| O(log n) | logarithmic |
| O(n) | linear |
| O(n log n) = O(log n!) | linearithmic |
| O(n^2) | quadratic |
| O(n^c) | polynomial |
| O(c^n) | exponential |
| O(n!) | factorial |

See the Big-O cheat sheet and time complexity

# Self-balancing binary search trees

A balanced tree is one of height O(log n), where n is the number of nodes in the tree.

# Memory access

Big-O analysis assumes all memory accesses cost the same, which is neither a physically reasonable assumption as n scales to infinity nor how real computers work in practice. Computers contain several kinds of memory (cache levels, main memory, virtual memory, etc.) in different and limited quantities. Modern operating systems and hardware place data across these tiers automatically, so access times vary widely as more and more memory is used.
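
A hedged demonstration of the effect: summing a row-major matrix row-by-row touches memory sequentially (cache-friendly), while summing it column-by-column strides across cache lines. Both functions compute the same sum; on typical hardware the row-major walk is measurably faster, but benchmark on your own machine.

```cpp
#include <cstddef>
#include <vector>

// Same arithmetic, different access order over a row-major n x n matrix.
long long sum_row_major(const std::vector<int>& m, std::size_t n) {
    long long s = 0;
    for (std::size_t i = 0; i < n; ++i)
        for (std::size_t j = 0; j < n; ++j)
            s += m[i * n + j];   // sequential: consecutive addresses
    return s;
}

long long sum_col_major(const std::vector<int>& m, std::size_t n) {
    long long s = 0;
    for (std::size_t j = 0; j < n; ++j)
        for (std::size_t i = 0; i < n; ++i)
            s += m[i * n + j];   // strided: jumps n ints between accesses
    return s;
}
```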