---
type: mixed
---

[Divide and Conquer](Divide%20and%20Conquer.md)

## Loops do not exist in Haskell

So we have to use recursion! The general shape of a recursive definition is:

```haskell
funcName args = ... funcName args' ...
```

where `args'` is the augmented args passed to the recursive call.

Using if-then-else:

```haskell
factorial :: Int -> Int
factorial n = if n == 0 then 1 else n * factorial (n - 1)
```

## Guards

Moving on from the `factorial` example:

```haskell
factorial n
  | n == 0    = 1
  | otherwise = n * factorial (n - 1) -- Catch-all
```

Note the indentation and the pipes (`|`). We can add any number of conditions, unlike a single if-then-else.

## Pattern matching

Again with the `factorial` example. `_` is a wildcard (it ignores the value). Note how we are "re-defining" the function, one equation per case:

```haskell
factorial :: Int -> Int
factorial 0 = 1                      -- Base case: when n is 0
factorial n = n * factorial (n - 1)  -- Recursive call with n-1
```

## Accumulators

An accumulator is a variable that **accumulates** or **stores** a running total or result during the execution of a function, especially in loops or recursive functions. In Haskell it is an extra parameter, usually threaded through a helper function.

```haskell
factorial :: Int -> Int
factorial n = factorialHelper n 1
  where
    factorialHelper 0 acc = acc                                -- Base case: when n is 0, return the accumulator
    factorialHelper n acc = factorialHelper (n - 1) (n * acc)  -- Multiply n by the accumulator and recurse
```

In this example, by using an accumulator and tail recursion[^1] we achieve $\mathcal{O}(n)$ time complexity[^2]. We should **always strive for tail-recursive algorithms**, as normal recursion *can* cause stack overflow.

## Function composition

In Haskell, the **composition operator** is `(.)`. It allows us to compose two functions into a new function. The operator is defined as:

```haskell
(f . g) x = f (g x)
```

For example, given two functions:

```haskell
increment :: Int -> Int
increment x = x + 1

square :: Int -> Int
square x = x * x
```

we can combine them like so:

```haskell
incrementThenSquare :: Int -> Int
incrementThenSquare = square . increment
```

---

[^1]: In tail recursion, the recursive call is the ***last operation*** in the function. This means that once a recursive call is made, there is no need to retain the current function's state or stack frame.

[^2]: Computational limits still exist! Although the time complexity is $\mathcal{O}(n)$ on paper, asymptotic notation hides constant factors, so the actual running time on real hardware can still be slow.
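
As a quick sanity check, here is a minimal `main`, assuming the definitions above (the accumulator-based `factorial`, `increment`, `square`, and `incrementThenSquare`) live in the same module. The comments trace the tail-recursive calls by hand; GHC's actual evaluation order may differ.

```haskell
main :: IO ()
main = do
  -- With the accumulator version, factorial 4 unwinds as:
  --   factorialHelper 4 1
  --   factorialHelper 3 4
  --   factorialHelper 2 12
  --   factorialHelper 1 24
  --   factorialHelper 0 24   -- base case: return the accumulator
  print (factorial 4)             -- 24
  -- (square . increment) 3 = square (increment 3) = square 4 = 16
  print (incrementThenSquare 3)   -- 16
```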