Neural Network

Basic Neural Network architecture

![[CleanShot 2023-11-17 at 17.49.25@2x.png]]

  • Weights
  • Bias Weight
    • Needed to allow for general hyperplanes that do not necessarily pass through the origin.
    • In RNNs, a missing bias weight can be compensated for by one of the internal state neurons.
  • Activation Function
  • Backpropagation (see the sketch after this list)

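A minimal sketch of these pieces in NumPy (the toy data, initialization, and learning rate are illustrative assumptions, not from this note):

```python
import numpy as np

def sigmoid(z):
    # Activation function: squashes the pre-activation into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative toy data: one 2-d input and a scalar target (assumed).
x = np.array([0.5, -1.0])
y = 1.0

# Weights and bias weight (random init; the bias shifts the hyperplane
# so it need not pass through the origin).
rng = np.random.default_rng(0)
W = rng.normal(size=(2,))
b = 0.0

# Forward pass: affine map followed by the activation.
z = W @ x + b
y_hat = sigmoid(z)

# Backpropagation for the squared error L = (y_hat - y)^2 / 2:
# chain rule through the activation and the affine map.
dL_dz = (y_hat - y) * y_hat * (1.0 - y_hat)  # sigmoid'(z) = y_hat * (1 - y_hat)
dL_dW = dL_dz * x
dL_db = dL_dz

# One gradient-descent step (learning rate is an assumption).
lr = 0.1
W -= lr * dL_dW
b -= lr * dL_db
```
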
A neural network can always be expanded by concatenating multiple layers, in contrast to a Taylor expansion, where complexity is increased through additive terms.
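
In symbols, depth grows capacity by composing layers, while a Taylor expansion grows it by adding terms (σ denotes the activation function):

$$ f(x) = f_L\bigl(f_{L-1}(\cdots f_1(x) \cdots)\bigr), \qquad f_\ell(h) = \sigma(W_\ell h + b_\ell) $$

$$ f(x) \approx \sum_{k=0}^{K} \frac{f^{(k)}(a)}{k!}\,(x - a)^k $$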

Universal Approximation Theorem

Existence theorem: 3-layer neural nets (i.e., with one hidden layer) can approximate any continuous function on a compact domain to arbitrary accuracy.

For example:
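
A single-hidden-layer tanh network can be fit to sin(x) on the compact interval [-π, π]; the target function, hidden width, and training loop below are illustrative assumptions, not from this note:

```python
import numpy as np

rng = np.random.default_rng(0)

# Compact domain and a continuous target (illustrative choice).
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
Y = np.sin(X)

# One hidden layer of tanh units (width is an assumption).
H = 32
W1 = rng.normal(scale=0.5, size=(1, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.5, size=(H, 1)); b2 = np.zeros(1)

lr = 0.05
for step in range(5000):
    # Forward pass: hidden tanh layer, then a linear output layer.
    A = np.tanh(X @ W1 + b1)
    Y_hat = A @ W2 + b2

    # Backpropagation of the mean-squared error.
    dY = 2.0 * (Y_hat - Y) / len(X)
    dW2 = A.T @ dY;  db2 = dY.sum(axis=0)
    dA = dY @ W2.T
    dZ = dA * (1.0 - A**2)  # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dZ;  db1 = dZ.sum(axis=0)

    # Gradient-descent updates.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final MSE:", float(np.mean((Y_hat - Y) ** 2)))
```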

Specialized Networks