Sine Activation

We can approximate the tanh function with the non-monotonic sine function: both behave like the identity near zero (sin(x) ≈ x ≈ tanh(x)), so they agree closely for small inputs.

![[CleanShot 2023-11-18 at 22.57.43@2x.png]]
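A quick numerical check of how closely sine tracks tanh for small inputs (a minimal sketch using only the standard library):

```python
import math

# Near zero, sin(x) = x - x^3/6 + ... and tanh(x) = x - x^3/3 + ...,
# so both are approximately the identity and stay close for small |x|.
for x in [0.0, 0.1, 0.25, 0.5]:
    print(f"x={x:.2f}  sin={math.sin(x):.4f}  tanh={math.tanh(x):.4f}")
```

For larger |x| the two diverge: tanh saturates at ±1 while sine keeps oscillating.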

Using the product-to-sum identity sin(a)·sin(b) = ½·(cos(a − b) − cos(a + b)), we can rewrite products of sine functions as sums, which is convenient because sums are cheaper to compute than products.
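The product-to-sum identity can be verified directly (a minimal sketch; the function name `product_via_sum` is just illustrative):

```python
import math

def product_via_sum(a, b):
    # sin(a) * sin(b) = 0.5 * (cos(a - b) - cos(a + b))
    return 0.5 * (math.cos(a - b) - math.cos(a + b))

a, b = 0.7, 1.3
print(product_via_sum(a, b))      # via the sum identity
print(math.sin(a) * math.sin(b))  # direct product; same value
```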

Also, sine partly avoids vanishing gradients in backpropagation: its derivative cos(x) oscillates between −1 and 1 instead of saturating, whereas the derivative of tanh, 1 − tanh²(x), shrinks toward zero for large |x|.
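A small comparison of the two derivatives illustrates this (a sketch using closed-form derivatives rather than autodiff):

```python
import math

# tanh'(x) = 1 - tanh(x)^2 decays toward 0 for large |x| (saturation),
# while sin'(x) = cos(x) keeps oscillating between -1 and 1.
for x in [1.0, 3.0, 6.0]:
    d_tanh = 1.0 - math.tanh(x) ** 2
    d_sin = math.cos(x)
    print(f"x={x}: tanh'={d_tanh:.6f}  sin'={d_sin:.3f}")
```

Note that cos(x) still passes through zero periodically, so sine only *partly* avoids the problem: gradients do not uniformly vanish for large inputs, but they can vanish at particular points.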