sin(x)

Upgrade for ReLU: The sin(x) activation function offers a smooth alternative.

Stacked across the layers of a neural network, the ReLU activation function builds complex nonlinear functions that are piecewise linear: all flat faces and sharp edges. But how much of the world breaks down into perfect polyhedra?
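To make the contrast concrete, here is a minimal sketch in NumPy (our illustration, not code from the article): a tiny randomly weighted MLP evaluated once with ReLU and once with sin as the activation. All sizes and names below are hypothetical choices for the demo. A piecewise-linear function has zero second difference except at the kinks where linear pieces meet, so measuring how often the second difference vanishes shows the flat-faces picture directly.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny 1-D MLP with two hidden layers and a pluggable activation.
# Weights are random; the point is the shape of the function each
# activation family produces, not a trained model.
W1, b1 = rng.normal(size=(16, 1)), rng.normal(size=16)
W2, b2 = rng.normal(size=(16, 16)), rng.normal(size=16)
w3, b3 = rng.normal(size=16), rng.normal()

def mlp(x, act):
    """Evaluate the network on a batch of scalar inputs x."""
    h1 = act(W1 @ x[None, :] + b1[:, None])
    h2 = act(W2 @ h1 + b2[:, None])
    return w3 @ h2 + b3

x = np.linspace(-3.0, 3.0, 1000)
relu_out = mlp(x, lambda z: np.maximum(z, 0.0))  # flat faces, sharp edges
sin_out = mlp(x, np.sin)                         # smooth everywhere

def flat_fraction(y):
    # Second differences are ~0 on linear segments of the graph and
    # jump only at the kinks where two linear pieces meet.
    return np.mean(np.abs(np.diff(y, 2)) < 1e-8)

print(f"ReLU net: {flat_fraction(relu_out):.1%} of samples on flat segments")
print(f"sin net:  {flat_fraction(sin_out):.1%} of samples on flat segments")
```

Run as-is, the ReLU network should report that essentially all samples lie on flat segments, while the sin network should report essentially none: the same nonlinear capacity, but smooth curves instead of polyhedral facets.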
