Regression Demo with rectified linear (ReLU) activation function
Example of fitting a neural network to a sine wave, using two fully connected hidden layers of 64 neurons each with the rectified linear (ReLU) activation function.
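A minimal sketch of the setup described above, assuming NumPy and plain full-batch gradient descent (the demo's actual training code and hyperparameters may differ): a 1-64-64-1 network with ReLU hidden layers and a linear output, trained with mean-squared error to fit y = sin(x).

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: one period of a sine wave.
X = np.linspace(0, 2 * np.pi, 256).reshape(-1, 1)
y = np.sin(X)

# He-style initialization, which suits ReLU layers.
W1 = rng.normal(0, np.sqrt(2 / 1), (1, 64));   b1 = np.zeros((1, 64))
W2 = rng.normal(0, np.sqrt(2 / 64), (64, 64)); b2 = np.zeros((1, 64))
W3 = rng.normal(0, np.sqrt(2 / 64), (64, 1));  b3 = np.zeros((1, 1))

lr = 0.01  # assumed learning rate, not from the original demo
for epoch in range(5000):
    # Forward pass: two ReLU hidden layers, linear output.
    z1 = X @ W1 + b1;  a1 = np.maximum(0, z1)
    z2 = a1 @ W2 + b2; a2 = np.maximum(0, z2)
    out = a2 @ W3 + b3

    # Gradient of mean-squared-error loss w.r.t. the output.
    d_out = 2 * (out - y) / len(X)

    # Backward pass; (z > 0) is the derivative of ReLU.
    dW3 = a2.T @ d_out;  db3 = d_out.sum(axis=0, keepdims=True)
    d_a2 = (d_out @ W3.T) * (z2 > 0)
    dW2 = a1.T @ d_a2;   db2 = d_a2.sum(axis=0, keepdims=True)
    d_a1 = (d_a2 @ W2.T) * (z1 > 0)
    dW1 = X.T @ d_a1;    db1 = d_a1.sum(axis=0, keepdims=True)

    # Gradient-descent parameter update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    W3 -= lr * dW3; b3 -= lr * db3

mse = np.mean((out - y) ** 2)
print(f"final MSE: {mse:.4f}")
```

Because ReLU units are piecewise linear, the fitted curve approximates the sine wave as many small linear segments; a wider or deeper network yields a smoother fit.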