Adam Optimizer for Neural Networks with a 0.02 learning rate and 1e-5 decay
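
The update rule can be sketched as follows. Adam keeps exponential moving averages of the gradient (first moment `m`) and the squared gradient (second moment `v`), corrects both for initialization bias, and scales the step by the square root of the second moment. The title's "1e-5 decay" is ambiguous; this sketch assumes L2-style weight decay folded into the gradient (a decoupled, AdamW-style variant would decay the weights directly instead). The function name `adam_step` and the quadratic toy objective are illustrative choices, not part of the original text.

```python
import math

def adam_step(theta, grad, m, v, t, lr=0.02, beta1=0.9, beta2=0.999,
              eps=1e-8, weight_decay=1e-5):
    """One Adam update for a scalar parameter; t is the 1-based step count."""
    # Assumption: "decay" means L2 weight decay added to the gradient.
    g = grad + weight_decay * theta
    # Exponential moving averages of the gradient and squared gradient.
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g * g
    # Bias correction: early averages are biased toward zero.
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Parameter step, scaled by the adaptive per-parameter denominator.
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(theta) = theta^2, whose gradient is 2 * theta.
theta, m, v = 5.0, 0.0, 0.0
for t in range(1, 501):
    theta, m, v = adam_step(theta, 2.0 * theta, m, v, t)
```

Because the bias-corrected ratio `m_hat / sqrt(v_hat)` is close to ±1 when gradients are consistent, each step moves the parameter by roughly the learning rate (0.02), so 500 steps are enough to bring the toy parameter from 5.0 to near zero.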