Adam Optimizer for Neural Networks with 0.05 learning rate and 5e-7 decay
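The title names Adam with a 0.05 learning rate and a 5e-7 decay but gives no implementation. Below is a minimal from-scratch NumPy sketch of Adam using those hyperparameters, interpreting "decay" as a 1/t learning-rate decay applied per iteration. The class name, the `update` method, and the per-parameter moment bookkeeping are illustrative assumptions, not taken from the original.

```python
import numpy as np

class Adam:
    """Sketch of the Adam optimizer with 1/t learning-rate decay.
    Hyperparameter defaults match the title: lr=0.05, decay=5e-7.
    (Illustrative assumption: class/method names are not from the source.)"""

    def __init__(self, learning_rate=0.05, decay=5e-7,
                 epsilon=1e-7, beta_1=0.9, beta_2=0.999):
        self.learning_rate = learning_rate
        self.current_learning_rate = learning_rate
        self.decay = decay
        self.iterations = 0
        self.epsilon = epsilon
        self.beta_1 = beta_1
        self.beta_2 = beta_2
        self.m = {}  # first-moment (mean) estimates per parameter
        self.v = {}  # second-moment (uncentered variance) estimates

    def update(self, name, param, grad):
        # Decay the learning rate as 1 / (1 + decay * t)
        if self.decay:
            self.current_learning_rate = self.learning_rate * (
                1.0 / (1.0 + self.decay * self.iterations))

        if name not in self.m:
            self.m[name] = np.zeros_like(param)
            self.v[name] = np.zeros_like(param)

        # Update biased moment estimates
        self.m[name] = self.beta_1 * self.m[name] + (1 - self.beta_1) * grad
        self.v[name] = self.beta_2 * self.v[name] + (1 - self.beta_2) * grad ** 2

        # Bias-correct the moments (iterations is 0-based here)
        m_hat = self.m[name] / (1 - self.beta_1 ** (self.iterations + 1))
        v_hat = self.v[name] / (1 - self.beta_2 ** (self.iterations + 1))

        # Parameter step
        param = param - self.current_learning_rate * m_hat / (
            np.sqrt(v_hat) + self.epsilon)

        self.iterations += 1
        return param
```

As a quick sanity check, driving a scalar parameter toward the minimum of f(w) = w² (gradient 2w) should pull it close to zero, while the 5e-7 decay only nudges the effective learning rate slightly below 0.05 over a few hundred steps.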