import numpy as np

def relu_activation(outputs):
    # ReLU: zero out negative values, pass positives through unchanged.
    return np.maximum(0, outputs)

def relu_derivative_activation(outputs):
    # Derivative of ReLU: 1 where the input is positive, 0 elsewhere.
    return np.where(outputs > 0, 1, 0)

def sigmoid_activation(outputs):
    # Sigmoid: squash values into the open interval (0, 1).
    return 1 / (1 + np.exp(-outputs))

def sigmoid_derivative_activation(outputs):
    # Derivative of sigmoid in terms of its output: expects
    # `outputs` to already be sigmoid(x), i.e. returns s * (1 - s).
    return outputs * (1 - outputs)
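A quick smoke test of the functions above (the sample array is illustrative and not part of the original file). Note that sigmoid_derivative_activation takes the sigmoid's output, not the raw input:

x = np.array([-2.0, 0.0, 3.0])
print(relu_activation(x))               # [0. 0. 3.]
print(relu_derivative_activation(x))    # [0 0 1]
s = sigmoid_activation(x)
print(sigmoid_derivative_activation(s)) # elementwise s * (1 - s)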