
Scikit-learn lists these as the implemented activation functions for its multi-layer perceptron classifier:

  • ‘identity’, no-op activation, useful to implement linear bottleneck, returns f(x) = x.
  • ‘logistic’, the logistic sigmoid function, returns f(x) = 1 / (1 + exp(-x)).
  • ‘tanh’, the hyperbolic tan function, returns f(x) = tanh(x).
  • ‘relu’, the rectified linear unit function, returns f(x) = max(0, x).

Does anyone know if it is possible to implement a custom activation function? If not, can someone point me to a library where this is possible?

pennydreams
    sklearn wasn't made specifically for this task, so it is missing features like this. I recommend PyTorch instead; it's the latest and greatest in the field, and it's simple. Here's a relevant discussion. Welcome to the site and good luck. – Emre Apr 27 '17 at 22:53
  • Thanks so much Emre! I just got PyTorch up and running and am fiddling with the forward function right now. – pennydreams Apr 28 '17 at 17:47
  • @pennydreams did you ever get to completing the function and how did you implement it? I am trying something similar and could use some pointers! – dsforlife84 Nov 06 '17 at 13:40
  • @dsforlife84 As Emre stated, it doesn't seem to be possible to implement a custom activation function in scikit-learn's MLPClassifier. I'd look at TensorFlow or PyTorch for implementing neural nets with custom activation functions. – pennydreams Nov 07 '17 at 14:42

1 Answer


Although @Emre's comment that sklearn wasn't specifically made for the task is correct, nothing stops you from introducing another activation function by implementing it in sklearn/neural_network/_base.py. You should be able to find the file in your installation.

To this end, you would define your single-parameter function and add it to the ACTIVATIONS dictionary (along with its derivative in the DERIVATIVES dictionary, which is needed for backpropagation). Refer to the existing implementations for guidance on how to do it.
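As a rough sketch of the approach described above, here is what such a function pair might look like, written in the same in-place style as the built-in activations in sklearn/neural_network/_base.py. The leaky-ReLU choice and all names here are illustrative, not part of scikit-learn; also note that recent scikit-learn releases validate the `activation` string, so patching the dictionaries at runtime may not be enough and editing `_base.py` directly, as suggested above, may be required.

```python
import numpy as np

# Hypothetical custom activation: leaky ReLU, f(x) = max(0.01*x, x),
# written in the in-place style used by sklearn/neural_network/_base.py.
def inplace_leaky_relu(X):
    """Apply the leaky ReLU to X in place and return it."""
    np.maximum(X * 0.01, X, out=X)
    return X

def inplace_leaky_relu_derivative(Z, delta):
    """Scale delta in place by f'(Z): 1 where Z > 0, else 0.01.

    Z is the *output* of the activation, matching how sklearn's
    inplace_relu_derivative is called during backpropagation.
    """
    delta[Z <= 0] *= 0.01

# Registration sketch (assumes your sklearn version exposes these dicts):
# from sklearn.neural_network._base import ACTIVATIONS, DERIVATIVES
# ACTIVATIONS["leaky_relu"] = inplace_leaky_relu
# DERIVATIVES["leaky_relu"] = inplace_leaky_relu_derivative
```

With that in place (and the parameter validation updated accordingly), you could in principle construct `MLPClassifier(activation="leaky_relu")`, but this relies on private internals and may break between scikit-learn versions.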

mapto