Comment by darkcanuck on Can you help me with linear activation of my Simple Classifier Neural Network in pyBrain?

I would recommend both. For regression you can also use a sigmoid/tanh activation in the hidden layer, since it adds non-linearity to the function approximation. For the output layer I would normally pick a linear activation for regression, and a squashing function (e.g. sigmoid or softmax) for classification.
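As a minimal sketch of that advice, here is how the two setups might look with pyBrain's buildNetwork shortcut (the layer sizes are arbitrary placeholders, and the exact layer class names assume a standard pyBrain install):

from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import TanhLayer, LinearLayer, SoftmaxLayer

# Regression: non-linear (tanh) hidden layer, linear output layer.
regression_net = buildNetwork(2, 5, 1,
                              hiddenclass=TanhLayer,
                              outclass=LinearLayer)

# Classification: same non-linear hidden layer, squashing (softmax) output
# layer so the outputs behave like class scores/probabilities.
classifier_net = buildNetwork(2, 5, 3,
                              hiddenclass=TanhLayer,
                              outclass=SoftmaxLayer)

# activate() runs a forward pass on a single input vector.
print(regression_net.activate([0.5, -0.2]))
print(classifier_net.activate([0.5, -0.2]))

The only difference between the two networks is the output layer class; the non-linear hidden layer is shared by both.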
