Channel: User darkcanuck - Stack Overflow

Comment by darkcanuck on Can you help me with linear activation of my Simple...

I would recommend both. For regression you can also use sigmoid/tanh activation in the hidden layer, since it adds non-linearity to the function approximation. For the output layer I would normally pick...
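
To make the suggestion concrete, here is a minimal sketch (my own illustration, not code from the answer) of a one-hidden-layer regression network with tanh hidden units and a linear output node; the weights are assumed to be already trained:

    // Minimal sketch (assumed layout, not from the answer): tanh hidden
    // layer for non-linearity, linear output for unbounded regression targets.
    public class RegressionNet {
        private final double[][] hiddenWeights; // [hidden][inputs+1], last entry = bias
        private final double[] outputWeights;   // [hidden+1], last entry = bias

        public RegressionNet(double[][] hiddenWeights, double[] outputWeights) {
            this.hiddenWeights = hiddenWeights;
            this.outputWeights = outputWeights;
        }

        public double predict(double[] input) {
            int nHidden = hiddenWeights.length;
            double output = outputWeights[nHidden]; // output bias term
            for (int h = 0; h < nHidden; h++) {
                double sum = hiddenWeights[h][input.length]; // hidden bias term
                for (int i = 0; i < input.length; i++) {
                    sum += hiddenWeights[h][i] * input[i];
                }
                output += outputWeights[h] * Math.tanh(sum); // non-linear hidden unit
            }
            return output; // linear output: no squashing, suitable for regression
        }
    }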

Answer by darkcanuck for Connect 4 with neural network: evaluation of draft +...

I've implemented neural networks before, and see a few problems with your proposed architecture: A typical multi-layer network has connections from every input node to every hidden node, and from every...
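
As a sketch of the connectivity described (my own illustration; the layer sizes are assumptions, e.g. 42 inputs for a 6x7 Connect 4 board), every (input, hidden) pair gets its own weight:

    import java.util.Random;

    // Dense (fully-connected) layer: weights[h][i] links input node i to
    // hidden node h, so every input feeds every hidden node.
    public class DenseLayer {
        final double[][] weights;

        DenseLayer(int inputs, int hidden, long seed) {
            Random rng = new Random(seed);
            weights = new double[hidden][inputs];
            for (int h = 0; h < hidden; h++)
                for (int i = 0; i < inputs; i++)
                    weights[h][i] = rng.nextGaussian() * 0.1; // small random init
        }
    }

For example, new DenseLayer(42, 20, 1L) would give a 6x7 board feeding 20 hidden nodes, with 42 x 20 = 840 weights.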

Answer by darkcanuck for Neural Network Always Produces Same/Similar Outputs...

Based on your comments, I'd agree with @finnw that you have a bias problem. You should treat the bias as a constant "1" (or -1 if you prefer) input to each neuron. Each neuron will also have its own...
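
A minimal sketch of that bias handling (my own code, assuming a standard weighted-sum neuron with a sigmoid activation): the bias is just one more input, fixed at 1, with its own trainable weight:

    // The last weight is the bias weight; its "input" is the constant 1.
    public class Neuron {
        final double[] weights; // one per input, plus one for the bias

        Neuron(double[] weights) { this.weights = weights; }

        double activate(double[] inputs) {
            double sum = weights[inputs.length]; // bias weight * constant "1" input
            for (int i = 0; i < inputs.length; i++)
                sum += weights[i] * inputs[i];
            return 1.0 / (1.0 + Math.exp(-sum)); // sigmoid activation
        }
    }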

Answer by darkcanuck for java simple neural network setup

It looks like a good starting point. I do have a few suggestions: For scalability, fire() should be restructured so that a neuron that's already fired with the current input set doesn't have to...
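
One way to structure that caching (a sketch under my own assumptions; the original question's fire() and neuron classes aren't shown here) is to tag each forward pass with an id and recompute only when the id changes:

    // Each neuron remembers the id of the input set it last fired for, so
    // repeated fire() calls in one forward pass return the cached output.
    public abstract class CachingNeuron {
        private int lastInputSetId = -1;
        private double cachedOutput;

        public double fire(int inputSetId) {
            if (inputSetId != lastInputSetId) {
                cachedOutput = compute(inputSetId); // expensive recursive step
                lastInputSetId = inputSetId;
            }
            return cachedOutput;
        }

        protected abstract double compute(int inputSetId);
    }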

Answer by darkcanuck for TD(λ) in Delphi/Pascal (Temporal Difference Learning)

If you're serious about making this work, then understanding TD-lambda would be very helpful. Sutton and Barto's book, "Reinforcement Learning", is available for free in HTML format and covers this...
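
For reference, here is the core TD(λ) update from Sutton and Barto, sketched in tabular form (my own illustration; the alpha, gamma, and lambda settings below are arbitrary). An eligibility trace spreads each temporal-difference error back over recently visited states:

    // One TD(lambda) step with accumulating eligibility traces:
    //   delta = r + gamma*V(s') - V(s)
    //   e(s) += 1; then for all states: V += alpha*delta*e, e *= gamma*lambda
    public class TdLambda {
        final double[] values;  // V(s), one entry per state
        final double[] traces;  // e(s), eligibility of each state
        final double alpha = 0.1, gamma = 0.9, lambda = 0.8; // assumed settings

        TdLambda(int nStates) {
            values = new double[nStates];
            traces = new double[nStates];
        }

        void step(int state, double reward, int nextState) {
            double delta = reward + gamma * values[nextState] - values[state];
            traces[state] += 1.0; // mark the visited state as eligible
            for (int s = 0; s < values.length; s++) {
                values[s] += alpha * delta * traces[s]; // credit eligible states
                traces[s] *= gamma * lambda;            // decay eligibility
            }
        }
    }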
