Thank you very much, Nyx and Reedbeta!

I used the sigmoid function for my problem, but the output signals are all in the range (0.49, 0.50). I feel they are not good. Can I interpret outputs < 0.5 as 0 and outputs > 0.5 as 1? If I do that, the error is large.

Can you help me?

You can use the sigmoid function, but remember that its output is asymptotic: the sigmoid "goes to 0" at minus infinity and "goes to 1" at infinity, but never actually reaches those values. Training your network to reach them can drive the weights to very large values, which does not help convergence.

If you use the sigmoid function, you should probably train your network toward targets in a restricted range like [0.25, 0.75], the low end representing 0 and the high end representing 1, and then interpret outputs under 0.5 as 0 and outputs over 0.5 as 1.
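A minimal sketch of that scheme in Python (the 0.25/0.75 targets and the 0.5 threshold are the values suggested above; the helper names are my own, just for illustration):

```python
import math

def sigmoid(x):
    """Standard logistic sigmoid: maps any real x into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Train toward these instead of 0/1, so the network never needs
# huge weights to chase the sigmoid's asymptotes.
TARGET_LOW, TARGET_HIGH = 0.25, 0.75

def target_for(bit):
    """Training target for a binary label."""
    return TARGET_HIGH if bit else TARGET_LOW

def interpret(output):
    """Threshold at the midpoint: below 0.5 means 0, otherwise 1."""
    return 1 if output >= 0.5 else 0

print(target_for(1))    # 0.75
print(interpret(0.49))  # 0 -- an output like yours decodes cleanly
print(interpret(0.51))  # 1
```

The point is that the loss only asks the network to reach 0.25 or 0.75, which a sigmoid can hit with moderate pre-activation values, while decoding still uses the simple 0.5 threshold.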

What do you mean by "result is not good"? Sigmoid and similar functions are the standard choice for multilayer feed-forward networks.

I am a freshman with NNs. I am now working on backpropagation. I have analogue inputs and binary outputs. I don't know what activation function to use for backpropagation with a single hidden layer.

Please, help me.

I programmed backpropagation based on the errors. If I use the sigmoid function, the result is not good, and I can't use a step function.

Thanks for your help.