
machine learning - Why is the bias necessary in an ANN? Should we have a separate bias for each layer?

I want to build a model that predicts the future response of an input signal. The architecture of my network is [3, 5, 1]:

  • 3 inputs,
  • 5 neurons in the hidden layer, and
  • 1 neuron in the output layer.

My questions are:

  1. Should we have a separate bias for each hidden and output layer?
  2. Should we assign a weight to the bias at each layer (since the bias adds an extra value to the network and could overburden it)?
  3. Why is the bias always set to one? If eta can take different values, why don't we set the bias to different values?
  4. Why do we always use the log-sigmoid function as the nonlinearity? Can we use tanh?

1 Reply


So, I think it'd clear most of this up if we were to step back and discuss the role the bias unit is meant to play in a NN.

A bias unit is meant to allow units in your net to learn an appropriate threshold (i.e., start sending a positive activation only after the total input reaches a certain level). Without a bias, a positive total input always means a positive activation, so every unit's threshold is stuck at zero.

For example, if your bias unit has a weight of -2 into some neuron x, then neuron x will produce a positive activation only if all its other input adds up to more than 2.
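
As a concrete check of that arithmetic, here is a minimal NumPy sketch (the two input weights of 1.0 are assumed values, not anything from the question): a neuron with a bias weight of -2 crosses the sigmoid's midpoint exactly when its other inputs sum to more than 2.

    import numpy as np

    def sigmoid(z):
        """Log-sigmoid activation: squashes any real input into (0, 1)."""
        return 1.0 / (1.0 + np.exp(-z))

    w = np.array([1.0, 1.0])   # weights on the regular inputs (assumed values)
    b = -2.0                   # weight on the bias unit, i.e. the learned threshold

    for inputs in (np.array([0.5, 0.5]), np.array([1.5, 1.5])):
        total = w @ inputs + b * 1.0   # the bias unit always emits 1
        print(inputs.sum(), sigmoid(total))
    # Inputs summing to 1.0 give activation ~0.27 (below the 0.5 midpoint);
    # inputs summing to 3.0 give activation ~0.73 (above it).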

So, with that as background, your answers:

  1. No, one bias input is sufficient, since it can affect different neurons differently depending on its weight to each unit.
  2. Generally speaking, having a bias weight going into every non-input unit is a good idea, since otherwise units without a bias weight would have thresholds permanently fixed at zero (see the sketch after this list).
  3. Because the threshold, once learned, should be consistent across trials. Remember: the bias represents how each unit interacts with the input; it isn't an input itself, so the bias unit's output stays fixed at 1 and only its weights are learned.
  4. You certainly can, and many people do. Any squashing function generally works as an activation function.
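
To tie the four answers together, here is a hedged sketch of the asker's [3, 5, 1] network: a single bias unit fixed at 1 feeds every hidden and output neuron through its own learned weight, and tanh stands in for the log-sigmoid as the squashing function. The random weights are placeholders for trained values, not anything from the question.

    import numpy as np

    rng = np.random.default_rng(0)

    # [3, 5, 1] network: one bias unit (answer 1) with a separate learned
    # weight into every non-input neuron (answer 2).
    W1 = rng.normal(size=(5, 3))   # input -> hidden weights
    b1 = rng.normal(size=5)        # hidden neurons' weights on the bias unit
    W2 = rng.normal(size=(1, 5))   # hidden -> output weights
    b2 = rng.normal(size=1)        # output neuron's weight on the bias unit

    def forward(x, squash=np.tanh):
        """Forward pass; any squashing function will do (answer 4)."""
        h = squash(W1 @ x + b1 * 1.0)   # the bias unit's output stays 1 (answer 3)
        return squash(W2 @ h + b2 * 1.0)

    print(forward(np.array([0.1, -0.4, 0.7])))

Passing the log-sigmoid from the first sketch as `squash` changes only the output range ((0, 1) rather than (-1, 1)); the role of the bias weights is unchanged.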
