Are bias weights needed in the output layer?

I am implementing a neural network composed of an LSTM and one dense output layer.
I use bias weights for the output layer, but I don't know whether they are really needed.
The LSTM already has its own bias weights. What is the main reason to use bias weights in the output layer?



Additionally, I'm using the 'sampled_softmax_loss' function, so the output layer is just a projection layer; there is no activation function such as ReLU.
I tested both models, with and without the bias, and the results were almost the same.
So I want to know whether a bias is really effective at the final projection layer in general.
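
For reference, here is a minimal sketch of the two variants I compared (assuming TensorFlow 1.x; the names `proj_w`, `proj_b`, `lstm_output` and the sizes are placeholders for my actual setup):

```python
import tensorflow as tf

vocab_size = 10000   # assumed number of output classes
hidden_size = 512    # assumed LSTM output size
num_sampled = 64     # number of sampled negatives

# Projection (output) layer parameters.
proj_w = tf.get_variable("proj_w", [vocab_size, hidden_size])
proj_b = tf.get_variable("proj_b", [vocab_size],
                         initializer=tf.zeros_initializer())

def training_loss(lstm_output, labels, use_bias=True):
    # lstm_output: [batch_size, hidden_size], labels: [batch_size, 1] int64.
    # sampled_softmax_loss always expects a bias tensor; passing a constant
    # zero vector is one way to run the "without bias" variant.
    biases = proj_b if use_bias else tf.zeros([vocab_size])
    return tf.reduce_mean(
        tf.nn.sampled_softmax_loss(
            weights=proj_w,
            biases=biases,
            labels=labels,
            inputs=lstm_output,
            num_sampled=num_sampled,
            num_classes=vocab_size))

def inference_predictions(lstm_output, use_bias=True):
    # At inference time, compute the full logits and take the argmax.
    logits = tf.matmul(lstm_output, proj_w, transpose_b=True)
    if use_bias:
        logits = logits + proj_b
    return tf.argmax(logits, axis=-1)
```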



Thanks in advance.





Possible duplicate of Role of Bias in Neural Networks
– Willy satrio nugroho, Jun 26 at 7:12





How else do you plan on going from the last layer to the output layer? You need to do a dot product with the weights to get the output. Are you planning on just summing the second-to-last layer values?
– kamykam, Jun 26 at 9:24





I am using 'sampled_softmax_loss'. Actually, the output layer I mentioned is the projection layer inside 'sampled_softmax_loss'. I use 'sampled_softmax_loss' during the training phase and 'argmax' at the inference phase. This is a typical classification problem. Thanks.
– William Anderson, Jul 1 at 4:56









