Which, if any, of the following propositions is true about fully-connected neural networks (FCNNs)?
- In a FCNN, there are connections between neurons of the same layer.
- In a FCNN, the most common weight initialization scheme is zero initialization, because it leads to faster and more robust training.
- A FCNN with only linear activations is a linear network.
- None of the above
Answer: A FCNN with only linear activations is a linear network. Each layer computes an affine map, and a composition of affine maps is itself a single affine map, so no matter how many layers are stacked, the network can only represent linear (affine) functions of its input. The other options are false: fully-connected layers connect each neuron only to neurons in adjacent layers, not within the same layer, and zero initialization is avoided because it makes all neurons in a layer compute identical gradients, preventing them from learning distinct features.
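The collapse of stacked linear layers into a single affine map can be checked numerically. The sketch below uses arbitrary layer sizes chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two fully-connected layers with identity (linear) activation.
# Sizes (3 -> 4 -> 2) are arbitrary, chosen for illustration.
W1 = rng.standard_normal((4, 3))
b1 = rng.standard_normal(4)
W2 = rng.standard_normal((2, 4))
b2 = rng.standard_normal(2)

x = rng.standard_normal(3)

# Forward pass through the two-layer "network".
h = W1 @ x + b1
y = W2 @ h + b2

# The same function as one affine layer: y = (W2 W1) x + (W2 b1 + b2).
W = W2 @ W1
b = W2 @ b1 + b2
y_single = W @ x + b

assert np.allclose(y, y_single)
```

Because the two computations agree for every input, adding linear layers adds no expressive power; nonlinear activations (e.g. ReLU, sigmoid) are what let deep networks represent nonlinear functions.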