
Network size in a neural network

Data Science: Asked by user3159445 on June 25, 2021

What are the limitations of having too many hidden units in a neural network?
Does it take more memory, or does it take longer to train the model?

2 Answers

The number of hidden units is a hyperparameter that controls model capacity. Increasing it therefore not only raises the time and memory requirements of training but can also lead to overfitting (see also section 11.4.1 of the Deep Learning Book).
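To make the memory/compute cost concrete, here is a minimal sketch that counts the parameters of a fully connected network as the hidden layer widens (the layer sizes below are illustrative, not from the question):

```python
def mlp_param_count(layer_sizes):
    """Total weights + biases for a fully connected network.

    layer_sizes: e.g. [784, 256, 10] for one hidden layer of 256 units.
    """
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += n_in * n_out + n_out  # weight matrix + bias vector
    return total

# Doubling the hidden layer roughly doubles the parameter count,
# and hence the memory footprint and per-step compute:
for hidden in (128, 256, 512):
    print(hidden, mlp_param_count([784, hidden, 10]))
```

Since both memory and the work done per gradient step scale with the parameter count, this is exactly why wider (or deeper) models cost more to train.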

Answered by Sammy on June 25, 2021

As you add hidden layers to a neural network, the number of weights associated with the nodes also grows. More weights increase the memory requirements of your model, and since there are more of them to optimize, training takes longer as well.
As layers are added, model accuracy can also start to degrade due to overfitting.

ResNet is a great example: it adds skip (shortcut) connections in extremely deep networks, which keeps them trainable and achieves great performance.
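The skip connection mentioned above can be sketched in a few lines of NumPy. This is a simplified illustration of the idea F(x) + x, not ResNet's actual convolutional block, and all the shapes and weight scales here are made up for the example:

```python
import numpy as np

def residual_block(x, w1, w2):
    """A minimal residual (skip-connection) block: ReLU(F(x) + x).

    The identity shortcut lets the signal (and its gradient) bypass
    the weight layers, which is what makes very deep nets trainable.
    """
    h = np.maximum(0.0, x @ w1)          # first weight layer + ReLU
    return np.maximum(0.0, h @ w2 + x)   # add the input back, then ReLU

rng = np.random.default_rng(0)
x = rng.standard_normal(8)
w1 = rng.standard_normal((8, 8)) * 0.01  # small weights for illustration
w2 = rng.standard_normal((8, 8)) * 0.01
out = residual_block(x, w1, w2)
# With near-zero weights the block behaves almost like the identity,
# so stacking many of them does not destroy the signal:
print(np.allclose(out, np.maximum(0.0, x), atol=0.02))
```

This near-identity behavior at initialization is the intuition for why residual networks can be made far deeper than plain stacks of layers.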

Answered by Shiv on June 25, 2021
