What is the correct terminology for neural network architectures that expand in their internal layers? (The opposite of a "bottleneck")

Asked on Data Science by Rehno Lindeque on May 15, 2021

I am aware that an auto-encoder contracts in its hidden layers to form a bottleneck.
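For concreteness, this is the kind of contracting shape I mean, as a minimal PyTorch sketch (the layer widths are just illustrative):

```python
import torch.nn as nn

# Hypothetical sizes for illustration: a 128-dim input squeezed to a 16-dim code.
bottleneck_autoencoder = nn.Sequential(
    nn.Linear(128, 64), nn.ReLU(),
    nn.Linear(64, 16),            # bottleneck: the narrowest layer
    nn.ReLU(),
    nn.Linear(16, 64), nn.ReLU(),
    nn.Linear(64, 128),           # expand back out to reconstruct the input
)
```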

In contrast to this, is there a good name for the kind of cell, block, or architecture that expands in its internal layer(s) and then contracts in its output/interface layer(s)?

For example, Deep Equilibrium Models have a residual cell that expands in its internal layer and then contracts at its interface.
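To illustrate the shape I have in mind, here is a minimal PyTorch sketch. The class name, widths, and expansion factor are my own invention for illustration, not taken from the Deep Equilibrium Models paper:

```python
import torch
import torch.nn as nn

class ExpandContractCell(nn.Module):
    """Residual cell whose hidden layer is wider than its interface.

    d_model is the interface width; the interior runs at expansion * d_model.
    """
    def __init__(self, d_model=64, expansion=4):
        super().__init__()
        self.expand = nn.Linear(d_model, expansion * d_model)    # widen internally
        self.contract = nn.Linear(expansion * d_model, d_model)  # narrow back at the interface
        self.act = nn.ReLU()

    def forward(self, x):
        # Expand, apply the nonlinearity, contract, then add the residual connection.
        return x + self.contract(self.act(self.expand(x)))

# Usage: the interface width stays 64 while the interior is 256 wide.
cell = ExpandContractCell()
y = cell(torch.randn(8, 64))  # y has shape (8, 64)
```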
