
Role of mode in the bottleneck function of a U-Net network

Data Science: Asked by s326280 on December 8, 2020

I came across a script written by someone on Kaggle. The snippet is given below.

    from tensorflow.keras.layers import Conv2D, add

    def bottleneck(x, filters_bottleneck, mode='cascade', depth=6,
                   kernel_size=(3, 3), activation='relu'):
        dilated_layers = []
        if mode == 'cascade':
            # Chain dilated convolutions: each layer feeds the next,
            # doubling the dilation rate at every step (1, 2, 4, ...).
            for i in range(depth):
                x = Conv2D(filters_bottleneck, kernel_size, activation=activation,
                           padding='same', dilation_rate=2**i)(x)
                dilated_layers.append(x)
            return add(dilated_layers)
        elif mode == 'parallel':
            # Apply every dilated convolution to the same input independently,
            # then sum the branches. Note the return sits outside the loop;
            # otherwise only the first branch would ever be built.
            for i in range(depth):
                dilated_layers.append(
                    Conv2D(filters_bottleneck, kernel_size, activation=activation,
                           padding='same', dilation_rate=2**i)(x))
            return add(dilated_layers)
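
For context, the function could be called like this (a minimal sketch; the tensorflow.keras imports, input shape, and filter count are my own illustrative values, not from the original script):

    from tensorflow.keras.layers import Input
    from tensorflow.keras.models import Model

    # Hypothetical feature map arriving from the contracting path.
    inputs = Input(shape=(256, 256, 64))
    outputs = bottleneck(inputs, filters_bottleneck=128, mode='parallel', depth=6)
    model = Model(inputs, outputs)
    model.summary()  # shows the 6 dilated Conv2D branches and the final add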

To understand what the function bottleneck does, a pictorial explanation is given below. The highlighted portion is the bottleneck: the part of the network between the contracting and expanding paths.

[Figure: U-Net architecture diagram, with the bottleneck between the contracting and expanding paths highlighted]

The function bottleneck accepts a parameter mode, and I am confused by it. Is mode part of the standard vocabulary of deep learning? If so, can you point me to resources that explain it? The same question applies to the 'cascade' value of the mode parameter.

One Answer

The mode parameter simply selects how the bottleneck should be built, i.e. what type of bottleneck to use.

There is nothing fancy to it. The term mode does have a meaning in statistics, as the most frequent value of a distribution, but that usage has nothing to do with it here.

As to the difference between cascade and parallel: in the former, each layer feeds its output to the next (as in the diagram), while in the latter, all layers operate on the same input.

In general, cascade refers to applying operations sequentially (e.g. in a cascade classifier). Parallel means the operations are performed independently and could, in principle, be run simultaneously (think of parallel operations across threads). Here, the layers are parallel because they all operate on the same input; their outputs are then summed by add and passed to subsequent layers.
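
To make the wiring difference concrete, here is a minimal sketch with plain Python functions standing in for the convolution layers (the stand-in layers and input values are illustrative, not from the original script):

    def cascade(x, layers):
        outputs = []
        for layer in layers:
            x = layer(x)  # each layer consumes the previous layer's output
            outputs.append(x)
        return sum(outputs)

    def parallel(x, layers):
        # every layer sees the same original input
        return sum(layer(x) for layer in layers)

    layers = [lambda t: 2 * t, lambda t: t + 3]
    print(cascade(1, layers))   # 2 * 1 = 2, then 2 + 3 = 5; sum = 2 + 5 = 7
    print(parallel(1, layers))  # 2 * 1 = 2 and 1 + 3 = 4; sum = 6

The same input gives different results because, in the cascade, the second layer operates on the first layer's output rather than on the original input.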

Answered by Paul92 on December 8, 2020
