TransWikia.com

How to Connect Convolutional layer to Fully Connected layer in Pytorch while Implementing SRGAN

Data Science Asked by LostAtlas on February 6, 2021

I was implementing SRGAN in PyTorch, but while implementing the discriminator I was confused about how to add a fully connected layer of 1024 units after the final convolutional layer. [Image: the SRGAN discriminator architecture]
My input data shape: (1, 3, 256, 256)

After passing this data through the conv layers, I get an output of shape torch.Size([1, 512, 16, 16]).
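(For reference, that activation flattens to 512 * 16 * 16 values per sample, which is the input size the first fully connected layer would need:)

```python
# Feature count entering the first fully connected layer:
# a (1, 512, 16, 16) activation flattens to 512 * 16 * 16 values per sample.
in_features = 512 * 16 * 16
print(in_features)  # 131072
```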

Code:

class Discriminator(nn.Module):
  
  def __init__(self):
    super(Discriminator,self).__init__()
    self.sm = nn.Sigmoid()
    self.net = nn.Sequential(
        nn.Conv2d(3,64,3,padding=1),
        nn.BatchNorm2d(64),
        nn.LeakyReLU(0.2),

        nn.Conv2d(64,64,3,2,padding=1),
        nn.BatchNorm2d(64),
        nn.LeakyReLU(0.2),

        nn.Conv2d(64,128,3,padding=1),
        nn.BatchNorm2d(128),
        nn.LeakyReLU(0.2),

        nn.Conv2d(128,128,3,2,padding=1),
        nn.BatchNorm2d(128),
        nn.LeakyReLU(0.2),

        nn.Conv2d(128,256,3,padding=1),
        nn.BatchNorm2d(256),
        nn.LeakyReLU(0.2),

        nn.Conv2d(256,256,3,2,padding=1),
        nn.BatchNorm2d(256),
        nn.LeakyReLU(0.2),

        nn.Conv2d(256,512,3,padding=1),
        nn.BatchNorm2d(512),
        nn.LeakyReLU(0.2),

        nn.Conv2d(512,512,3,2,padding=1),
        nn.BatchNorm2d(512),
        nn.LeakyReLU(0.2),

        nn.Linear(<ADD AN INPUT SHAPE HERE>,1024),
        nn.LeakyReLU(0.2),
        nn.Linear(1024,1)
    )
    
  def forward(self,x):
    x = self.sm(self.net(x))
    return x

One Answer

You need to flatten the tensor to shape (batch_size, n_features) before the first linear layer, e.g. with tensor.view(). In your specific case this would be x.view(x.size(0), -1), which gives 512 * 16 * 16 = 131072 input features for the first nn.Linear. Note that you cannot call view() inside a single nn.Sequential: either split the network into a convolutional part and a fully connected part and reshape in forward(), or insert an nn.Flatten() module before the first nn.Linear.

Answered by Oxbowerce on February 6, 2021
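As an illustration of the flattening step the answer describes, here is a minimal sketch. It condenses the full eight-layer conv stack of the question into a single stand-in convolution that produces the same (N, 512, 16, 16) activation from a 256x256 input; only the nn.Flatten() / nn.Linear tail is the point.

```python
import torch
import torch.nn as nn

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        # Stand-in for the question's full conv stack: a single strided conv
        # that maps (N, 3, 256, 256) to (N, 512, 16, 16), the shape reported
        # in the question. In real code, keep the original eight conv blocks.
        self.features = nn.Sequential(
            nn.Conv2d(3, 512, 3, stride=16, padding=1),
            nn.LeakyReLU(0.2),
        )
        # The fix: flatten (N, 512, 16, 16) -> (N, 512*16*16) before Linear.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(512 * 16 * 16, 1024),
            nn.LeakyReLU(0.2),
            nn.Linear(1024, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.classifier(self.features(x))
```

Equivalently, you could keep one nn.Sequential for the conv layers and call x.view(x.size(0), -1) in forward() before passing the result to the linear layers; nn.Flatten() just packages that reshape as a module so it can live inside the Sequential.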
