
Variational Autoencoder (VAE) latent features

Data Science · Asked on March 8, 2021

I'm new to deep learning and I'm working on a VAE for biomedical images. I need to extract relevant features from CT scans, so I first built a plain autoencoder and then a VAE. My doubt is that I don't know which layer I should extract the features from. My idea is to use the features produced by the layers that compute the mean and variance (before the reparameterization trick), but I think the layer before those could also be suitable. Here is the code of the encoder part:

```python
import tensorflow as tf

class Sampling(tf.keras.layers.Layer):
    """Uses (z_mean, z_log_var) to sample z, the latent code vector."""
    def call(self, inputs):
        z_mean, z_log_var = inputs
        batch = tf.shape(z_mean)[0]
        dim = tf.shape(z_mean)[1]
        # Reparameterization trick: z = mu + sigma * epsilon, epsilon ~ N(0, I)
        epsilon = tf.random.normal(shape=(batch, dim))
        return z_mean + tf.exp(0.5 * z_log_var) * epsilon

def Encoder():
    inp = tf.keras.Input(shape=(32, 256, 256, 1))  # depth was 64 previously

    enc = tf.keras.layers.Conv3D(16, (2, 2, 2), activation='relu', padding='same')(inp)
    enc = tf.keras.layers.MaxPooling3D((2, 2, 2), padding='same')(enc)

    enc = tf.keras.layers.Conv3D(32, (2, 2, 2), activation='relu', padding='same')(enc)
    enc = tf.keras.layers.MaxPooling3D((2, 2, 2), padding='same')(enc)

    enc = tf.keras.layers.Conv3D(64, (2, 2, 2), activation='relu', padding='same')(enc)
    enc = tf.keras.layers.MaxPooling3D((2, 2, 2), padding='same')(enc)

    enc = tf.keras.layers.Conv3D(32, (2, 2, 2), activation='relu', padding='same')(enc)
    enc = tf.keras.layers.MaxPooling3D((2, 2, 2), padding='same')(enc)

    enc = tf.keras.layers.Conv3D(16, (2, 2, 2), activation='relu', padding='same')(enc)
    enc = tf.keras.layers.MaxPooling3D((2, 2, 2), padding='same')(enc)

    # VAE latent code
    latent_code = tf.keras.layers.Flatten()(enc)
    latent_code = tf.keras.layers.Dense(256, activation='relu')(latent_code)
    # Linear (no) activation here: the mean and log-variance must remain
    # unconstrained, so a ReLU would wrongly clip them to non-negative values.
    latent_mu = tf.keras.layers.Dense(32)(latent_code)     # was 10 previously
    latent_sigma = tf.keras.layers.Dense(32)(latent_code)  # log-variance; was 10 previously

    # Reparameterization trick
    z = Sampling()([latent_mu, latent_sigma])
    encoder = tf.keras.Model(inp, [latent_mu, latent_sigma, z], name='encoder')
    return encoder

```
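
For reference, below is a minimal sketch of how features could be pulled out of this encoder once it has been trained inside the full VAE; it is not from the original post. Using `latent_mu` (the first output of the model above) is a common choice, because it is a deterministic summary of the input, whereas `z` is resampled on every forward pass. The `scan_batch` array is a hypothetical placeholder for a batch of preprocessed CT volumes.

```python
# Minimal sketch, assuming the encoder defined above has already been
# trained as part of the full VAE.
import numpy as np
import tensorflow as tf

encoder = Encoder()  # in practice, load the trained encoder weights here

# Hypothetical placeholder: a batch of preprocessed CT volumes,
# shape (n, 32, 256, 256, 1).
scan_batch = np.random.rand(2, 32, 256, 256, 1).astype('float32')

# Option 1 (common choice): use the posterior mean as the feature vector.
# Unlike z, it is deterministic, so the same scan always yields the same features.
latent_mu, latent_log_var, z = encoder.predict(scan_batch)
features = latent_mu  # shape (n, 32)

# Option 2: take features from the layer *before* the mu/log-var heads,
# i.e. the Dense(256) bottleneck, exposed through a sub-model.
# In this particular graph, that layer sits at index -4 of encoder.layers.
bottleneck = tf.keras.Model(encoder.input, encoder.layers[-4].output)
pre_latent_features = bottleneck.predict(scan_batch)  # shape (n, 256)
```

Either layer is defensible: `latent_mu` gives a compact, KL-regularized representation, while the `Dense(256)` bottleneck retains more capacity but less structure.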
