This repository was archived by the owner on Feb 7, 2025. It is now read-only.
```python
num_tokens=256, # must be equal to num_embeddings input of VQVAE
```
In the tutorial, the VQVAE is defined to have 256 codes in the latent space. The problem is that the Transformer is also defined to have 256.
GenerativeModels/tutorials/generative/2d_vqvae_transformer/2d_vqvae_transformer_tutorial.py
Line 328 in f6b7dce
In this case, I think the transformer should actually have 256 (from the VQVAE) + 1 (for the begin of sentence token). @Ashayp31 do you agree this is the case here?
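A minimal sketch of the reasoning above (toy indices only, not the MONAI GenerativeModels API): the VQVAE codebook produces indices 0..255, and prepending a begin-of-sentence token requires one extra index, so the transformer's vocabulary must cover 257 tokens.

```python
num_embeddings = 256          # VQVAE codebook size: valid indices are 0..255
bos_token = num_embeddings    # BOS takes the next free index, 256

# A toy sequence of quantized latent indices as produced by the VQVAE encoder.
latent_indices = [3, 17, 255, 42]

# Prepend the BOS token before feeding the sequence to the transformer.
transformer_input = [bos_token] + latent_indices

# The embedding table must cover every index that can appear,
# so num_tokens must be num_embeddings + 1 = 257, not 256.
num_tokens = max(transformer_input) + 1
print(num_tokens)  # 257
```

With `num_tokens=256`, the BOS index 256 would be out of range for the transformer's embedding layer.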