
BUG: Size of the latent should be the alphabet and not the number of fake words... #28

ysig opened this issue May 27, 2022 · 1 comment

ysig commented May 27, 2022

Hey, you seem to set len(self.lex) as the size of the latent vector:

len(self.lex), device=self.device,

self.lex is the lexicon, so taking its size returns the number of words in the lexicon.

But the latent vector needs a one-hot over the alphabet size.

Of course, this will still work since len(self.lex) >> len(opt.n_classes), but there is probably something wrong.
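To make the sizing issue concrete, here is a minimal sketch (the names `lex`, `alphabet`, and `one_hot` are hypothetical, not the repo's actual code): a one-hot over the lexicon is far larger than a one-hot over the alphabet, even though indexing into either still "works".

```python
# Hypothetical illustration of the bug described above.
lex = ["hello", "world", "scribble", "latent"]  # lexicon of fake words
alphabet = "abcdefghijklmnopqrstuvwxyz"          # character classes

def one_hot(index, size):
    """Return a one-hot list of the given size with a 1 at `index`."""
    vec = [0] * size
    vec[index] = 1
    return vec

# Buggy sizing: latent dimension taken from the lexicon length.
wrong = one_hot(0, len(lex))       # size 4 here; huge for a real lexicon

# Intended sizing: latent dimension taken from the alphabet (n_classes).
right = one_hot(0, len(alphabet))  # size 26, one slot per character class
```

With a real lexicon the "wrong" vector would have thousands of entries, which wastes capacity but does not crash, matching the observation that the code still runs.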

@ysig ysig changed the title Bug on the code BUG: Size of the the latent should be the alphabet and not the number of fake words... May 27, 2022

rlit commented May 31, 2022

Hi, thanks for your interest in this package.

You seem to be right, but since I am no longer able to edit this repo I cannot fix this.
@sharonFogel can you help?
