
Embedding with 0 masking #17

Closed
ktamiola opened this issue Jun 7, 2017 · 2 comments

ktamiola (Contributor) commented Jun 7, 2017

Just an idea: we could use this instead of the weights.

From: https://keras.io/layers/embeddings/

mask_zero: Whether or not the input value 0 is a special "padding" value that should be masked out. This is useful when using recurrent layers which may take variable length input. If this is True then all subsequent layers in the model need to support masking or an exception will be raised. If mask_zero is set to True, as a consequence, index 0 cannot be used in the vocabulary (input_dim should equal size of vocabulary + 1).
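For example, something like this (a minimal sketch with a toy model; the layer sizes, optimizer, and data are illustrative, not from this project):

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense

vocab_size = 20  # real indices are 1..20; 0 is reserved for padding/masking

model = Sequential()
# input_dim is vocab_size + 1 because index 0 is reserved for the mask
model.add(Embedding(input_dim=vocab_size + 1, output_dim=8, mask_zero=True))
# LSTM supports masking, so the zero-padded timesteps are skipped
model.add(LSTM(16))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mse')

# Two sequences padded with 0 to a common length; the zeros are masked out
x = np.array([[3, 7, 2, 0, 0],
              [5, 1, 9, 4, 0]])
print(model.predict(x).shape)  # (2, 1)
```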

jandom (Contributor) commented Jun 7, 2017

We can use it to mask out the padding and hopefully also the missing data.
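For instance (just a sketch, assuming missing points arrive as NaN and real values are already integer category ids >= 1), both padding and missing values could be collapsed onto index 0 so a single mask covers them:

```python
import numpy as np

def to_masked_indices(seq, max_len):
    """Map a variable-length sequence with NaNs (missing data) to
    fixed-length integer indices where 0 means 'masked'."""
    out = np.zeros(max_len, dtype='int32')      # 0 = padding, masked
    for i, v in enumerate(seq[:max_len]):
        out[i] = 0 if np.isnan(v) else int(v)   # NaN -> 0, also masked
    return out

print(to_masked_indices([3, np.nan, 7, 2], max_len=6))
# [3 0 7 2 0 0]
```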

ktamiola (Contributor, Author) commented Jun 7, 2017

@jandom exactly. The weights were an awesome remedy, but as you might have noticed, the networks were still trying to predict points at the boundaries, which is a bit worrisome.
