
The embedding projection #41

Open
PANXiao1994 opened this issue Aug 21, 2018 · 1 comment

Comments

@PANXiao1994

Hi, I have noticed that you have placed the input projection before the Highway Network. However, the paper states that the input to the Embedding Encoder Layer is a vector of dimension p1+p2=500 for each word, which implies that the projection comes after the Highway Network.

Have you already tried this?

@localminimum
Owner

localminimum commented Aug 26, 2018

Hi @PANXiao1994, we have tried putting the projection after the highway network. However, we found that it overfit severely and decreased performance. If anyone else observes different results, please let me know.
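For anyone comparing the two orderings, here is a minimal NumPy sketch (not the repo's actual TensorFlow code) of the difference being discussed. The dimensions follow the paper (word embedding p1=300, char embedding p2=200, model dimension d=128); the highway and projection layers here use random weights purely to illustrate where the 500→128 reduction happens in each variant.

```python
import numpy as np

rng = np.random.default_rng(0)

d_word, d_char, d_model = 300, 200, 128  # paper: p1 + p2 = 500, projected to d = 128
batch, seq_len = 2, 5

def highway(x, n_layers=2):
    """Highway network operating at x's own dimensionality."""
    d = x.shape[-1]
    for _ in range(n_layers):
        W_t = rng.standard_normal((d, d)) * 0.01
        W_h = rng.standard_normal((d, d)) * 0.01
        t = 1.0 / (1.0 + np.exp(-(x @ W_t)))  # transform gate
        h = np.maximum(x @ W_h, 0.0)          # candidate activation (ReLU)
        x = t * h + (1.0 - t) * x             # gated combination
    return x

def project(x, d_out):
    """1x1 convolution, i.e. a per-position linear projection."""
    W = rng.standard_normal((x.shape[-1], d_out)) * 0.01
    return x @ W

# Concatenated [word; char] embeddings: (batch, seq_len, 500)
emb = rng.standard_normal((batch, seq_len, d_word + d_char))

# This repo's ordering: project 500 -> 128 first, then highway at d=128.
repo_out = highway(project(emb, d_model))

# The paper's reading: highway at 500 dims, then project down to 128.
paper_out = project(highway(emb), d_model)

print(repo_out.shape, paper_out.shape)  # both end up (2, 5, 128)
```

Both variants produce the same output shape; the difference is that the paper's ordering runs the highway layers at 500 dimensions (more parameters, which may explain the overfitting noted above), while the repo's ordering runs them at 128.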
