
Adding problem #5

Open
wants to merge 39 commits into base: master

Conversation

madhu-aithal (Contributor):

Implemented the adding problem and compared the results of nnRNN, RNN, LSTM and expRNN.
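For readers new to the task: the adding problem (Hochreiter & Schmidhuber, 1997) feeds the network a sequence of random values alongside a binary mask marking a few positions; the target is the sum of the marked values, which forces the model to carry information across long time gaps. Below is a minimal sketch of a generator matching the signature that appears in the diff further down; the body is an illustrative assumption, not necessarily this PR's implementation.

import numpy as np
import torch

def adding_problem_generator(N, seq_len=6, high=1, number_of_ones=2):
    # Random values in [0, high) for each of N sequences.
    X_num = np.random.uniform(low=0, high=high, size=(N, seq_len, 1))
    # Binary mask marking `number_of_ones` positions per sequence.
    X_mask = np.zeros((N, seq_len, 1))
    Y = np.zeros((N, 1))
    for i in range(N):
        positions = np.random.choice(seq_len, size=number_of_ones, replace=False)
        X_mask[i, positions] = 1
        Y[i, 0] = X_num[i, positions].sum()
    # Concatenate values and mask along the feature axis, as in the hunk below.
    X = np.append(X_num, X_mask, axis=2)
    return torch.FloatTensor(X), torch.FloatTensor(Y)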

madhu-aithal and others added 30 commits December 2, 2019 19:40
@KyleGoyette (Owner) left a comment:

Please clean up unused prints and commented lines.

# print("self.UppT: ", self.UppT)
# print("self.M: ", self.M)
# print("self.alphas[0]: ", self.alphas[0].size())
# print("self.thetas: ", self.thetas)
@KyleGoyette (Owner):

Please remove the commented-out prints.

@@ -84,6 +86,45 @@ def adding_problem_generator(N, seq_len=6, high=1, number_of_ones=2):
X = np.append(X_num, X_mask, axis=2)
return torch.FloatTensor(X), torch.FloatTensor(Y)


def generate_copying_sequence(T, labels, c_length):
@KyleGoyette (Owner):

What is this used for in the adding problem?

self.log_P.grad = (frechet_deriv - frechet_deriv.t()).triu(diagonal=1)
optimizer.step()
self.P.data = self._B(False)
self.P.grad.data.zero_()

# def get_theta_list(self):
# return self.theta_list

@KyleGoyette (Owner):

Remove the commented-out functions and test prints.
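For context on the snippet above: expRNN-style models keep the recurrent matrix orthogonal by storing only strictly upper-triangular parameters, building a skew-symmetric matrix from them, and applying the matrix exponential; the .triu(diagonal=1) projection restricts the gradient to those free parameters. A minimal sketch of the parametrization follows. The function name is hypothetical, and torch.matrix_exp requires a newer PyTorch than the torch==1.1.0 pinned in this PR, where the repo presumably uses its own expm routine.

import torch

def orthogonal_from_log_params(log_P):
    # Keep only the strictly upper-triangular entries as free parameters.
    T = log_P.triu(diagonal=1)
    # A = T - T^T is skew-symmetric, so exp(A) is orthogonal:
    # exp(A) @ exp(A).T = exp(A) @ exp(-A) = I.
    A = T - T.t()
    return torch.matrix_exp(A)

# Sanity check (hypothetical usage): the result should be orthogonal.
P = orthogonal_from_log_params(torch.randn(4, 4))
assert torch.allclose(P @ P.t(), torch.eye(4), atol=1e-5)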

# print("y: ", y.size(), y[0], x[0])
# print("onehot", args.onehot)
# print("x: ", x.size(), "y: ", y.size())
# print("x batch size: ", x.shape[1])
@KyleGoyette (Owner):

Remove unnecessary prints

print(name, param.data)
# for name, param in net.named_parameters():
# if param.requires_grad:
# print(name, param.data)
@KyleGoyette (Owner):

Remove the commented-out lines.

@@ -159,7 +161,7 @@ def forward(self, input,hidden):
# Load data
###############################################################################

-corpus = Corpus('./data/pennchar/')
+corpus = Corpus('//home/madhusudhan/ptb-data/')
@KyleGoyette (Owner):

Please remove references to your own file system
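One standard fix is to expose the data location as a command-line flag instead of editing the source; the --data-dir name below is an assumption, and the repo may already provide something similar.

import argparse

parser = argparse.ArgumentParser()
# Default to the repo-relative path so no user-specific paths get committed.
parser.add_argument('--data-dir', type=str, default='./data/pennchar/')
args = parser.parse_args()

# Corpus is the repo's existing data loader, as used in the diff above.
corpus = Corpus(args.data_dir)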

@@ -305,9 +307,9 @@ def train(optimizer, orthog_optimizer):
try:
exp_time = "{0:%Y-%m-%d}_{0:%H-%M-%S}".format(datetime.now())
SAVEDIR = os.path.join('./saves',
-    'sMNIST',
+    'language-task',
@KyleGoyette (Owner):

I think this is fixed in the newest master version. Merge master again.

scipy==1.3.1
torch==1.1.0
numpy==1.16.4
tensorboardX==1.9
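Assuming these pins live in the usual requirements.txt at the repo root, they can be installed with pip install -r requirements.txt.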
@KyleGoyette (Owner):

Awesome!
