Logs
```
Tue Sep 5 15:16:57 2017[1,37]<stdout>:Test with Pass 2, Cost 4.149080, {}
Tue Sep 5 15:17:00 2017[1,34]<stdout>:Test with Pass 2, Cost 4.421460, {}
Tue Sep 5 15:17:02 2017[1,46]<stdout>:Test with Pass 2, Cost 4.200185, {}
Tue Sep 5 15:17:08 2017[1,39]<stdout>:Test with Pass 2, Cost 4.398165, {}
Tue Sep 5 15:17:09 2017[1,17]<stdout>:Test with Pass 2, Cost 4.678599, {}
```
Network configuration
```python
import paddle.v2 as paddle

def fc_net(dict_dim, class_dim=2):
    """
    DNN network definition.

    :param dict_dim: size of word dictionary
    :type dict_dim: int
    :param class_dim: number of instance classes
    :type class_dim: int
    """
    # input layers
    data = paddle.layer.data(
        "word", paddle.data_type.sparse_binary_vector(dict_dim))
    label = paddle.layer.data(
        "label", paddle.data_type.dense_vector(1))

    # hidden layer
    h_size = 128
    h = paddle.layer.fc(input=data, size=h_size, act=paddle.activation.Tanh())

    # output layer: a single linear unit trained with a smooth L1 cost
    output = paddle.layer.fc(input=h, size=1, act=paddle.activation.Linear())
    cost = paddle.layer.smooth_l1_cost(input=output, label=label)

    return cost, output, label
```
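Continuing from the snippet above, a minimal usage sketch of how this network would be instantiated and fed; the vocabulary size, toy reader, and sample values are assumptions for illustration, not from the original report:

```python
# paddle.init must run before any layers are built (assumed single-CPU setup)
paddle.init(use_gpu=False, trainer_count=1)

# build the network; dict_dim=10000 is a placeholder vocabulary size
cost, output, label = fc_net(dict_dim=10000)

# Each sample pairs a list of word indices (consumed by the
# sparse_binary_vector layer) with a 1-element dense label vector.
def train_reader():
    samples = [([1, 5, 9], [0.0]), ([2, 3], [1.0])]  # toy data, not real corpus
    def reader():
        for word_ids, lbl in samples:
            yield word_ids, lbl
    return reader

# map data-layer names to positions within each sample tuple
feeding = {"word": 0, "label": 1}
```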
Training parameters
```python
# create parameters from the cost topology
parameters = paddle.parameters.create(cost)

# create optimizer
adagrad_optimizer = paddle.optimizer.DecayedAdaGrad(
    learning_rate=0.01,
    regularization=paddle.optimizer.L2Regularization(rate=0.01),
    rho=0.95,
    epsilon=1e-6)

# create trainer
trainer = paddle.trainer.SGD(
    cost=cost,
    parameters=parameters,
    update_equation=adagrad_optimizer)
```
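For context, a hedged sketch of the training loop that would emit log lines of the form shown above ("Test with Pass N, Cost X, {}"); the batch size, pass count, and reuse of the toy reader as a test reader are assumptions:

```python
def event_handler(event):
    # At the end of each pass, evaluate on held-out data; this print
    # produces the "Test with Pass N, Cost X, {}" lines seen in the log
    # ({} is result.metrics, empty when no evaluators are configured).
    if isinstance(event, paddle.event.EndPass):
        result = trainer.test(
            reader=paddle.batch(train_reader(), batch_size=128),  # stand-in for a real test reader
            feeding=feeding)
        print("Test with Pass %d, Cost %f, %s" %
              (event.pass_id, result.cost, result.metrics))

trainer.train(
    reader=paddle.batch(train_reader(), batch_size=128),
    event_handler=event_handler,
    feeding=feeding,
    num_passes=3)
```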
This issue doesn't belong in this repo.