How to compute the F1 score for test_GRU #133

Open
ghost opened this issue Feb 22, 2021 · 2 comments

@ghost

ghost commented Feb 22, 2021

Hello, based on how the code processes the test set, suppose
test_sen = {(entity 1, entity 2): [[sentence 1], [sentence 2], [sentence 3]]}
test_ans = {(entity 1, entity 2): [1,0,0,1,0,0,0,0,0,0,0,1]}
How should the F1 score be computed in this case?
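
To make the question concrete, here is a minimal sketch (placeholder code, not from this repository) of the kind of computation I have in mind, treating each entity pair as one multi-label sample; model_predict_probs and the 0.5 threshold are purely illustrative assumptions:

import numpy as np
from sklearn.metrics import f1_score

y_true, y_pred = [], []
for pair, sentences in test_sen.items():
    gold = np.array(test_ans[pair])          # multi-hot gold vector, e.g. [1,0,0,1,...]
    probs = model_predict_probs(sentences)   # hypothetical helper: one probability per relation class
    pred = (probs >= 0.5).astype(int)        # threshold into a multi-hot prediction
    y_true.append(gold)
    y_pred.append(pred)

# multi-label macro F1 over all relation classes
macro_f1 = f1_score(np.vstack(y_true), np.vstack(y_pred), average="macro", zero_division=0)
print(macro_f1)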

@crownpku
Owner

Have a look at the post below and see whether it helps:
http://www.crownpku.com/2020/03/02/Evaluation-of-Named-Entity-Recognition-Model.html

@ghost
Author

ghost commented Feb 22, 2021

Have a look at the post below and see whether it helps:
http://www.crownpku.com/2020/03/02/Evaluation-of-Named-Entity-Recognition-Model.html

Hello, I have read the post you recommended. If I process the test-set data into the same format as the training data:
test_sen = {entity pair: [[[label1-sentence 1], [label1-sentence 2], ...], [[label2-sentence 1], [label2-sentence 2], ...]]}
test_ans = {entity pair: [label1, label2, ...]}
and then compute the F1 score with the following code:
f1 = f1_score(np.argmax(test_y, axis=1), predictions, labels=np.array(range(len(relation2id))), average="macro")
would this way of processing the test set to compute the F1 score be feasible?
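
For reference, here is a self-contained sketch of that computation; relation2id, test_y, and predictions below are dummy stand-ins for what the evaluation loop would actually produce:

import numpy as np
from sklearn.metrics import f1_score

relation2id = {"NA": 0, "rel_1": 1, "rel_2": 2}       # dummy mapping, for illustration only
test_y = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]])  # one-hot gold labels
predictions = np.array([0, 1, 1])                     # predicted class indices

f1 = f1_score(np.argmax(test_y, axis=1),              # one-hot -> class indices
              predictions,
              labels=np.array(range(len(relation2id))),
              average="macro")
print(f1)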
