Implement bulk moderation actions #4654
Conversation
I bumped up the priority to 🟧 high because the issue it resolves (#3840) is blocking three others.
@dhruvkb all of the core moderation actions worked as described 🎉. This is awesome! As a future moderator I can see this being really useful. I found one thing that seems broken: the "Resolved" filtered view. Also, I don't feel fully qualified to evaluate the Python code for performance or for idiomatic Django, so you might want to ping another reviewer for those concerns. But I am happy to approve this once you clarify the "Resolved" filtered view. Thanks!
Great work here. There are a ton of things happening, so it's incredible to see the view workflows going smoothly.
As the test user you provided in the instructions, I couldn't see the bulk moderation operations, as expected ✔️ So to try the bulk actions, I assume the user must be granted the specific permissions, right?
I was able to mark images as sensitive and deindex them in bulk with the original deploy user. I have yet to try the other actions, but given the many complex operations involving ES, groups of media, and specific user permissions, it seems this work requires some automated testing. At the very least, it would be prudent to test the critical bulk functions.
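To make the testing concern concrete, a bulk mark-sensitive action ultimately boils down to something like the Elasticsearch bulk update sketched below. This is only an illustration: the index name, sensitivity field, and client setup are assumptions, not the actual Openverse implementation.

```python
# Illustrative sketch of a bulk "mark sensitive" update; index name, field
# name, and client URL are assumptions, not the real Openverse code.
from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

es = Elasticsearch("http://localhost:9200")

def mark_sensitive(identifiers, index="image"):
    """Flag many documents as sensitive in a single bulk request."""
    actions = (
        {
            "_op_type": "update",
            "_index": index,
            "_id": identifier,
            "doc": {"mature": True},  # hypothetical sensitivity field
        }
        for identifier in identifiers
    )
    bulk(es, actions)  # raises BulkIndexError if any per-document update fails
```

Because a single call touches many documents, a bug here silently affects whole batches, which is why automated coverage for these paths matters.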
Aside from that, this looks promising! I'll come back to this later.
Note that bulk moderation will not resolve any open reports. It is up to the moderator to manually link open reports with the appropriate decisions and resolve them.
This is a surprise to me; I didn't think the IP defined it this way (I must admit I have to read it more carefully). It sounds like a lot of manual work that could be taken off the moderators' hands 🤔
Yes, this is a bit of manual work that can be automated if we auto-close reports. However, there are two main issues preventing that.
The main case I can think of that we need to avoid is marking an image as sensitive based on one report, while another report (which gets closed for having the same reason) actually contains information or context that would lead us to deindex the image.
— @zackkrida
We also need to keep in mind reversing a bulk action: if creating one results in reports being resolved, reversing it ought to reopen them.
— @stacimc
With this limitation in mind, I decided to keep the reports unchanged for now and potentially revisit this in the future.
Ideally, with moderation happening regularly, the need to triage multiple concurrent reports of a single work will be very infrequent.
Does that mean we should implement report closing functionality as a part of bulk moderation? It can be pretty complex in terms of logic so I would ideally like that to be a different issue-PR pair to avoid complicating this PR further.
I agree that it would increase complexity, and this PR is already well-rounded. Good to know there was a previous consensus 👍
@zackkrida I just confirmed that it works as I expect. To check that view, open a media item from the "All" view, create a report for it, and create a decision for that report. Now that media item will appear in "Resolved".
@krysal, yes, the bulk moderation capability is available only to members of the "Content Moderators" group. It is determined by whether they have the "Can add image decision" and "Can add audio decision" permissions.
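For reference, that gate can be expressed as a small Django permission check like the sketch below; the "api" app label and the exact codenames are assumptions based on Django's default add_<model> naming, not a copy of the Openverse code.

```python
# Rough sketch of the permission gate described above; the app label and
# permission codenames are assumptions, not the actual Openverse identifiers.
def can_bulk_moderate(user) -> bool:
    """Bulk actions are shown only to users who hold both decision permissions."""
    return user.has_perms(["api.add_imagedecision", "api.add_audiodecision"])
```

Members of the "Content Moderators" group would satisfy this check through the permissions granted to the group.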
I agree with you there. I have added unit tests for the ES bulk update code in ede4e38 and for the
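To illustrate the kind of coverage added, a test in that spirit could look like the sketch below; build_mark_sensitive_actions is a hypothetical pure helper standing in for the actual code in ede4e38.

```python
# Illustrative test only; the helper under test is hypothetical and merely
# mirrors the shape of an Elasticsearch bulk "mark sensitive" payload.
def build_mark_sensitive_actions(identifiers, index="image"):
    return [
        {"_op_type": "update", "_index": index, "_id": i, "doc": {"mature": True}}
        for i in identifiers
    ]

def test_every_identifier_gets_one_update_action():
    identifiers = ["uuid-1", "uuid-2", "uuid-3"]
    actions = build_mark_sensitive_actions(identifiers)
    assert [a["_id"] for a in actions] == identifiers
    assert all(a["doc"] == {"mature": True} for a in actions)
```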
Approving based on the functionality, and the test logic looks good. Again, I don't feel the most qualified to speak to the specific code patterns or Django best practices.
This is working like a charm! I tested all the cases, and the only thing I noticed is that, when marking as sensitive, the media gets a value of ["user_reported_sensitivity"] in "unstable__sensitivity" even when that didn't come from a user report, but that is a minor thing that can be addressed separately.
Excellent work 💯
Co-authored-by: Krystle Salazar <[email protected]>
Fixes
Fixes #3840 by @stacimc
Fixes #3841 by @stacimc
Partially addresses #3842 by @stacimc
Partially addresses #3843 by @stacimc
Description
This PR
Testing Instructions
Note that while the instructions only say images, the same steps can be taken for audio models.
Permissions
We must test that bulk moderation actions are only visible to moderators and not to other staff members.
just api/ipython
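As a minimal sketch of the setup (usernames and passwords are placeholders; the group name comes from the discussion above; this assumes the stock django.contrib.auth models, whereas the project may use a custom user model), the shell session could create one moderator and one plain staff user:

```python
# Run inside the Django shell opened by `just api/ipython`.
from django.contrib.auth.models import Group, User

moderators, _ = Group.objects.get_or_create(name="Content Moderators")

mod = User.objects.create_user("test-moderator", password="change-me", is_staff=True)
mod.groups.add(moderators)

# A staff user outside the group should not see the bulk moderation actions.
plain_staff = User.objects.create_user("test-staff", password="change-me", is_staff=True)
```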
Marking sensitive
true.
Re-marking sensitive
Deindexing
Unmarking sensitive
false.
Reindexing
History
moderator should be you.
notes should be the content entered on the confirmation page.
Checklist
My pull request has a descriptive title (not a vague title like Update index.md).
My pull request targets the default branch of the repository (main) or a parent feature branch.
I ran the DAG documentation generator (ov just catalog/generate-docs for catalog PRs) or the media properties generator (ov just catalog/generate-docs media-props for the catalog or ov just api/generate-docs for the API) where applicable.
Developer Certificate of Origin