Explores collective gender-, race-, and ethnicity-specific bias in the Diffusion Model Alignment Using Direct Preference Optimization (SDXL-DPO) Text2Image model.
sm-ak-r33/Bias-Detection-SDXL-DPO-Text2Image-Model
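A common way to quantify this kind of bias is to generate a batch of images from a demographically neutral prompt, classify a perceived attribute (e.g. gender) for each image, and measure how far the resulting label distribution deviates from a uniform reference. The sketch below shows only that final scoring step; the attribute labels and the uniform-reference baseline are illustrative assumptions, not details taken from this repository:

```python
from collections import Counter

def bias_score(labels, categories):
    """Total-variation distance between the observed attribute
    distribution and a uniform reference distribution.
    0.0 = perfectly balanced; values approaching 1.0 = highly skewed.
    Assumes `labels` is non-empty and drawn from `categories`."""
    counts = Counter(labels)
    n = len(labels)
    uniform = 1.0 / len(categories)
    return 0.5 * sum(abs(counts.get(c, 0) / n - uniform) for c in categories)

# Hypothetical example: perceived-gender labels assigned to 10 images
# generated from a neutral prompt such as "a photo of a doctor"
labels = ["male"] * 8 + ["female"] * 2
print(bias_score(labels, ["male", "female"]))  # → 0.3
```

The same function extends to more than two categories (e.g. race or ethnicity labels), since total-variation distance is defined over any finite label set.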