cd Moderator
conda env create --prefix moderator --file moderator.yaml
conda activate moderator
Install the diffusers module from GitHub.
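If init.sh does not already handle this, one common way to install it, assuming the upstream Hugging Face diffusers repository (use the project's fork instead if one is specified), is:
pip install git+https://github.com/huggingface/diffusers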
chmod +x ./init.sh
bash ./init.sh
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3
export ModeratorWordDir="<your work dir, i.e., the directory this repository was cloned into>"
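For example, if the repository was cloned into your home directory (adjust the path to your own setup):
export ModeratorWordDir="$HOME/Moderator"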
First, run the command below to start the backend:
python main_backend.py
This starts a Flask backend at http://127.0.0.1:7417/. It provides several interfaces:
- pretrain_img_generate: pass a prompt to generate images with the pretrained models. See the example in AE_policy_result.py.
- img_generate: pass a prompt to generate images with the moderated models. See the example in AE_policy_result.py.
- craft_config: pass a config to generate a policy. See the example in AE_policy_craft.py.

You can write your own scripts against these interfaces (see the sketch below), or use our frontend interface.
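A minimal sketch of calling the backend from Python, assuming the image-generation endpoints accept a JSON POST body with a `prompt` field (the exact request format may differ; see AE_policy_result.py for the project's actual usage):

```python
import requests

BASE_URL = "http://127.0.0.1:7417"

# Hypothetical payload shape; the field name "prompt" is an assumption --
# check AE_policy_result.py for the request format the project actually uses.
payload = {"prompt": "a photo of a cat wearing a hat"}

# Generate images with the pretrained models.
resp = requests.post(f"{BASE_URL}/pretrain_img_generate", json=payload)
print(resp.status_code, resp.text[:200])

# Generate images with the moderated models.
resp = requests.post(f"{BASE_URL}/img_generate", json=payload)
print(resp.status_code, resp.text[:200])
```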
Then you can access http://localhost:7417/index to use the frontend interface.