I'm not sure if it's because I'm hosting takahe under uWSGI, but no matter what I do I can't get /robots.txt to pick up the contents of the environment variable:

$ set | grep ROB
TAKAHE_ROBOTS_TXT_DISALLOWED_USER_AGENTS='["AhrefsBot","SemrushBot","MJ12bot"]'
Edit: I'm running python manage.py collectstatic --clear --noinput; python manage.py migrate as part of each update, and the template does not seem to be applied when /static/robots.txt is regenerated.
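For reference, this is the behaviour I'd expect: the variable is a JSON list of user-agent strings, and each entry should end up as a Disallow block in the rendered robots.txt. A minimal sketch of that expectation (this is illustrative only, not Takahē's actual template or parsing code; the default value and rendering are assumptions):

```python
import json
import os

# Assumed format: a JSON array of user-agent strings, e.g.
# TAKAHE_ROBOTS_TXT_DISALLOWED_USER_AGENTS='["AhrefsBot","SemrushBot","MJ12bot"]'
raw = os.environ.get(
    "TAKAHE_ROBOTS_TXT_DISALLOWED_USER_AGENTS",
    '["AhrefsBot","SemrushBot","MJ12bot"]',
)
agents = json.loads(raw)

# Render one "User-agent:" / "Disallow: /" stanza per listed agent,
# which is what I'd expect to see served at /robots.txt.
lines = []
for agent in agents:
    lines.append(f"User-agent: {agent}")
    lines.append("Disallow: /")
    lines.append("")

robots_txt = "\n".join(lines)
print(robots_txt)
```

Instead, the file served (and the one regenerated under /static/) never reflects these entries.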
Has there been any relevant routing change since #478 was accepted?