
spatie/robots-txt overwrites default Laravel robots.txt #77

Open
Xoshbin opened this issue Dec 7, 2022 · 1 comment

Comments


Xoshbin commented Dec 7, 2022

I'm not sure whether this issue should be reported here or in the https://github.com/spatie/robots-txt package, but since I'm using this package and it depends on spatie/robots-txt, I'll write it here.
I just discovered that none of the URLs on my website are indexed; they are all blocked by robots.txt. After digging in, the only thing I found that could be overwriting Laravel's default robots.txt file is spatie/robots-txt. I'm not using spatie/robots-txt directly, only https://github.com/roach-php.
Any help with, or confirmation of, this issue would be appreciated.
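
For reference, a stock Laravel application ships a permissive public/robots.txt roughly like the first block below, while a robots.txt that blocks all crawling looks like the second (both are illustrative sketches, not files copied from this project):

```txt
# Laravel's default public/robots.txt – allows crawlers to index everything
User-agent: *
Disallow:

# A fully blocking robots.txt – disallows every path for every crawler
User-agent: *
Disallow: /
```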


amenk commented Jun 20, 2023

I looked into the spatie package, and it is only for parsing robots.txt; I don't believe it should generate one in your webroot. Can you create a minimal reproducible example?
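
For reference, a minimal sketch of how spatie/robots-txt is typically used, purely to parse and query robots rules (class and method names as shown in the package's README; verify against your installed version):

```php
<?php

require 'vendor/autoload.php';

use Spatie\Robots\Robots;

// Parse the target site's robots.txt (and robots meta tags) and ask
// whether a given URL may be indexed. The package only reads these
// rules; it does not write a robots.txt into the webroot.
$robots = Robots::create();

if ($robots->mayIndex('https://example.com/some-page')) {
    echo "Crawlers are allowed to index this page.\n";
}
```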
