I'm not sure whether this issue should be filed here or against the https://github.com/spatie/robots-txt package, but since I'm using this package and it depends on spatie/robots-txt, I'll write it here.
I just discovered that all of the URLs on my website are unindexed and blocked by robots.txt. After digging into it, the only thing I found that could be overwriting the default robots.txt file in Laravel is spatie/robots-txt. I'm not using spatie/robots-txt directly; I'm only using https://github.com/roach-php.
Any help with, or confirmation of, this issue would be appreciated.
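For what it's worth, here is the minimal check I'd run first (plain PHP, assuming a standard Laravel layout with `public/robots.txt`; `https://example.com` stands in for my real domain), comparing the robots.txt on disk with what the server actually returns:

```php
<?php
// Compare the robots.txt on disk with what the web server actually serves.
// Assumes a standard Laravel app rooted at the current directory and that
// https://example.com is a placeholder for the real domain.
$onDisk = file_get_contents(__DIR__ . '/public/robots.txt');
$served = file_get_contents('https://example.com/robots.txt');

echo "On disk:\n{$onDisk}\n\nServed:\n{$served}\n\n";
echo $onDisk === $served
    ? "Identical: nothing overrides the file at request time.\n"
    : "Different: a route, middleware, or server rule overrides it.\n";
```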
I looked into the spatie package, and it only parses robots.txt; I believe it should not generate one in your webroot. Can you create a minimal reproducible example?
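For reference, the parse-only usage looks roughly like this (a minimal sketch assuming the package's documented `Robots::create()` / `mayIndex()` API; `https://example.com` is a placeholder):

```php
<?php
// Parse-only usage of spatie/robots-txt: it reads the target site's
// robots.txt and answers questions about it; it never writes or
// generates a robots.txt file anywhere.
require __DIR__ . '/vendor/autoload.php';

use Spatie\Robots\Robots;

$robots = Robots::create(); // resolves robots.txt from the URL's host
var_dump($robots->mayIndex('https://example.com/some-page')); // bool
```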