Investigate and create robots.txt #99
Labels: documentation, enhancement, low impact
Google publishes its own robots.txt at https://www.google.com/robots.txt.
A robots.txt file tells web crawlers which parts of a site may or may not be collected. It could also serve as a quick way for people to see which endpoints are available.
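A minimal sketch of what such a file could look like, served at the site root as `/robots.txt`. The paths below are hypothetical placeholders, not this project's actual endpoints:

```
# Apply to all crawlers
User-agent: *

# Hypothetical routes that should not be crawled
Disallow: /admin/
Disallow: /api/

# Everything else is crawlable
Allow: /

# Optional: point crawlers at a sitemap, if one exists
# Sitemap: https://example.com/sitemap.xml
```

Note that robots.txt is advisory only: well-behaved crawlers honor it, but it is not an access control mechanism, and listing paths in it publicly reveals that they exist.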