Replies: 3 comments 2 replies
-
It could be due to caching. We cache this page for 24 hours by default, so you won't see the changes until the next caching cycle. Is this the case?

```typescript
export const loader = ({request}: LoaderArgs) => {
  const url = new URL(request.url);
  return new Response(robotsTxtData({url: url.origin}), {
    status: 200,
    headers: {
      'content-type': 'text/plain',
      // Cache for 24 hours.
      'cache-control': `max-age=${60 * 60 * 24}`,
    },
  });
};
```
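If you want edits to show up immediately while developing, one option is to skip the cache outside production. A minimal self-contained sketch (the `LoaderArgs` type and `robotsTxtData` helper here are simplified stand-ins for the real ones in the Hydrogen route, not the actual implementations):

```typescript
// Simplified stand-ins for illustration only.
type LoaderArgs = {request: Request};
const robotsTxtData = ({url}: {url: string}) =>
  `User-agent: *\nSitemap: ${url}/sitemap.xml`;

export const loader = ({request}: LoaderArgs) => {
  const url = new URL(request.url);
  // Guarded access so this also compiles without Node type definitions.
  const isProd =
    (globalThis as any).process?.env?.NODE_ENV === 'production';
  return new Response(robotsTxtData({url: url.origin}), {
    status: 200,
    headers: {
      'content-type': 'text/plain',
      // Cache for 24 hours in production, never in development.
      'cache-control': isProd ? `max-age=${60 * 60 * 24}` : 'no-store',
    },
  });
};
```

With `no-store` in development, each request to `/robots.txt` re-renders the file instead of serving a cached copy.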
-
The solution was simply to switch the Hydrogen site to a production deployment. By default, Oxygen prevents crawling of the site when it's in test mode.
-
Here is the relevant documentation: https://shopify.dev/docs/storefronts/headless/hydrogen/seo#robots-txt. Shopify states: "If you make a non-production deployment accessible with a shareable link or an auth bypass token, then Oxygen overrides the deployment's robots.txt file with a disallow rule for all bots and crawlers. This prevents exposing content prematurely, as well as potential SEO harm from duplicated content."
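In practice, a "disallow rule for all bots and crawlers" like the one described would look roughly like this (a sketch of the typical blanket-disallow form, not the verbatim file Oxygen serves):

```
User-agent: *
Disallow: /
```

So if you see this at your deployment's `/robots.txt` instead of your own file, it's a sign the deployment isn't a production one.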
-
Hello, I'm using the default robots.txt from the current version of the Hydrogen storefront codebase, without modifications.
When I run on localhost and go to localhost:3000/robots.txt I can see the file properly.
However, when I go to the site's Hydrogen storefront URL, I'm getting this.
I'm quite new to SEO, and since this is an important piece of our site's functionality, I wanted to check how we could fix it.