[Issue #2841] Make the robots.txt dynamic so that it can mutate per environment (#3379)

## Summary

Fixes #2841

### Time to review: __5 mins__

## Changes proposed

We need to make the robots.txt dynamic per environment so that we can ban crawling of the lower environments but still support nuanced rules for the upper environments.

## Context for reviewers

As a follow-up change we will also make the Sitemap dynamic, since we don't want to point to Sitemaps outside of Production. Our Sitemap isn't currently ready to be utilized, so it is not included for now.

## Additional information

Tested this locally and confirmed that we do have a dynamic robots.txt that reacts to changes to the environment variable at runtime, not build time. It is not certain that this will all hook together as expected when it runs in AWS, so this tests for "dev" temporarily until we prove it is all likely to work in Prod.
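The follow-up dynamic Sitemap described above could reduce to a small pure function like the sketch below. This is a hypothetical illustration, not part of this commit: the `"prod"` environment value, the `SitemapEntry` shape, and the `sitemapFor`/`baseUrl` names are all assumptions.

```typescript
// Hypothetical sketch of the planned per-environment sitemap gating.
// The "prod" value, SitemapEntry shape, and baseUrl are assumptions.
type SitemapEntry = { url: string; lastModified: Date };

function sitemapFor(env: string, baseUrl: string): SitemapEntry[] {
  // Lower environments advertise no URLs, so crawlers are never
  // pointed at anything outside of Production.
  if (env !== "prod") {
    return [];
  }
  return [{ url: `${baseUrl}/`, lastModified: new Date() }];
}
```

Returning an empty array in lower environments keeps the route itself deployable everywhere while only Production exposes real entries.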
Showing 4 changed files with 47 additions and 20 deletions.
This file was deleted.
```typescript
// initial rules were absorbed from the static robots.txt file

import type { MetadataRoute } from "next";
import { environment } from "src/constants/environments";

export const dynamic = "force-dynamic";

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      // Switching this to "dev" temporarily to ensure that the variable is
      // being set at AWS runtime as expected; will make it "prod" after
      // confirming in Dev.
      environment.ENVIRONMENT === "dev"
        ? {
            userAgent: "*",
            allow: "/",
            disallow: [
              // Don't disallow /search for now: without a sitemap it's
              // Google's only way of finding things. Search is a high-cost,
              // low-information subset of the opportunity page data, which
              // will also be available via the Sitemap (soon).
              // "/search",
              // Prevent crawling of Next.js build files.
              "/_next/",
              "/_next*",
              "/img/",
              "/*.json$",
              "/*_buildManifest.js$",
              "/*_middlewareManifest.js$",
              "/*_ssgManifest.js$",
              "/*.js$",
              // Prevent crawling of Next.js API routes.
              "/api/",
              // Prevent crawling of static assets in the public folder.
              "/public/",
            ],
          }
        : {
            userAgent: "*",
            disallow: "/",
          },
    ],
    // our sitemap isn't ready yet
    // sitemap: "https://acme.com/sitemap.xml",
  };
}
```
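Because `export const dynamic = "force-dynamic"` opts the route out of static rendering, `robots()` is evaluated per request, which is why the rule set can flip when the environment variable changes at runtime rather than at build time. The switch itself reduces to a small pure function; the sketch below is illustrative only, with a `crawlableEnv` parameter and a trimmed disallow list that are assumptions, not part of the committed file.

```typescript
// Minimal sketch of the environment switch from robots() above; the
// crawlableEnv parameter and the trimmed disallow list are illustrative.
type RobotsRule = {
  userAgent: string;
  allow?: string;
  disallow: string | string[];
};

function rulesFor(env: string, crawlableEnv: string): RobotsRule[] {
  if (env === crawlableEnv) {
    // Crawling allowed, but keep build output and API routes hidden.
    return [
      { userAgent: "*", allow: "/", disallow: ["/_next/", "/api/", "/public/"] },
    ];
  }
  // Every other environment gets a blanket disallow.
  return [{ userAgent: "*", disallow: "/" }];
}
```

Keeping the decision in a pure function like this would also make the "dev"-vs-"prod" toggle described in the commit message a one-argument change.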