
chore: Check links with lychee #223

Open · wants to merge 2 commits into `main`
Conversation

scsmithr (Member)
Adds an npm script for running lychee against a locally running site.

https://github.com/lycheeverse/lychee

@greyscaled (Contributor) left a comment

I'd maybe add it to the check:all script as well
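For context, wiring it into `check:all` would look roughly like this in `package.json` (the script names and the existing contents of `check:all` here are made up for illustration):

```json
{
  "scripts": {
    "check:links": "lychee -v http://localhost:4001",
    "check:all": "npm run check:format && npm run check:links"
  }
}
```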

@tychoish (Collaborator)
I worry a little bit about this being flaky when run a lot if sites block robots or similar; if this ends up going in CI or blocking a deploy, that could be annoying.

@scsmithr (Member, Author)

> I'd maybe add it to the check:all script as well

Needs a running server, which I think would trigger an unexpected failure.

I wrote this script:

```bash
#!/usr/bin/env bash

# Spins up the docs site (on a non-default port) and runs lychee against it to
# check for broken links.

set -euo pipefail

bundle exec jekyll serve --port 4001 > /dev/null 2>&1 &
pid=$!

# Make sure the server gets killed even if lychee or the wait loop fails,
# since `set -e` would otherwise bail out and leave it running.
trap 'kill "$pid"' EXIT

attempts=0
until curl --output /dev/null --silent --head --fail http://localhost:4001; do
    attempts=$((attempts+1))
    if [ "$attempts" -ge 10 ]; then
        echo "max attempts exceeded"
        exit 1
    fi
    echo "waiting for server"
    sleep 1
done

lychee -v http://localhost:4001
```

Which would spin up a site on 4001 and check against that. I was going to check that in and run it in CI, but got kinda lazy with actually getting lychee installed (I'm yamled out).

> I worry a little bit about this being flaky when run a lot if sites block robots or similar; if this ends up going in CI or blocking a deploy, that could be annoying.

The goal is really to check links against our site and docs. It's easy enough to exclude external sites as necessary. I just wanted something that we could use to quickly check that our links are working, and give us confidence when moving/updating docs.
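As a sketch of the exclusion approach: lychee takes `--exclude <regex>` on the command line, and the same patterns can live in a `lychee.toml`. The hosts below are placeholder examples, not sites we actually need to skip:

```toml
# lychee.toml -- hypothetical sketch; patterns are illustrative only.
# URLs matching any of these regexes are skipped during the check.
exclude = [
  "^https://twitter\\.com",
  "^https://example\\.com",
]
```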

I plan on setting this up on glaredb.com too since we have links to docs in the blog posts, and I'd rather be proactive about updating links than not know we broke them in the first place.

@tychoish (Collaborator)

We should also have it run on a cron on the main site, just so broken docs links don't linger there too long, but this seems reasonable.
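A scheduled check along those lines could lean on the upstream lychee-action rather than installing the binary by hand. A rough sketch, where the schedule, file glob, and action versions are placeholders, not tested config:

```yaml
# .github/workflows/links.yml -- hypothetical sketch
name: Link check
on:
  schedule:
    - cron: "0 6 * * 1" # placeholder: weekly, Monday 06:00 UTC
jobs:
  links:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: lycheeverse/lychee-action@v2
        with:
          args: --verbose ./**/*.md
```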

3 participants