Describe the idea
Instead of worrying about hosting and a datastore, we can use TravisCI and GitHub Pages.
We can set up a cron job in TravisCI to build the project automatically every day. Each build runs the web scrapers and stores their output in individual JSON files. We can then tell TravisCI to copy those JSON files into a data directory inside the React app after it's built. Finally, the build artifacts get pushed to the gh-pages branch.
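A rough sketch of what the `.travis.yml` for this flow might look like. The script names (`scrape.js`), the `data/` path, and the Node version are assumptions for illustration, not settled choices:

```yaml
# Hypothetical sketch: scrape, build, and deploy to gh-pages.
language: node_js
node_js:
  - "10"
script:
  # Run the scrapers; each writes its output to a JSON file.
  - node scrape.js
  # Build the React app.
  - npm run build
  # Copy the scraper output into the built site.
  - mkdir -p build/data && cp *.json build/data/
deploy:
  provider: pages
  skip_cleanup: true
  github_token: $GITHUB_TOKEN  # set in the TravisCI repo settings
  local_dir: build
  target_branch: gh-pages
  on:
    branch: master
```

Note that the daily schedule itself isn't part of the YAML: cron builds are enabled per-branch in the TravisCI settings UI.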
What would be required to accomplish this?
Mostly simplifying the process! We just need to finish the crawlers and have them output JSON.
We could take this a step further. Netlify has built-in CI and integrates directly with GitHub. The downside here is that we'd have to use a shared account to manage it if we went that route.
I like the idea of just outputting JSON. How would that work for jobs? Would we just have a job that outputs new JSON each day?
Additional context
This would follow a similar process as https://github.com/OpenCLTBrigade/opencltbrigade.github.io