
Project: Automated gh-pages publishing/pins to IPFS #371

Open
mikeal opened this issue Nov 28, 2018 · 6 comments
Comments

@mikeal
Contributor

mikeal commented Nov 28, 2018

This is a project the Community WG would like to tackle in 2019 as part of evangelizing IPFS with developers.

The idea is simple: a bot (probot), or possibly a new "GitHub Action", that we run and that anyone can easily add to their repository, which publishes the content of gh-pages to IPFS on any successful merge into that branch.

There is already a big world of automated tools for dynamically building sites from a master branch and publishing the dist to gh-pages; it's even built into Travis. We can piggyback on all of that work and add publishing/pinning into the decentralized web.

It may sound like a big deal to be pinning things indefinitely for people, but we can do what GitHub does for gh-pages and just limit the size of the archives.
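The size limit described above could be enforced before pinning. A minimal sketch in Python, assuming a hypothetical cap of 1 GB (roughly GitHub Pages' published limit; the function names here are illustrative, not part of any existing tool):

```python
import os

# Hypothetical cap, mirroring GitHub Pages' ~1 GB site limit.
MAX_SITE_BYTES = 1 * 1024 ** 3

def site_size_bytes(root: str) -> int:
    """Total size of all files under the gh-pages build directory."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            total += os.path.getsize(os.path.join(dirpath, name))
    return total

def within_pin_limit(root: str, limit: int = MAX_SITE_BYTES) -> bool:
    """Decide whether the site is small enough to pin indefinitely."""
    return site_size_bytes(root) <= limit
```

A publishing bot would run this check on the built site before handing it to `ipfs add`, and refuse oversized archives.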

@AuHau
Member

AuHau commented Jan 29, 2019

Hey there,

I had exactly this need a few weeks ago, so I started a small project, a sort of "IPFS CD publishing service": https://github.com/AuHau/ipfs-cd-publish

It is still WIP, but hopefully this week it will be polished enough for a first release. It is meant as a self-hosted tool, definitely designed for a small scale of a few repos, nothing like "general hosting of gh-pages".

I would be interested in helping with this! Let me know if you would like my help.
Most probably my approach won't really work for your case, but I wouldn't mind starting over.

@parkan

parkan commented Jan 30, 2019

Seems like a great fit for a GitHub integration, similar to what Zeit does.

@mikeal
Contributor Author

mikeal commented Jan 31, 2019

I’ve been messing around with GitHub Actions and I think we should definitely build this as a GitHub Action :)

The biggest thing to figure out is how we handle long-term storage: are we providing storage for anyone, how much storage per site, how are we tracking the storage per site across different publishes, etc.

The actual publishing part is quite simple with GitHub Actions once we have a one-liner that can pin the resources permanently somewhere.
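Tracking storage per site across publishes, as raised above, could be as simple as a quota ledger on the pinning side. A toy sketch, assuming a hypothetical `StorageLedger` (nothing here is an existing API):

```python
from collections import defaultdict

class StorageLedger:
    """Toy per-repo storage tracker; names are illustrative.

    Each publish records the pinned size, and the ledger answers whether
    a repo is still under its quota across all of its publishes."""

    def __init__(self, quota_bytes: int):
        self.quota = quota_bytes
        self.usage = defaultdict(int)  # repo -> total pinned bytes

    def record_publish(self, repo: str, size_bytes: int) -> bool:
        """Record the pin and return True if the repo stays under quota;
        otherwise leave usage unchanged and return False."""
        if self.usage[repo] + size_bytes > self.quota:
            return False
        self.usage[repo] += size_bytes
        return True
```

A real service would persist this state and decide what to do when a repo exceeds its quota (reject the publish, or garbage-collect old pins first).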

@AuHau
Member

AuHau commented Jan 31, 2019

Also, there is the question of how the published content is supposed to be consumed. If it is consumed via DNSLink, then there needs to be either an option for updating the TXT record or publishing under an IPNS name.

Updating the TXT record would be possible but a bit tricky because of the big variety of DNS providers. It could also be left up to the user to write a script that updates it (not really user-friendly).

I think a better approach would be using IPNS, which the user can then set up once and use forever. Keys could be stored as an Action's secret. But there is the problem of IPNS record expiration: records would have to be renewed periodically. Do GitHub Actions support periodic invocation?
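The two consumption paths above can be sketched in a few lines. A DNSLink TXT record has the form `dnslink=/ipfs/<cid>`, and an IPNS renewal job needs to decide when a record is due for republishing. A minimal sketch, assuming the go-ipfs default record lifetime of 24 hours and a hypothetical half-lifetime renewal policy:

```python
from datetime import datetime, timedelta
from typing import Optional

def dnslink_txt(cid: str) -> str:
    """TXT record value DNSLink expects, e.g. 'dnslink=/ipfs/Qm...'."""
    return f"dnslink=/ipfs/{cid}"

def ipns_renewal_due(published_at: datetime,
                     lifetime: timedelta = timedelta(hours=24),
                     now: Optional[datetime] = None) -> bool:
    """True once more than half the record's lifetime has elapsed,
    leaving a safety margin before the IPNS record expires."""
    now = now or datetime.utcnow()
    return now - published_at > lifetime / 2
```

A scheduled job (if GitHub Actions supports periodic invocation) would call `ipns_renewal_due` and rerun `ipfs name publish` when it returns True.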

@mikeal
Contributor Author

mikeal commented Feb 1, 2019

I think we should separate the IPNS actions from the IPFS publish actions.

A simple IPFS publish/pin action could produce a gateway URL. That would be enough for people doing a build and publish on every commit in every branch. You could then build a separate Action that updates IPNS only on merges into master.
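Producing the gateway URL from the pinned root CID is the trivial last step of such an action. A one-line sketch, assuming the public ipfs.io gateway (any gateway with the standard `/ipfs/<cid>` path layout would work):

```python
def gateway_url(cid: str, gateway: str = "https://ipfs.io") -> str:
    """Public gateway URL for a pinned site root."""
    return f"{gateway}/ipfs/{cid}/"
```

The action would print or comment this URL on the commit, giving a per-publish preview link much like Zeit's deployments.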

@AuHau
Member

AuHau commented Feb 3, 2019

That is definitely a good idea, keeping IPNS as a separate action.

I was thinking about how you could approach the storage problem, and it boils down to two approaches.

The first one is very similar to the one I took with my project: have a tool with an API endpoint that is invoked on every new build. Upon invocation it would clone the repo, check its size and any other constraints you want to enforce, and then add it to IPFS and return the hash. This has the advantage that you could automatically garbage-collect old versions of the sites (if you wanted to), but on the other hand it is not a really decentralized solution.

The second way I can think of is having IPFS in a Dockerfile and letting the GitHub Action itself add the content to IPFS; then there would be an endpoint on your side only for pinning the added content. This approach has some more open questions: how long will cloning the data to your cluster take? How do you ensure the pinning service is not misused? How do you garbage-collect old data, or will you pin everything "forever"? And so on...
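The garbage-collection question in the first approach above has a simple baseline policy: keep the N most recent pinned versions of each site and unpin the rest. A sketch, with all names hypothetical:

```python
from typing import List

def select_unpins(pins: List[str], keep_latest: int = 3) -> List[str]:
    """Given one site's pinned CIDs ordered oldest -> newest, return the
    CIDs to garbage-collect, keeping only the `keep_latest` newest."""
    if keep_latest <= 0:
        return list(pins)
    return list(pins[:-keep_latest])
```

The service would run `ipfs pin rm` on the returned CIDs and let the node's own GC reclaim the space; keeping a couple of old versions preserves rollback while bounding storage growth.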
