Replaying POST requests #693
FWIW, the Cache API does not currently allow POST requests to be stored. You will get a TypeError.
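To make that failure mode concrete, here is a minimal sketch of the spec's cacheability rule as a standalone helper (an assumption for illustration: `assertCacheable` is our own name, not a platform API — in a real service worker the `TypeError` is thrown by `cache.put()` itself):

```js
// Mirror of the Cache API rule: only GET requests may be stored.
// `assertCacheable` is a hypothetical helper, not part of the Cache API.
function assertCacheable(request) {
  if (request.method !== "GET") {
    // This mirrors the TypeError that cache.put() rejects with for non-GET requests.
    throw new TypeError(`Cannot cache ${request.method} requests`);
  }
  return request;
}
```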
In browsers that support background sync, you probably want them to handle the replaying.
However, I agree that Request/Response should be able to go into IDB, as long as …
I think we need the fetch body stream fully spec'd, and something like whatwg/streams#276, before that could be solved. If the streamed body could be transferred, then in theory a structured clone for Request/Response could be defined.
@jakearchibald @wanderview thanks!
Once streams are fully specced we can start to think about how Request/Response could go into IDB.
Mozilla's Service Worker Cookbook example simply stores requests into localForage (which presumably picks its IDB backend): https://serviceworke.rs/request-deferrer_service-worker_doc.html. I'd be afraid of adding nasty complexity trying to wrangle POST into caching.
I have come across the scenarios below.
I am not sure why both issues are considered in the same manner. If they aren't, please help me find an alternative approach for scenario 2.
You should be able to store the body of the request into IDB or similar.
If we wanted to provide a convenience for this use case, maybe we could add a queue-style API:

```js
let c = await caches.open("foo");

// add something to a queue
await c.enqueueRequest(FetchEvent.req);

// pop the next item off the queue
let req = await c.dequeueRequest(url);

// drain the entire queue in one go
let reqList = await c.drainRequests(url);
```

This would avoid the matching problem. All the POSTs match the same URL/vary combination. We just don't overwrite. Instead we queue the requests and don't have a response associated at all.
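The proposed semantics can be illustrated with a plain in-memory stand-in (a sketch only: `enqueueRequest`/`dequeueRequest`/`drainRequests` are the hypothetical names from the comment above, implemented here over a `Map` rather than the real Cache API):

```js
// In-memory illustration of the proposed queue semantics: requests with the
// same URL don't overwrite each other; they queue up in FIFO order.
class RequestQueue {
  constructor() {
    this.queues = new Map(); // url -> array of Requests, oldest first
  }

  enqueueRequest(request) {
    const list = this.queues.get(request.url) ?? [];
    list.push(request);
    this.queues.set(request.url, list);
  }

  dequeueRequest(url) {
    // Pop the oldest request for this URL; undefined when the queue is empty.
    return (this.queues.get(url) ?? []).shift();
  }

  drainRequests(url) {
    // Take everything queued for this URL in one go.
    const list = this.queues.get(url) ?? [];
    this.queues.delete(url);
    return list;
  }
}
```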
POST would be handy for GraphQL users.
Pre-TPAC thoughts: …
Any progress on this? The value of service workers is limited if they are restricted to GETs. A modern web app is rarely simply used for retrieving information, so POST, PATCH, PUT, DELETE should be something such an app can handle even while offline. |
@simondrabble sorry for the delay in picking this up. Can you talk a little bit about how you'd use this feature, and why you can't store the blob in IDB along with the header metadata?
Fwiw, you can do:

```js
const url = request.url;
const method = request.method;
const headers = [...request.headers];
const body = await request.blob();
const idbData = { url, method, headers, body };
```

…and you can now store `idbData` in IDB. To rebuild the request later:

```js
const { url, method, headers, body } = idbData;
const request = new Request(url, { method, headers, body });
```

That should cover most cases.
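The deconstruct/reconstruct dance above can be wrapped into a pair of helpers (a sketch under the same assumptions: `serializeRequest` and `reviveRequest` are hypothetical names, and the serialized object is what you would put into IDB):

```js
// Turn a Request into a plain, structured-cloneable object for IDB.
async function serializeRequest(request) {
  return {
    url: request.url,
    method: request.method,
    headers: [...request.headers],
    body: await request.blob(), // Blobs can be stored in IDB directly
  };
}

// Rebuild a Request from the stored object.
function reviveRequest({ url, method, headers, body }) {
  return new Request(url, { method, headers, body });
}
```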
@jakearchibald one problem with using IDB is that it's much more complicated than the simplicity of CacheStorage. Also, having to deconstruct request objects and reconstruct them later is very fiddly and error-prone.

A simpler (yet very hacky) approach I found was to store all my POST requests in a separate cache storage instance and just change the method on the way in and out:

```js
const cache = await caches.open(ACTION_CACHE);
cache.put(new Request(request, { method: "GET" }), new Response());
```

Retrieving them all for processing later looks like:

```js
const cache = await caches.open(ACTION_CACHE);
const actionQueue = await cache.keys();

// cache.keys doesn't return requests in date order, so we need to sort them
actionQueue.sort((a, b) => getRequestTime(a) - getRequestTime(b));

return actionQueue.map(request => new Request(request, { method: "POST" }));
```

I've not tested this extensively, but it works for my particular use case (my requests are all very simple and don't have custom bodies or headers). Hope someone finds that useful. I know this is abusing CacheStorage for something it's not made for. But CacheStorage is the closest thing I've found to a persistent FIFO queue of native Request objects.
If we wanted something more convenient than IDB, I think a new, separate "request queue" API would be more appropriate than using cache_storage. BTW, your snippet above will not preserve request bodies.
I guess you could store that in the response body in the cache, but it all feels very hacky compared to using IDB. @lucas42 Have you seen https://www.npmjs.com/package/idb for making IDB easier?
I had not, thanks! Any library with Douglas Adams references in its examples is definitely worth investigating 😄
@jakearchibald I ended up going a slightly different route and breaking apart the request to store in IDB. Not super clean by my standards, but it works. How I'd use the feature is pretty much analogous to GET requests, but it's been a few months since I've worked on that section of code and I don't recall much of the specifics of how I was trying to use ServiceWorker.
Let's consider an offline-first survey application.
When a user performs GET requests, the SW will respond with cached content.
But when the user performs a POST request, the situation gets complicated. I want to:
How, in your opinion, should replaying requests be performed? I would like the ServiceWorker to be a programmable network proxy, so I would like the ServiceWorker to handle the whole process - the application should just send one request and not care any more. But sending a POST request is meant to change something on the server - we have to make the user (application) aware of the fact that it failed but was successfully cached.
Jeff's response to this issue is OK, but he also suggests that the application should be responsible for replaying the requests - and I believe it'd be better to separate these layers (for Background Sync).
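The flow described above could be sketched in the service worker like this (an illustration only: `saveToQueue` is a hypothetical function, e.g. the IDB storage discussed earlier, and the 202 response is just one possible way to tell the app the POST was queued rather than delivered):

```js
// Only POSTs are candidates for offline queueing in this sketch.
function isReplayable(request) {
  return request.method === "POST";
}

// Guard so the snippet also loads outside a service worker context.
if (typeof self !== "undefined" && "addEventListener" in self) {
  self.addEventListener("fetch", (event) => {
    if (!isReplayable(event.request)) return; // GETs fall through to cache logic
    event.respondWith(
      fetch(event.request.clone()).catch(async () => {
        await saveToQueue(event.request); // hypothetical: persist for later replay
        // Signal "accepted but not delivered" so the app can inform the user.
        return new Response(JSON.stringify({ queued: true }), { status: 202 });
      })
    );
  });
}
```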