Allow the same data to be shared by multiple Upload scalars #92

Merged Aug 15, 2018 (25 commits)

Commits
7b15da6 Upgrade fs-capacitor to 1.0.0-alpha.2 (mike-marcacci, Jul 22, 2018)
a21f10c Add support for file deduplication (mike-marcacci, Jul 22, 2018)
51f8447 Add cleanup tests (mike-marcacci, Jul 23, 2018)
1058294 [email protected] (mike-marcacci, Jul 23, 2018)
132188b More robust cleanup tests (mike-marcacci, Jul 23, 2018)
5309b93 Only destroy totally unconsumed read streams (mike-marcacci, Jul 23, 2018)
c3cc78f Remove setMaxListeners (mike-marcacci, Jul 23, 2018)
b69f5ef Release resources on node <10 (mike-marcacci, Aug 2, 2018)
b2d5308 Initial mockup of new API (mike-marcacci, Aug 2, 2018)
be418a8 Match tests to new behavior (mike-marcacci, Aug 2, 2018)
1d345e1 Fix error order with node 8 (mike-marcacci, Aug 2, 2018)
c8c49d9 update readme and changelog (mike-marcacci, Aug 2, 2018)
a2a2072 Reenable max file test (mike-marcacci, Aug 2, 2018)
902b886 simplify error handling (mike-marcacci, Aug 3, 2018)
09b2610 Fix test timings on linux (mike-marcacci, Aug 3, 2018)
5a999d9 Make sure to resume request after unpipe (mike-marcacci, Aug 3, 2018)
dd9c6b3 Destroy parser before unpiping/resuming (mike-marcacci, Aug 3, 2018)
ef813ce Merge branch 'master' into dedup-files (jaydenseric, Aug 6, 2018)
b0a4da5 Tidy and typo fixes. (jaydenseric, Aug 6, 2018)
503cdc5 Changelog fixes. (jaydenseric, Aug 6, 2018)
81f7980 Test simplifications. (jaydenseric, Aug 6, 2018)
895855c Use snapshots to better test upload promise resolved enumerable prope… (jaydenseric, Aug 6, 2018)
6d9ca37 Unskip test. (jaydenseric, Aug 6, 2018)
c257701 Reordered upload properties. (jaydenseric, Aug 6, 2018)
2dafb02 Synchronous scripts and tests. (jaydenseric, Aug 6, 2018)
1 change: 1 addition & 0 deletions .prettierignore
@@ -1,2 +1,3 @@
package.json
package-lock.json
tap-snapshots
8 changes: 8 additions & 0 deletions changelog.md
@@ -2,13 +2,21 @@

## Next

### Major

- The `processRequest` function now requires a [`http.ServerResponse`](https://nodejs.org/api/http.html#http_class_http_serverresponse) instance as its second argument.
- `Upload` scalar promises now resolve with a `createReadStream` method instead of a `stream` property, via [#92](https://github.com/jaydenseric/apollo-upload-server/pull/92).

### Minor

- An `Upload` variable can now be used by multiple resolvers, via [#92](https://github.com/jaydenseric/apollo-upload-server/pull/92).
- Multiple `Upload` scalar variables can now use the same multipart data, via [#92](https://github.com/jaydenseric/apollo-upload-server/pull/92).
- Export a new `ParseUploadError` that is thrown with a `400` status when `operations` or `map` multipart fields contain invalid JSON, fixing [#95](https://github.com/jaydenseric/apollo-upload-server/issues/95).
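
To illustrate what these entries enable, here is a rough sketch (not from the package docs) of a helper that consumes one upload twice, once to store it and once to hash it; the helper name and the `/tmp` destination are invented for this example:

```js
import crypto from 'crypto'
import fs from 'fs'

// Sketch: consume a single Upload twice, which the removed `stream`
// property could not support.
const storeAndHash = async upload => {
  const { filename, createReadStream } = await upload

  // Each createReadStream() call returns an independent stream of the same data.
  const stored = new Promise((resolve, reject) =>
    createReadStream()
      .pipe(fs.createWriteStream(`/tmp/${filename}`))
      .on('finish', resolve)
      .on('error', reject)
  )

  const hashed = new Promise((resolve, reject) => {
    const hash = crypto.createHash('sha256')
    createReadStream()
      .on('data', chunk => hash.update(chunk))
      .on('end', () => resolve(hash.digest('hex')))
      .on('error', reject)
  })

  const [, sha256] = await Promise.all([stored, hashed])
  return { filename, sha256 }
}
```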

### Patch

- Updated dev dependencies.
- Removed the [`npm-run-all`](https://npm.im/npm-run-all) dev dependency and made scripts and tests sync for easier debugging, at the cost of slightly longer build times.
- Configured Prettier to lint `.yml` files.
- Ensure the readme Travis build status badge only tracks `master` branch.

12 changes: 6 additions & 6 deletions package.json
@@ -37,7 +37,7 @@
},
"dependencies": {
"busboy": "^0.2.14",
"fs-capacitor": "^0.0.3",
"fs-capacitor": "^1.0.0",
"object-path": "^0.11.4"
},
"devDependencies": {
@@ -58,23 +58,22 @@
"koa": "^2.5.2",
"lint-staged": "^7.2.0",
"node-fetch": "^2.2.0",
"npm-run-all": "^4.1.3",
"prettier": "^1.14.0",
"tap": "^12.0.1",
"watch": "^1.0.2"
},
"scripts": {
"prepare": "FORCE_COLOR=1 npm-run-all prepare:clean -p prepare:mjs prepare:js -s prepare:prettier --aggregate-output",
"prepare": "npm run prepare:clean && npm run prepare:mjs && npm run prepare:js && npm run prepare:prettier",
"prepare:clean": "rm -rf lib",
"prepare:mjs": "BABEL_ESM=1 babel src -d lib --keep-file-extension",
"prepare:js": "babel src -d lib",
"prepare:prettier": "prettier 'lib/**/*.{mjs,js}' --write",
"test": "FORCE_COLOR=1 TAP_COLORS=1 run-p test:* -c --aggregate-output",
"test": "npm run test:eslint && npm run test:prettier && npm run test:mjs && npm run test:js",
"test:eslint": "eslint . --ext mjs,js",
"test:prettier": "prettier '**/*.{json,yml,md}' -l",
"test:mjs": "node --experimental-modules --no-warnings lib/test | tap-mocha-reporter spec",
"test:js": "node lib/test | tap-mocha-reporter spec",
"prepublishOnly": "run-s prepare test",
"prepublishOnly": "npm run prepare && npm test",
"watch": "watch 'npm run prepublishOnly --silent' src --interval 1",
"precommit": "lint-staged"
},
@@ -83,7 +82,8 @@
"*.{json,yml,md}": "prettier -l"
},
"eslintIgnore": [
"lib"
"lib",
"tap-snapshots"
],
"eslintConfig": {
"extends": [
20 changes: 14 additions & 6 deletions readme.md
@@ -79,22 +79,30 @@ app.use(

### Custom middleware

Middleware wraps the async function `processRequest` which accepts a Node.js request and an optional [options object](#options) as arguments. It returns a promise that resolves an operations object for a GraphQL server to consume (usually as the request body). Import it to create custom middleware:
To create custom middleware, import the `processRequest` function:

```js
import { processRequest } from 'apollo-upload-server'
```

It has the following parameters:

- `request` ([http.IncomingMessage](https://nodejs.org/api/http.html#http_class_http_incomingmessage)): Node.js HTTP server request instance.
- `response` ([http.ServerResponse](https://nodejs.org/api/http.html#http_class_http_serverresponse)): Node.js HTTP server response instance.
- `options` ([options object](#options)): Options (optional).

It returns a promise that resolves an operations object for a GraphQL server to consume (usually as the request body).
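
For example, a minimal sketch of custom middleware for a bare Node.js HTTP server; the `executeGraphQL` step and the option values are placeholders, not part of this package:

```js
import http from 'http'
import { processRequest } from 'apollo-upload-server'

// Placeholder for whatever GraphQL execution you use downstream.
const executeGraphQL = async (body, response) => {
  response.setHeader('content-type', 'application/json')
  response.end(JSON.stringify({ data: null }))
}

http
  .createServer(async (request, response) => {
    if (
      !String(request.headers['content-type']).startsWith('multipart/form-data')
    ) {
      response.statusCode = 404
      return response.end()
    }

    try {
      // Resolves the operations object, typically used as the request body.
      const body = await processRequest(request, response, { maxFiles: 10 })
      await executeGraphQL(body, response)
    } catch (error) {
      // Assumes the upload errors carry an HTTP status, per the changelog.
      response.statusCode = error.status || 500
      response.end(error.message)
    }
  })
  .listen(3000)
```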

### `Upload` scalar

A file upload promise that resolves an object containing:

- `stream`
- `filename`
- `mimetype`
- `encoding`
- `filename` (string): File name.
- `mimetype` (string): File MIME type.
- `encoding` (string): File stream transfer encoding.
- `createReadStream` (function): Returns a readable stream of the file contents. Multiple calls create independent streams. Throws if called after all resolvers have resolved, or after an error has interrupted the request.
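
A hedged sketch of a resolver consuming this object; the `uploadFile` mutation and the local `uploads` directory are assumptions made for the example:

```js
import fs from 'fs'
import path from 'path'

const resolvers = {
  Mutation: {
    // Hypothetical `uploadFile(file: Upload!)` mutation.
    async uploadFile(parent, { file }) {
      const { filename, mimetype, encoding, createReadStream } = await file

      // Stream the upload to disk; the `uploads` directory must already exist.
      await new Promise((resolve, reject) =>
        createReadStream()
          .pipe(fs.createWriteStream(path.join('uploads', filename)))
          .on('finish', resolve)
          .on('error', reject)
      )

      return { filename, mimetype, encoding }
    }
  }
}

export default resolvers
```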

It must be added to your types and resolvers:
Add it to your types and resolvers:

```js
import { makeExecutableSchema } from 'graphql-tools'
3 changes: 1 addition & 2 deletions src/errors.mjs
@@ -24,5 +24,4 @@ export class MaxFilesUploadError extends UploadError {}
export class MapBeforeOperationsUploadError extends UploadError {}
export class FilesBeforeMapUploadError extends UploadError {}
export class FileMissingUploadError extends UploadError {}
export class UploadPromiseDisconnectUploadError extends UploadError {}
export class FileStreamDisconnectUploadError extends UploadError {}
export class DisconnectUploadError extends UploadError {}
146 changes: 78 additions & 68 deletions src/middleware.mjs
@@ -1,6 +1,6 @@
import Busboy from 'busboy'
import objectPath from 'object-path'
import Capacitor from 'fs-capacitor'
import WriteStream from 'fs-capacitor'
import {
SPEC_URL,
ParseUploadError,
@@ -9,8 +9,7 @@ import {
MapBeforeOperationsUploadError,
FilesBeforeMapUploadError,
FileMissingUploadError,
UploadPromiseDisconnectUploadError,
FileStreamDisconnectUploadError
DisconnectUploadError
} from './errors'

class Upload {
@@ -31,9 +30,18 @@ class Upload {

export const processRequest = (
request,
response,
{ maxFieldSize, maxFileSize, maxFiles } = {}
) =>
new Promise((resolve, reject) => {
let requestEnded = false
let released = false
let exitError
let currentStream
let operations
let operationsPath
let map

const parser = new Busboy({
headers: request.headers,
limits: {
@@ -44,14 +52,31 @@
}
})

let operations
let operationsPath
let map
let currentStream

const exit = error => {
reject(error)
parser.destroy(error)
if (exitError) return
exitError = error

reject(exitError)

parser.destroy()

if (currentStream) currentStream.destroy(exitError)

if (map)
for (const upload of map.values())
if (!upload.file) upload.reject(exitError)

request.unpipe(parser)
request.resume()
}

const release = () => {
if (released) return
released = true

if (map)
for (const upload of map.values())
if (upload.file) upload.file.capacitor.destroy()
}

parser.on('field', (fieldName, value) => {
@@ -104,8 +129,6 @@
for (const [fieldName, paths] of mapEntries) {
map.set(fieldName, new Upload())

// Repopulate operations with the promise wherever the file occurred
// for use by the Upload scalar.
for (const path of paths)
operationsPath.set(path, map.get(fieldName).promise)
}
@@ -119,7 +142,6 @@
if (!map) {
// Prevent an unhandled error from crashing the process.
stream.on('error', () => {})

stream.resume()

return exit(
@@ -131,69 +153,68 @@
}

currentStream = stream

stream.on('end', () => {
if (currentStream === stream) currentStream = null
})

if (map.has(fieldName)) {
const capacitor = new Capacitor()
const upload = map.get(fieldName)
if (upload) {
const capacitor = new WriteStream()

capacitor.on('error', () => {
stream.unpipe()
stream.resume()
})

stream.on('limit', () =>
stream.on('limit', () => {
if (currentStream === stream) currentStream = null
stream.unpipe()
capacitor.destroy(
new MaxFileSizeUploadError(
'File truncated as it exceeds the size limit.',
413
)
)
)
})

stream.on('error', error => {
if (capacitor.finished || capacitor.destroyed) return

// A terminated connection may cause the request to emit a 'close'
// event either before or after the parser encounters an error,
// depending on the Node.js version and the state of stream buffers.

if (
error.message ===
// https://github.com/mscdex/dicer/blob/v0.2.5/lib/Dicer.js#L62
'Unexpected end of multipart data' ||
error.message ===
// https://github.com/mscdex/dicer/blob/v0.2.5/lib/Dicer.js#L65
'Part terminated early due to unexpected end of multipart data'
)
error = new FileStreamDisconnectUploadError(error.message)
if (currentStream === stream) currentStream = null

capacitor.destroy(error)
stream.unpipe()
capacitor.destroy(exitError || error)
})

stream.pipe(capacitor)

map.get(fieldName).resolve({
stream: capacitor,
filename,
mimetype,
encoding
})
}
// Discard the unexpected file.
else {
upload.resolve(
Object.create(null, {
filename: { value: filename, enumerable: true },
mimetype: { value: mimetype, enumerable: true },
encoding: { value: encoding, enumerable: true },
createReadStream: {
value() {
const error = capacitor.error || (released ? exitError : null)
if (error) throw error

return capacitor.createReadStream()
},
enumerable: true
},
capacitor: { value: capacitor }
})
)
} else {
// Discard the unexpected file.
stream.on('error', () => {})
stream.resume()
}
})

parser.once('filesLimit', () => {
parser.once('filesLimit', () =>
exit(
new MaxFilesUploadError(`${maxFiles} max file uploads exceeded.`, 413)
)
})
)

parser.once('finish', () => {
request.unpipe(parser)
Expand All @@ -207,30 +228,19 @@ export const processRequest = (
)
})

parser.on('error', error => {
request.unpipe(parser)
request.resume()
parser.once('error', exit)

if (map)
for (const upload of map.values())
if (!upload.file) upload.reject(error)
response.once('finish', release)
response.once('close', release)

if (currentStream) currentStream.destroy(error)
request.once('end', () => {
requestEnded = true
})

request.on('close', () => {
if (map)
for (const upload of map.values())
if (!upload.file)
upload.reject(
new UploadPromiseDisconnectUploadError(
'Request disconnected before file upload stream parsing.'
)
)

if (!parser._finished)
parser.destroy(
new FileStreamDisconnectUploadError(
request.once('close', () => {
if (!requestEnded)
exit(
new DisconnectUploadError(
'Request disconnected during file upload stream parsing.'
)
)
@@ -245,7 +255,7 @@ export const apolloUploadKoa = options => async (ctx, next) => {
const finished = new Promise(resolve => ctx.req.on('end', resolve))

try {
ctx.request.body = await processRequest(ctx.req, options)
ctx.request.body = await processRequest(ctx.req, ctx.res, options)
await next()
} finally {
await finished
@@ -256,16 +266,16 @@ export const apolloUploadExpress = options => (request, response, next) => {
if (!request.is('multipart/form-data')) return next()

const finished = new Promise(resolve => request.on('end', resolve))
const { send } = response

const { send } = response
response.send = (...args) => {
finished.then(() => {
response.send = send
response.send(...args)
})
}

processRequest(request, options)
processRequest(request, response, options)
.then(body => {
request.body = body
next()