Allow individually publishable API client libraries. #1431
Conversation
Thanks for this 🎉 We should make sure we don't need to use

```js
var vision = require('google-cloud-vision');
var visionClient = vision();

// or more likely, some opts will be required:
var vision = require('google-cloud-vision');
var visionClient = vision({ projectId: '...', keyFilename: '...' });

// and what I would personally do to make it a one-liner:
var vision = require('google-cloud-vision')({ projectId: '...', keyFilename: '...' });
```
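A minimal sketch of how a package entry point could support both the two-step and the one-liner `require` styles above. The `Vision` constructor and its options are illustrative assumptions, not the library's actual implementation:

```javascript
// Sketch: the package's entry module exports a factory function, so both
// vision() and require(...)() styles work. Names here are illustrative.
function Vision(options) {
  // Allow invocation without `new`:
  if (!(this instanceof Vision)) {
    return new Vision(options);
  }
  options = options || {};
  this.projectId = options.projectId;
  this.keyFilename = options.keyFilename;
}

// In a real package this file would be the "main" entry in package.json:
module.exports = Vision;

// Consumer-side usage (as if via require('google-cloud-vision')):
var visionClient = Vision({ projectId: 'my-project' });
console.log(visionClient instanceof Vision); // → true
console.log(visionClient.projectId);         // → my-project
```

Because the factory returns `new Vision(options)` when called without `new`, `require('google-cloud-vision')({ ... })` collapses into a single expression.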
Ah yes, we don't need
Can you write up a little "Here's how this all works" bit? Also, are there any similarities between the process outlined here and how this PR works?
Will do.
In my PR I didn't really address the publish process as discussed in #1411. My PR is more about the organizational changes required in the code itself to facilitate individually publishable API packages. I'll explain this better in the summary I will add to my PR description.
I updated the PR description.
Open questions are:
A couple of thoughts:

- I think we are going to go with scoped packages (https://docs.npmjs.com/misc/scope).
- We should be able to lose the `var vision = require('google-cloud-vision');` form. Is anything complicating that?
- I think the umbrella package should just be the next minor bump from where we are when this PR lands, as opposed to something like 1.0.0-beta. For the module packages, I would start out at 1.0.0 on each.
I'd like Travis to be involved, but pushing to

The manual release process should look like this:

**Release the umbrella package**

```sh
# Start from freshly installed dependencies:
rm -rf node_modules && npm install

# Check for faulty code:
npm run lint && npm run test && npm run system-test

# Create a git tag and increment package.json:
npm version major

# Publish to npm from a clean directory:
# (`rm -rf node_modules` is to work around an npm publish bug)
rm -rf node_modules && npm publish

# Push to GitHub:
git push origin master --follow-tags
```

**Release a module package**

```sh
# Start from a clean sub-module directory:
cd lib/bigquery && rm -rf node_modules && npm install

# Check for faulty code:
npm run lint && npm run test && npm run system-test

# Create a git tag and increment package.json:
npm version major

# Publish to npm from a clean directory:
# (`rm -rf node_modules` is to work around an npm publish bug)
rm -rf node_modules && npm publish

# Push to GitHub:
git push origin master --follow-tags

# Because it was a major increment with breaking changes,
# we have to manually update the umbrella package's dependency
# on it:
cd ../../ # back to the umbrella package directory
npm uninstall --save @google-cloud/bigquery
npm install --save @google-cloud/bigquery
git commit -am 'Update @google-cloud/bigquery'

# Release the umbrella package as:
# - a minor bump -- if the umbrella package is pre-1.0
# - a major bump -- if the umbrella package is post-1.0
# [ steps outlined in previous "Release the umbrella package" section ]
```

Once we decide on a versioning pattern (what the umbrella package and each module package starts at), a more specific guide can be whipped up.
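The umbrella-bump rule described above (a breaking change in a sub-package means a minor bump while the umbrella is pre-1.0, a major bump afterwards) can be sketched as a tiny helper. This is purely an illustration of the policy, not part of the PR:

```javascript
// Sketch of the versioning policy discussed above: when a sub-package ships
// breaking changes, the umbrella package bumps minor if it is pre-1.0,
// and major once it has reached 1.0. Illustrative only.
function nextUmbrellaVersion(current) {
  var parts = current.split('.').map(Number);
  var major = parts[0];
  var minor = parts[1];
  if (major < 1) {
    // pre-1.0: breaking change => minor bump
    return [major, minor + 1, 0].join('.');
  }
  // post-1.0: breaking change => major bump
  return [major + 1, 0, 0].join('.');
}

console.log(nextUmbrellaVersion('0.37.2')); // → 0.38.0
console.log(nextUmbrellaVersion('1.4.1'));  // → 2.0.0
```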
I'll fix the package names.
Well, some submodules need access to constructors from other modules. If the overall service constructor is the exported module, ES2015 module syntax allows something like:

```js
// export {
//   Storage as default,
//   Storage,
//   File
// }
exports.default = Storage;
exports.Storage = Storage;
exports.File = File;
```

So you could do:

```js
import { File, Storage } from '@google-cloud/storage';

const storage = Storage({
  // config
});
```

or

```js
import Storage from '@google-cloud/storage';
import { File } from '@google-cloud/storage';

const storage = Storage({
  // config
});
```

or in CommonJS:

```js
const CloudStorage = require('@google-cloud/storage');
const { File, Storage } = CloudStorage;

const storage = Storage({
  // config
});
```

In general I prefer exporting constructors/factory functions, and letting the user instantiate, rather than the act of importing a module having side effects (a magically instantiated client with default options). Thoughts?
I see that as our problem, one we shouldn't make the user pay for. While gross, I'd prefer exposing the sub-type constructors on the parent type.
It wouldn't. The way to instantiate the module would be to invoke the function returned by the require:

```js
var vision = require('@google-cloud/vision');
var visionClient = vision();

// (aka)
var vision = require('@google-cloud/vision')();
```
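One way to reconcile the two preferences above — a callable module plus sub-type constructors reachable from the parent type — is sketched below. The `Storage`/`File` names and the attachment style are assumptions for illustration, not the PR's actual code:

```javascript
// Sketch: the module's default export stays a callable factory, while
// sub-type constructors (e.g. File) hang off the parent constructor.
function Storage(options) {
  if (!(this instanceof Storage)) {
    return new Storage(options);
  }
  this.options = options || {};
}

function File(name) {
  this.name = name;
}

// Expose the sub-type on the parent type:
Storage.File = File;

module.exports = Storage;

// Consumer-side usage (as if via require('@google-cloud/storage')):
var storage = Storage({ projectId: 'my-project' });
var file = new Storage.File('photo.png');
console.log(storage.options.projectId); // → my-project
console.log(file.name);                 // → photo.png
```

This keeps `require('@google-cloud/storage')()` working while letting code that needs `File` reach it without a separate named export.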
Okay, I'll switch away from named exports.
```diff
@@ -21,6 +21,19 @@
   },
   "excludeFiles": [
     "system-test/data/*",
-    "test/testdata/*"
+    "test/testdata/*",
+    "lib/bigquery/node_modules/",
```
We originally used proxyquire, but switched away for some reason. As long as it's working now, I'm glad to ditch
```diff
@@ -34,6 +34,7 @@ function runCodeInSandbox(code, sandbox) {
       timeout: 1000
     });
   } catch(err) {
+    console.log(err.stack);
```
Subscribed. Right now, to use gcloud.compute, I need to install all the other APIs as well. It would be awesome if I could install only gcloud-compute with gce-images.
Thanks for the updates! 🍻 @callmehiphop what are your thoughts on the un-checked TODOs from the opening post? My thoughts: #1431 (comment)
I am working on a CI system right now, where the release flow could be a separate task, connected to the GitHub context or not. If you're up for trying it, I can invest my time to set up a release-process example on a fork and we can move from there. By the way: the release process could be done from a user-defined system via SSH. Let me know what you think.
@stephenplusplus Do you think I should further modify the directory structure to be something like what Babel has?
@jmdobry that looks nice to me. I like keeping the tests close to the source. If someone is contributing a Bigtable bug fix, they wouldn't have to run 1,000 unrelated tests just to see if their change works. And working in an isolated context from the rest of the code sounds better to me. If you're cool with it, I'm totally on board. @Gormartsen that sounds pretty interesting. I'm down for taking a look at an example. Worth mentioning the project owners (cc: @jgeewax) would have to be involved in the decision to bring in a new tool.
@jmdobry on the versioning, JJ advises to stay in
Okay.

Re-organized into package folders.
```diff
@@ -7,6 +7,15 @@

+## Testing
+
+### Setup
```
Is the linking script going to block out any contributors on Windows machines?

I found a really neat way to make the shell scripts cross-platform: https://github.com/shelljs/shelljs

@stephenplusplus is this just waiting on the updated docs stuff?

That's a better question for @jmdobry :)

I think if the docs are the only thing holding this PR back, we should just merge this in and get any pending PRs resolved. If necessary we can time the release with the docs being finished. I have a feeling our JSON docs script is going to need quite a bit of work to accommodate the modular docs and I'd hate to delay this more than we need to.

Sure, sounds good to me. Whenever @jmdobry gives the go-ahead, we can merge. The only thing that might be left is the integration of shelljs so Windows users can contribute.
```diff
-var env = require('./env');
 var Dataset = require('../src/dataset.js');
 var Table = require('../src/table.js');
+var env = require('../../system-test/env.js');
```
I'll finalize this today so it can get merged.
* Re-organize into package folders. (#1)
Done, rebased against

The scripts I added are:

Sorry if my rebasing made life harder for other PRs based on this PR.
Woo!

I think this means that it will also run when users

No problem, I knew I was in for it :)

^ Ignore me, I got my

Looks good to me. Made a PR with a couple tweaks that this needed to run locally: https://github.com/jmdobry/gcloud-node/pull/2

We found a Contributor License Agreement for you (the sender of this pull request), but were unable to find agreements for the commit author(s). If you authored these, maybe you used a different email address in the git commits than was used to sign the CLA (login here to double check)? If these were authored by someone else, then they will need to sign a CLA as well, and confirm that they're okay with these being contributed to Google.

Just running some tests locally, then merging in. Yay!

Merged in via 1ccd918. Thank you for all of the help! 👍 🎉 The docs are still in progress, but should be completed shortly. We'll start firing off the modules then. MODULES!!!
Lots of discussion floating around on why a PR like this is necessary, like GoogleCloudPlatform/gcloud-common#138.

## Summary of changes

**tl;dr:** Move the code for each API client into its own publishable package folder within this repository (rename `lib/` to `packages/`), change a lot of relative import paths to use named package imports, and switch to a `@google-cloud/*` naming scheme.

### Long version

What is this PR? Right now `gcloud-node` combines all API client library code into a single installable NPM package. This means that to update client code for an individual API, the entire `gcloud-node` package must be version bumped and re-published.

This PR makes it possible for each sub-folder within the `packages/` folder to be published to NPM as its own package. Currently, `lib/index.js` imports all of the `lib/` sub-folders using relative paths, pulling that code directly into the published `gcloud-node` package. Furthermore, individual `lib/` sub-folders sometimes import code from each other using relative paths. In order for `packages/vision/` to be cleanly publishable as an individual NPM package, any code it imports from outside of itself (e.g. `require('../storage/file.js')` or `require('../lib/common/util.js')`) needs to be changed to named package imports (e.g. `require('@google-cloud/storage').File` or `require('@google-cloud/common').util`). This decouples the versioning and publishing of sub-packages from one another.

With this change, some individual packages will be free to move forward toward 1.0.0 while others might need to stay back at alpha or beta versions. The actual version number of `gcloud-node` becomes less meaningful, as now the "big" package will just depend on a bunch of individually versioned and published packages.

Before

After

or

or

etc.

### List of changes

- `package.json`
  - Change `"name"` from `"gcloud"` to `"google-cloud"`
  - Change `"dependencies"` to depend on `@google-cloud/storage`, `@google-cloud/pubsub`, etc.
- `lib/index.js`
  - Change `require` calls from using relative paths to using package names, e.g. `require('./storage/index.js')` => `require('google-cloud-storage')`
- Added `package.json` files for each `lib/<folder>`
  - Each `package.json` "version" field is set to 0.1.0
- Changed imports of common (`./lib/common`) to use `@google-cloud/common`, e.g. `require('../lib/common/utils.js')` => `require('@google-cloud/common').util`
  - `@google-cloud/common` is turned into a module, rather than dependents importing file paths
- `lib/`
  - There was an issue with `mockery` that doesn't like the symbolic links that are now necessary for local development, so I switched to `proxyquire`, which works fine with symbolic links
- `scripts/`
  - `link.sh`, which sets up the symbolic links necessary for local development
  - `unlink.sh`, the reverse of `link.sh`
  - A `publish.sh` script, referenced in each sub-package `package.json` file, which can publish a sub-package to NPM with the appropriate legal files included