Convert to GAPIC #35
Conversation
Force-pushed from 8ab32e1 to aa42d08
Codecov Report

@@           Coverage Diff            @@
##           master      #35    +/-  ##
========================================
  Coverage     100%     100%
========================================
  Files           9        9
  Lines        1004     1199   +195
========================================
+ Hits         1004     1199   +195

Continue to review full report at Codecov.
|
@kolodny this is still a work in progress, but would you mind trying to run the "rows" system tests? The problem seems to be that where we received a string version of a Buffer before, we now receive a Buffer directly from GAPIC. For example, this is a row that is handled by ChunkTransformer, before and after:

// Before
{ chunks:
[ { row_status: null,
rowKey: 'YWxpbmNvbG4=',
familyName: [Object],
qualifier: [Object],
timestampMicros: '1518711386138000',
labels: [],
value: 'AAAAAAAAAAE=',
valueSize: 0,
resetRow: false,
commitRow: false },
{ row_status: null,
rowKey: '',
familyName: null,
qualifier: [Object],
timestampMicros: '1518711386138000',
labels: [],
value: 'AAAAAAAAAAE=',
valueSize: 0,
resetRow: false,
commitRow: false },
{ row_status: 'commitRow',
rowKey: '',
familyName: null,
qualifier: [Object],
timestampMicros: '1518711386138000',
labels: [],
value: 'AAAAAAAAAAE=',
valueSize: 0,
resetRow: false,
commitRow: true } ],
lastScannedRowKey: '' }
// After
{ chunks:
[ { labels: [],
rowKey: <Buffer 61 6c 69 6e 63 6f 6c 6e>,
familyName: [Object],
qualifier: [Object],
timestampMicros: '1518711494344000',
value: <Buffer 00 00 00 00 00 00 00 01>,
valueSize: 0 },
{ labels: [],
rowKey: [],
familyName: null,
qualifier: [Object],
timestampMicros: '1518711494344000',
value: <Buffer 00 00 00 00 00 00 00 01>,
valueSize: 0 },
{ labels: [],
rowKey: [],
familyName: null,
qualifier: [Object],
timestampMicros: '1518711494344000',
value: <Buffer 00 00 00 00 00 00 00 01>,
valueSize: 0,
commitRow: true,
rowStatus: 'commitRow' } ],
  lastScannedRowKey: [] }

There are lots of methods that seem to handle converting between strings and Buffers, and I'm wondering if these are still necessary, or if this new response type is problematic for other reasons. Please take a look and feel free to make any corrections necessary. Your help will surely be quicker and more accurate. Let me know if you have any questions about the new GAPIC layer or run into other issues. |
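For illustration only (the helper below does not exist in this repo), a small shim that accepts either representation could look like this; it assumes the pre-GAPIC strings were base64-encoded bytes, which matches the payloads above ('YWxpbmNvbG4=' is the base64 form of 'alincoln'):

function ensureBuffer(value) {
  if (Buffer.isBuffer(value)) {
    // GAPIC already hands us a Buffer; pass it through.
    return value;
  }
  if (typeof value === 'string') {
    // Pre-GAPIC responses delivered bytes as base64-encoded strings.
    return Buffer.from(value, 'base64');
  }
  // Empty rowKey / lastScannedRowKey values arrive as [] in the new responses.
  return Buffer.from(value || []);
}

ensureBuffer('YWxpbmNvbG4=').toString();          // 'alincoln' (old string shape)
ensureBuffer(Buffer.from('alincoln')).toString(); // 'alincoln' (new Buffer shape)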
@ajaaym, can you also please take a look? |
Force-pushed from aa42d08 to 30dd868
src/table.js (Outdated)

    currentRetryAttempt: numRequestsMade,
  },
};
// @todo Figure out how to do this in gapic.
Force-pushed from 30dd868 to 970aa04
@@ -127,6 +127,11 @@ class BigtableInstanceAdminClient {
    'nextPageToken',
    'appProfiles'
  ),
  listClusters: new gax.PageDescriptor(
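For context, the truncated descriptor above presumably continues with the usual page-token and resource field names; this is a guess modeled on the appProfiles descriptor just above it, not the actual generated code:

  listClusters: new gax.PageDescriptor(
    'pageToken',
    'nextPageToken',
    'clusters'
  ),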
Force-pushed from 7b06281 to e6a7b39
I've updated the PR as far as I think I can go without some help:
|
@stephenplusplus I believe you needed to fake out slightly more than just setTimeout. To be honest, I'm not sure why the read-rows tests work. I have a proof of concept at kolodny@eaff5c3. I made the patching a little more aggressive, but I'm not sure that's the best approach either. We can always remove the fake timers completely, but we'd need to set the mocha timeout to at least a minute to take the exponential backoffs into account. Let me know if that unblocks the system tests. |
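As a rough illustration of the kind of change being discussed (not the actual commit; the sinon fake-timer setup here is an assumption about how the tests could fake more than setTimeout):

const sinon = require('sinon');

let clock;

beforeEach(function() {
  // Fake the functions involved in retry scheduling so the exponential
  // backoff delays resolve instantly instead of blowing past mocha's timeout.
  clock = sinon.useFakeTimers({toFake: ['setTimeout', 'setImmediate']});
});

afterEach(function() {
  clock.restore();
});

// Individual tests would still need clock.tick(ms) or clock.runAll() to
// flush the scheduled retries.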
The patch concept is fine with me, although could you write a Windows-compliant one? I'm not sure exactly what the resulting patched file is meant to look like. |
Sure, f6d6dea...kolodny:pr-35 has a cross-platform patching mechanism. It just replaces the file (wherever it finds it) with the patched version. |
Patching is probably the wrong term for it now since we're overwriting the file completely, so feel free to rename it to "fixed" or something similar. |
Thanks for doing that. I've tried applying the changes from that commit, but end up with 102 errors from the system test run that look like:
I've confirmed that the patched file does overwrite the originals inside the node_modules directory. |
Strange, it works for me. What is

And if that is also found, can you try just doing something simple with:

var processNextTick = require('process-nextick-args');
processNextTick(console.log, 'test'); |
Yep.

Yep.

> require('process-nextick-args')(console.log, 'test')
undefined
> test

I noticed that the top of _stream_readable.js requires:

var processNextTick = require('process-nextick-args').nextTick;

But our patch doesn't export a nextTick property:

module.exports = function() {
  return process.nextTick.apply(this, arguments);
}; |
Oh sorry, maybe I answered question 2 incorrectly. It does require the module, but it looks for the nextTick export. |
Interesting, it looks like the |
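For illustration, a shim that satisfies both call styles seen above (calling the module directly, and reading its .nextTick property as _stream_readable.js does) could export both; this is only a sketch, not what was actually committed:

function nextTick() {
  return process.nextTick.apply(this, arguments);
}

// Old callers invoke the module itself; newer readable-stream code reads
// require('process-nextick-args').nextTick, so expose both.
module.exports = nextTick;
module.exports.nextTick = nextTick;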
I deleted the patching pieces, although I can't be sure that was a good move. The system tests began to pass again, up until |
Weird, on my system I only get the "bad" version of it. Try changing the patch code to something like this:

const fs = require('fs');
const glob = require('glob');
const patched = `${__dirname}/patched-process-nextick-args.js`;
glob.sync('./**/process-nextick-args/package.json').forEach(file => {
  const majorVersion = require(file).version.slice(0, 1);
  if (majorVersion <= 1) {
    fs.copyFileSync(patched, file);
  }
}); |
I had to re-work the script a bit to get it to work, but the same early exiting without any output happens for me. Revised script:

const fs = require('fs');
const glob = require('glob');
const path = require('path');
const patched = `${__dirname}/patched-process-nextick-args.js`;
glob.sync('./**/process-nextick-args').forEach(pnaDirectoryPath => {
  const packageJson = require(path.join('../', pnaDirectoryPath, 'package.json'));
  const majorVersion = parseInt(packageJson.version[0], 10);
  if (majorVersion <= 1) {
    fs.copyFileSync(patched, path.join(pnaDirectoryPath, 'index.js'));
  }
}); |
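A hypothetical way to verify which copies the script actually replaced (assuming it was run from the project root) is to list every installed copy and what it exports:

const glob = require('glob');
const path = require('path');

// v1-style copies replaced by the script should export a function directly,
// while untouched v2 copies export an object with a nextTick method.
glob.sync('./**/process-nextick-args').forEach(dir => {
  const mod = require(path.resolve(dir));
  console.log(dir, typeof mod, typeof mod.nextTick);
});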
Force-pushed from 969ec8d to 10db888
Updated to remove the pagination behavior from bigtable.getInstances() and instance.getClusters(). See the additions to the diff section from my first post. This should be ready to go. |
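To illustrate what removing pagination implies for callers, a hedged sketch follows; the exact callback arguments and promise shape here are assumptions, not something this PR confirms:

// Assumed callback shape; there is no nextQuery/pageToken to feed into a
// follow-up request anymore.
bigtable.getInstances(function(err, instances) {
  if (err) {
    throw err;
  }
  console.log('instance count:', instances.length);
});

instance.getClusters().then(function(data) {
  var clusters = data[0];
  console.log('cluster count:', clusters.length);
});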
I left a couple of comments, but those shouldn't block this PR.
Looks good to me!
 * //-
 * // If the callback is omitted, we'll return a Promise.
 * //-
 * cluster.create().then(function(data) {
src/index.js (Outdated)

baseUrl: adminBaseUrl,
path: 'google/bigtable/admin/v2/bigtable_table_admin.proto',
service: 'bigtable.admin.v2',
var options_ = extend(
},
function(err, resp) {
  if (err) {
    callback(err);
this.getMetadata(gaxOptions, function(err) {
  if (err) {
    if (err instanceof RowError) {
@@ -1036,8 +1136,8 @@ Table.prototype.mutate = function(entries, callback) {
var entryToIndex = new Map(entries.map((entry, index) => [entry, index]));
var mutationErrorsByEntryIndex = new Map();

function onBatchResponse(previousNumRequestsMade, err) {
  if (previousNumRequestsMade === numRequestsMade && err) {
@@ -275,21 +249,21 @@ Row.formatFamilies_ = function(families, options) {
 * var apiResponse = data[0];
 * });
 */
-Row.prototype.create = function(entry, callback) {
+Row.prototype.create = function(options, callback) {
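A hedged sketch of the new call shape implied by this diff; the property names below (entry, gaxOptions) are assumptions based on the PR's todo list, not confirmed signatures:

row.create({
  // The cell data that used to be passed directly as the first argument.
  entry: {
    follows: {
      gwashington: 1,
    },
  },
  // Per-call request options forwarded to the GAPIC layer.
  gaxOptions: {timeout: 60000},
}, function(err, row, apiResponse) {
  // handle result
});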
test/filter.js (Outdated)

@@ -42,8 +42,7 @@ describe('Bigtable/Filter', function() {
});

afterEach(function() {
  sinon.restore();
LGTM as well. It looks like |
@sduskis all of the breaking changes are noted in the opening post (#35 (comment)) -- let me know if any of those look like a problem. |
);
assert.deepStrictEqual(requestedOptions, test.request_options);

setImmediate(function() {
Fixes #23
Blockers
Breaking Changes
Todos
gaxOptions