
test : lint and few minor fixes #17549

Closed · wants to merge 1 commit into from
Conversation

@mithunsasidharan (Contributor) commented Dec 8, 2017

Lint and some other minor fixes. Files modified:

  • test/parallel/test-assert.js
  • test/parallel/test-whatwg-url-setters.js
Checklist
  • make -j4 test (UNIX), or vcbuild test (Windows) passes
  • tests and/or benchmarks are included
  • commit message follows commit guidelines
Affected core subsystem(s)

test

@nodejs-github-bot added the test (Issues and PRs related to the tests) label on Dec 8, 2017
@apapirovski (Member) left a comment


Hi @mithunsasidharan — unfortunately I don't think these changes can get merged. In the first file, those comments reference the GitHub issues that they're testing. The second file includes the following comment: "The following tests are copied from WPT. Modifications to them should be upstreamed first."

@mithunsasidharan (Contributor Author)

@apapirovski: As for the 1st file, I will remove it, but the 2nd file's changes were corrections as requested here. Can you confirm? If so, I'll close the PR!

@apapirovski (Member) commented Dec 8, 2017

@mithunsasidharan It was likely an oversight on the part of the person compiling that list. I would also mention that those changes are intended for Node.js Code and Learn events. They're intentionally simple things that can be used to introduce new contributors to the process of making their first contribution to Node.js.

You're of course more than welcome to work on those things, but in general I would recommend finding issues in the issue tracker or looking at https://coverage.nodejs.org/ to find gaps in our test coverage.

Also, please don't take this as me discouraging you from contributing. I truly appreciate all the PRs! I'm perhaps just trying to gently push you in a direction that you would likely find more engaging and more beneficial long-term. :)

Also, don't hesitate to let me know if you need any assistance with writing or expanding tests. I'm happy to assist in any way that I can.

@mithunsasidharan (Contributor Author)

"please don't take this as me discouraging you from contributing."

@apapirovski Absolutely not; in fact, it's due to your and the other reviewers' continued support that I've been able to contribute whatever little I have. I'm definitely going to pick up test coverage next. Kindly suggest any reference docs or instructions I can follow to get started!

Thanks again for all your amazing support 👍 😄

@mithunsasidharan (Contributor Author)

@apapirovski: I'll close this PR!

@apapirovski (Member)

Feel free to check out the following labels:

https://github.com/nodejs/node/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22
https://github.com/nodejs/node/issues?utf8=%E2%9C%93&q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22+

Also, re: code coverage, the site at https://coverage.nodejs.org/ has nightly-updated coverage for the master branch. If you click through to the JS coverage (today's: https://coverage.nodejs.org/coverage-06e1b0386196f8f8/index.html), you'll note that it has links for all the folders and files within lib. Feel free to browse through it and look for anything that's red or yellow, which means that the code or branch is not covered by any tests.

For example, the emitExperimentalWarning function in lib/internal/util.js is currently not covered by any tests. See lines 132-138 at https://coverage.nodejs.org/coverage-06e1b0386196f8f8/root/internal/util.js.html

If you want insight into how to test an internal function, check out test/parallel/test-util-promisify.js which tests the promisify function within the same file.
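For context, the helper in question is tiny. A rough, from-memory sketch of its shape (a paraphrase, not an exact copy of lib/internal/util.js) looks like the following, which is why one short test file can cover it:

// Rough paraphrase of the uncovered helper: warn once per feature name,
// stay silent on repeated calls for the same feature.
const experimentalWarnings = new Set();

function emitExperimentalWarning(feature) {
  if (experimentalWarnings.has(feature))
    return;                              // already warned for this feature
  experimentalWarnings.add(feature);
  const msg = `${feature} is an experimental feature. ` +
              'This feature could change at any time';
  process.emitWarning(msg, 'ExperimentalWarning');
}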

@mithunsasidharan deleted the arrow_func branch on December 8, 2017 at 18:02
@mithunsasidharan (Contributor Author)

@apapirovski: The above certainly helps. I will get started with it next.

As for the PR merge reminder ping, my apologies again. I live in India, so most of the PRs get reviewed late at night my time, and hence it's sometimes difficult to follow up instantly with a rebase or a reply to a review comment. Sorry again for that.

Thanks much!

@apapirovski (Member) commented Dec 8, 2017

Btw here's the PR that introduced that function: #16497

It's likely that a couple of PRs could be squeezed out of it:

  1. We need to create some tests for it (as per above)
  2. We could make http2 & ESM use this new function instead of calling process.emitWarning directly (a sketch follows below).

I would start with 1 as we definitely need tests. Re: 2, that will also depend on other Collaborators so I can't guarantee that there will be agreement on using it.
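To make item 2 concrete, here is a hedged sketch of the kind of change it implies. The call site shown is hypothetical; the actual files and warning text in the http2 and ESM code may differ:

// Hypothetical call site inside an experimental core subsystem (illustrative only).
const { emitExperimentalWarning } = require('internal/util');

// Before: each subsystem formats and emits its own warning, e.g.
//   process.emitWarning('http2 is an experimental feature.', 'ExperimentalWarning');

// After: the shared helper builds the message and deduplicates repeat warnings.
emitExperimentalWarning('http2');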

@mithunsasidharan (Contributor Author) commented Dec 8, 2017

@apapirovski: Got it. I'll get started with 1 for now! As for 2, how do we generally discuss and reach a consensus on such ideas? Is it by opening an issue? For now, I'll focus on 1. Thanks.

@apapirovski (Member)

In this case, the best option would be to just do some work on 2 and then open a PR. I don't think it's going to be too contentious. We might need a few rounds of changes to potentially improve emitExperimentalWarning but I think that change would be pretty likely to ultimately land.

For controversial PRs, the best course of action is to open an issue describing the problem and outlining the proposed solution.

@mithunsasidharan (Contributor Author)

@apapirovski: That helps. Let me get started with 1 and complete it before I start with 2. Thanks!

@mithunsasidharan (Contributor Author) commented Dec 12, 2017

@apapirovski: I had a question. I see that promisify is exported from lib/util.js (https://github.com/nodejs/node/blob/master/lib/util.js#L61), and that export is what the tests use (https://github.com/nodejs/node/blob/master/test/parallel/test-util-promisify.js#L7). But unlike it, I don't see emitExperimentalWarning being referenced in lib/util.js! So shall I add that export while writing the test, and then require emitExperimentalWarning from lib/util.js instead of internal/util? Kindly help. Thanks.

@apapirovski (Member)

Note the // Flags: --expose-internals comment at the top of that test file, as well as const { customPromisifyArgs } = require('internal/util');. You would need to do something similar when writing a test for emitExperimentalWarning.

@mithunsasidharan (Contributor Author)

@apapirovski: Thanks. Since I'm writing unit tests in Node for the first time, there is some confusion on my side. So, to begin with, I've come up with a scenario that works fine, as below:

'use strict';
// Flags: --expose-internals
const common = require('../common');
const assert = require('assert');
const { experimentalWarnings } = require('internal/util');
const { emitExperimentalWarning } = require('internal/util');

assert.ok(process.stdout.writable);
assert.ok(process.stderr.writable);


common.crashOnUnhandledRejection();

assert.doesNotThrow(function() {
  process.once('warning', common.mustCall((warning) => {
    assert(/is an experimental feature/.test(warning.message));
  }));
  emitExperimentalWarning('feature2');
});

But I'm not quite sure how to set a value in experimentalWarnings (the standard Node way) and call emitExperimentalWarning with different args against it!

@apapirovski (Member) commented Dec 12, 2017

common.crashOnUnhandledRejection(); shouldn't be necessary.

The most sensible way to do a full test for emitExperimentalWarning would be to:

  • add a feature to emitExperimentalWarning and confirm it logs using process.on('warning') (not once) with common.mustCall(fn, 2) (confirms that the fn runs twice)
  • add a feature to emitExperimentalWarning again, it shouldn't log anything this time
  • add a feature2, it should log (this will satisfy the common.mustCall(fn, 2) above)

@mithunsasidharan (Contributor Author) commented Dec 12, 2017

@apapirovski: Thanks. That makes sense, and I hope this is what it translates to, as below:

'use strict';
// Flags: --expose-internals
const common = require('../common');
const assert = require('assert');
const { emitExperimentalWarning } = require('internal/util');

assert.ok(process.stdout.writable);
assert.ok(process.stderr.writable);
// Support legacy API
assert.strictEqual(typeof process.stdout.fd, 'number');
assert.strictEqual(typeof process.stderr.fd, 'number');


assert.doesNotThrow(function() {
  process.once('warning', common.mustCall((warning) => {
    assert(/is an experimental feature/.test(warning.message));
  }), 2);
  emitExperimentalWarning('feature1');
  emitExperimentalWarning('feature1');
  emitExperimentalWarning('feature2');
});

I've added the tests in a new file, test-util-emit-experimental-warning.js, as per the naming standard. If you feel it looks good overall, I can go ahead with the PR. Kindly share your feedback. Thanks!

@apapirovski (Member)

I think the 2 is in the wrong spot. The following seems unnecessary:

assert.ok(process.stdout.writable);
assert.ok(process.stderr.writable);
// Support legacy API
assert.strictEqual(typeof process.stdout.fd, 'number');
assert.strictEqual(typeof process.stderr.fd, 'number');

Also, there's no need for doesNotThrow around the test, and const { experimentalWarnings } = require('internal/util'); can be removed.

@mithunsasidharan (Contributor Author)

@apapirovski:

'use strict';
// Flags: --expose-internals
const common = require('../common');
const assert = require('assert');
const { emitExperimentalWarning } = require('internal/util');

assert.doesNotThrow(function() {
  process.once('warning', common.mustCall((warning) => {
    assert(/is an experimental feature/.test(warning.message));
  }));
  emitExperimentalWarning('feature1');
  emitExperimentalWarning('feature1');
  emitExperimentalWarning('feature2');
},2);

@apapirovski (Member) commented Dec 12, 2017

assert.doesNotThrow isn't necessary. You need the 2; it was just in the wrong spot before. The rest seems good. Maybe include a comment re: what's being tested.

@mithunsasidharan (Contributor Author)

@apapirovski thanks... I'm not quite sure process.once fits there, since my test fails. Can you help correct me with that?

'use strict';
// Flags: --expose-internals
const common = require('../common');
const assert = require('assert');
const { emitExperimentalWarning } = require('internal/util');

process.once('warning', common.mustCall((warning) => {
  assert(/is an experimental feature/.test(warning.message));
}, 2));
emitExperimentalWarning('feature1');
emitExperimentalWarning('feature1');
emitExperimentalWarning('feature2');

The above seems to be failing. I'm not quite able to get this in place!

@mithunsasidharan (Contributor Author) commented Dec 12, 2017

@apapirovski process.on, since it's mustCall(fn, 2) 😄 (process.once removes the listener after the first 'warning' event, so the callback could never be called twice).

'use strict';
// Flags: --expose-internals
const common = require('../common');
const assert = require('assert');
const { emitExperimentalWarning } = require('internal/util');

process.on('warning', common.mustCall((warning) => {
  assert(/is an experimental feature/.test(warning.message));
}, 2));
emitExperimentalWarning('feature1');
emitExperimentalWarning('feature1');
emitExperimentalWarning('feature2');
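One usage note: the // Flags: --expose-internals comment is honored by the project's test runner; when running the file directly with node, the flag has to be passed by hand, along the lines of:

node --expose-internals test/parallel/test-util-emit-experimental-warning.js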
