
Fuzz test wrappers #76

Closed · 3 tasks
zepumph opened this issue Aug 6, 2019 · 10 comments

Comments


zepumph commented Aug 6, 2019

From https://github.com/phetsims/phet-io-wrapper-sonification/issues/84 and other discussions while creating tests for PhET-iO wrappers, it would be nice to have a list of wrappers and to fuzz the sim in each of them, both via local aqua tests and on CT.

I'm thinking that each sim's list of testable wrappers would be the list of wrapper suite wrappers (chipper/data/wrappers) as well as the supplemental list in package.json created in phetsims/chipper#785.
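
For reference, here is a minimal sketch (not the actual implementation) of how that combined list might be assembled; the package.json field name used below is an assumption, not a confirmed key:

const fs = require( 'fs' );

// Sketch: combine the wrapper-suite wrappers with a sim's supplemental wrappers.
function getTestableWrappers( repo ) {

  // Wrapper-suite wrappers shared by all PhET-iO sims, one per line in chipper/data/wrappers
  const suiteWrappers = fs.readFileSync( '../chipper/data/wrappers', 'utf8' )
    .split( '\n' )
    .map( line => line.trim() )
    .filter( line => line.length > 0 );

  // Sim-specific wrappers declared in package.json (the "phet.phet-io.wrappers" key is hypothetical)
  const packageJSON = JSON.parse( fs.readFileSync( `../${repo}/package.json`, 'utf8' ) );
  const phetio = ( packageJSON.phet && packageJSON.phet[ 'phet-io' ] ) || {};
  const supplementalWrappers = phetio.wrappers || [];

  return suiteWrappers.concat( supplementalWrappers );
}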

I'm thinking that in phetmarks it would be ideal to have the following tests:

  • For each sim, a "local test server"-style entry that fuzz tests every wrapper for that sim.
  • A composite fuzz test in aqua that fuzzes every sim in every wrapper.
  • Fuzzing of all wrappers running on CT.
zepumph changed the title from "Fuzz wrappers" to "Fuzz test wrappers" on Aug 6, 2019

samreid commented Aug 21, 2019

We discussed this today. As we are moving toward publishing and sharing more dev versions, we would like more confidence that things are working smoothly without having to rely on the QA team so heavily. We decided to increase the priority.


samreid commented Dec 28, 2019

It doesn't feel like aqua is properly modularized to easily support this. What if every repo had a single HTML entry point that ran all of the tests for that repo (or one entry point for the requirejs version and one for the built version)? Or each repo could define its own testable entry points in package.json. Then the logic in aqua would be:

  • Wait for stable SHAs.
  • Iterate over every repo.
  • Run its tests.

And we wouldn't need to take CT offline or restart it to add/remove/change tests. We could also hook into the same tests from phetmarks. In #69 @jonathanolson and I discussed getting CT to run locally; this could help in that effort.

On the other hand, wouldn't this end up duplicating a lot of boilerplate between repos? Maybe that can be factored out?
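
Roughly, that loop could look like the sketch below; waitForStableSHAs, getTestEntryPoints, and runTest are hypothetical helpers that just name the steps above, not existing aqua functions:

// Hypothetical sketch of the proposed CT loop.
async function continuousLoop( repos ) {
  while ( true ) {

    // 1. Wait until every repo's SHA has been stable for some window of time
    const snapshot = await waitForStableSHAs( repos );

    // 2. Iterate over every repo in the snapshot
    for ( const repo of snapshot.repos ) {

      // 3. Run the tests that the repo itself declares, e.g. in its package.json
      const entryPoints = getTestEntryPoints( repo ); // e.g. [ 'phet-io-wrappers-tests.html' ]
      for ( const entryPoint of entryPoints ) {
        await runTest( snapshot, repo, entryPoint );
      }
    }
  }
}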


samreid commented Dec 28, 2019

Also, this code should be running the wrapper unit tests, such as the state wrapper fuzz test:

// phet-io wrappers tests for each PhET-iO Sim
snapshot.phetioRepos.forEach( function( phetioRepo ) {
  snapshot.testQueue.push( {
    count: 0,
    snapshotName: snapshotName,
    test: [ phetioRepo, 'phet-io-tests', 'no-assert' ],

    // Use the QUnit harness since errors are reported to QUnit
    url: 'qunit-test.html?url=' + encodeURIComponent( '../../' + snapshotName + '/phet-io-wrappers/phet-io-wrappers-tests.html?sim=' + phetioRepo )
  } );
  snapshot.testQueue.push( {
    count: 0,
    snapshotName: snapshotName,
    test: [ phetioRepo, 'phet-io-tests', 'assert' ],

    // Use the QUnit harness since errors are reported to QUnit
    url: 'qunit-test.html?url=' + encodeURIComponent( '../../' + snapshotName + '/phet-io-wrappers/phet-io-wrappers-tests.html?sim=' + phetioRepo + '&phetioDebug' )
  } );
} );


samreid commented Jan 23, 2020

Note that in https://github.com/phetsims/phet-io-wrappers/issues/292 we dropped the ability to fuzz test studio as part of the QUnit tests. It would be good to recover that test.

zepumph added a commit that referenced this issue Feb 14, 2020

zepumph commented Feb 15, 2020

Today @samreid and I added wrapper fuzzing to continuous testing. We did this by adding postMessage callbacks within Client.js. This is already helping us find errors: https://github.com/phetsims/phet-io/issues/1620, https://github.com/phetsims/studio/issues/100, and https://github.com/phetsims/studio/issues/101 are now reporting in CT.
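
The reporting mechanism is roughly like the sketch below; the message shape and listener code are illustrative, not the exact format used by Client.js:

// Inside the wrapper (sketch): forward errors to the parent testing frame via postMessage.
window.addEventListener( 'error', event => {
  window.parent !== window && window.parent.postMessage( JSON.stringify( {
    type: 'error',
    message: event.message,
    stack: event.error && event.error.stack
  } ), '*' );
} );

// Inside the aqua harness (sketch): listen for those messages and fail the test on error.
window.addEventListener( 'message', event => {
  let data;
  try { data = JSON.parse( event.data ); } catch( e ) { return; } // ignore non-JSON messages
  if ( data.type === 'error' ) {
    testFailed( data.message ); // hypothetical harness function
  }
} );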

samreid added a commit that referenced this issue Feb 15, 2020

zepumph commented Feb 17, 2020

@samreid I think that https://github.com/phetsims/phet-io-wrappers/commit/53953f910a6a530c01e50b0bb447fb6a5183d2c6 will fix many CT errors that look like "phetioID not found" in studio. Please review.

UPDATE: To explain the issue a bit more: because of the async nature of wrappers, studio can't know whether a dynamic object still exists when it sends a message to link to one of that dynamic element's sub-Properties. As a result we created isFuzzSpecificError.js in studio, and that solution has worked well for us so far. The reason these errors are being triggered again is the way we set up error reporting for our wrapper fuzz tests on CT: isFuzzSpecificError was being ignored, and the sim error "phetioID not found" was being forwarded directly to the parent testing frame, unfiltered. In the above commit I made sure that only errors thrown in the wrapper itself get propagated to the parent testing frame.
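
In effect, the fix means the wrapper only forwards errors thrown in its own frame, instead of re-forwarding every error message that arrives from the sim frame. A sketch of that idea (handleSimError is hypothetical, and the exact isFuzzSpecificError signature is assumed):

// Sketch: errors thrown in the wrapper's own frame are reported to the testing parent...
window.addEventListener( 'error', event => {
  window.parent.postMessage( JSON.stringify( { type: 'error', message: event.message } ), '*' );
} );

// ...while error messages coming from the sim iframe are filtered first, so fuzz-specific
// sim errors such as "phetioID not found" are not propagated to the testing parent.
window.addEventListener( 'message', event => {
  let data;
  try { data = JSON.parse( event.data ); } catch( e ) { return; } // ignore non-JSON messages
  if ( data.type === 'error' && !isFuzzSpecificError( data.message ) ) {
    handleSimError( data.message ); // hypothetical: surface the error in the wrapper itself
  }
} );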


samreid commented Feb 18, 2020

The change seems appropriate, thanks!


samreid commented Mar 6, 2020

I think this issue is complete and can be closed. Problems identified in the wrapper fuzzing will be opened in new issues. @zepumph what say you?

samreid removed their assignment on Mar 6, 2020

zepumph commented Mar 16, 2020

Right now we are individually fuzzing the studio, state, and mirror-inputs wrappers. Are there any others that we would like to add? If not, please close.

zepumph assigned samreid and unassigned themselves on Mar 16, 2020

samreid commented Mar 16, 2020

That seems great, thanks! Closing.

samreid closed this as completed on Mar 16, 2020