feat[integration-tests]: add verifier sync test #913
Conversation
🦋 Changeset detected. Latest commit: 473f525. The changes in this PR will be included in the next version bump. This PR includes changesets to release 1 package.
Codecov Report
@@ Coverage Diff @@
## develop #913 +/- ##
========================================
Coverage 82.21% 82.21%
========================================
Files 48 48
Lines 1895 1895
Branches 303 303
========================================
Hits 1558 1558
Misses 337 337
Continue to review full report at Codecov.
@annieke instead of creating a separate file, I recommend putting it in the same file and setting the replicas: 0 setting, similar to how we do with the integration_tests image: https://github.com/ethereum-optimism/optimism/blob/develop/ops/docker-compose.yml#L134-L135. Then, you could bring it up with docker-compose up --scale verifier=1.
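For reference, the suggested layout could look roughly like this; the service name, image, and exact keys here are assumptions for illustration, not the repository's actual configuration:

```yaml
# Hypothetical sketch of the suggested docker-compose.yml entry; the service
# name and image are assumptions, not the actual file contents.
verifier:
  image: ethereumoptimism/go-ethereum
  deploy:
    replicas: 0   # not started by default; opt in with --scale
```

Then docker-compose up --scale verifier=1 brings up a single verifier on demand.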
Force-pushed from e4a6a0e to 59b368f.
More context for the mismatched stateRoot: I'm manually comparing the two blocks and they contain the same transaction, which makes me think this is a legitimate mismatch. Thoughts @karlfloersch?

Sequencer block with transaction:
{
"jsonrpc":"2.0",
"id":1,
"result":{
"difficulty":"0x2",
"extraData":"0xd98301090a846765746889676f312e31342e3135856c696e757800000000000057bb4fd04e3304ec3d3e3001d25a1cef856d3077e27f62685a29a1f8729033b42220f9ef93a16984c1d50b6f0fff37d1db0205afe687085254730142239617a800",
"gasLimit":"0x895440",
"gasUsed":"0x8e1e1",
"hash":"0xdea8a1bb9f6d3798b629981e07856e0b25ce7fb5e20b7e9a75f3361e21deac43",
"logsBloom":"0x00000000000000000000000000000000000800000000000000040000000000000000000000000000000000100000000000000000000000000000000000000000000000000000000000000008000004000000000010000000000000000000000400000040000000000000000100008000000000800000000000000010000000000000000000000000000000000000000000000000002000000000000200000000000000000000000000000000000000000000000000000000000000000000000000000042000000200000000000000000000000002000000000000000000000000000000000000000000000000000000000000000000000000000000000000000",
"miner":"0x0000000000000000000000000000000000000000",
"mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000",
"nonce":"0x0000000000000000",
"number":"0x3",
"parentHash":"0xd208c21184a9f7c1d38f76bf375831f0b2433d85c6bc451d70fe5f567cd86204",
"receiptsRoot":"0x6ac57aa4c7e87e5761883bc21497d77e21a838ab12760c7b5257b6b171eb9f63",
"sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347",
"size":"0x30e",
"stateRoot":"0xb27fa216bda5d3a04fc0e0297409aeef40b52811523ccd78261a1fed740bb08d",
"timestamp":"0x60ac4d24",
"totalDifficulty":"0x7",
"transactions":[
{
"blockHash":"0xdea8a1bb9f6d3798b629981e07856e0b25ce7fb5e20b7e9a75f3361e21deac43",
"blockNumber":"0x3",
"from":"0xf39fd6e51aad88f6f4ce6ab8827279cfffb92266",
"gas":"0x5208",
"gasPrice":"0x3b9aca00",
"hash":"0x156dee9dc95d323d0b22c0f175fdc9b4a059a371c058a4ffcdbe2d359501a2b4",
"input":"0xa9059cbb000000000000000000000000050acfa1e62f6fb0e6de77246a5b41aa41d4a1e70000000000000000000000000000000000000000000000000000000000000064",
"nonce":"0x1",
"to":"0x5fbdb2315678afecb367f032d93f642f64180aa3",
"transactionIndex":"0x0",
"value":"0x0",
"v":"0x36c",
"r":"0x871a64832d904e716b84c9a530651a908ad69d0534ed51bc3a882b95fb91c223",
"s":"0x3923ddee29d88351394951d2a342ff50de48b4dd9e67ce79689c20a26aff8478",
"queueOrigin":"sequencer",
"txType":"EIP155",
"l1TxOrigin":null,
"l1BlockNumber":"0x2b",
"l1Timestamp":"0x60ac4d24",
"index":"0x2",
"queueIndex":null,
"rawTransaction":"0xf8aa01843b9aca00825208945fbdb2315678afecb367f032d93f642f64180aa380b844a9059cbb000000000000000000000000050acfa1e62f6fb0e6de77246a5b41aa41d4a1e7000000000000000000000000000000000000000000000000000000000000006482036ca0871a64832d904e716b84c9a530651a908ad69d0534ed51bc3a882b95fb91c223a03923ddee29d88351394951d2a342ff50de48b4dd9e67ce79689c20a26aff8478"
}
],
"transactionsRoot":"0xb466f6e0d651f976c7e91a0cd71812f81e1445755237002c3b1d3f0f0326505e",
"uncles":[
]
}
}

Verifier block with transaction:
{
"jsonrpc":"2.0",
"id":1,
"result":{
"difficulty":"0x2",
"extraData":"0xd98301090a846765746889676f312e31342e3135856c696e7578000000000000d48b533732a4b3a3e4f1d01e6fd3f9bffd2fe26a1e2fa6222faa5abf7db5460f400cebb359c6137142191ad893ca85d810d7659997dcc728eb19ad62fa8099f401",
"gasLimit":"0x895440",
"gasUsed":"0x8e1e1",
"hash":"0xbbc30e5a401df7dd8a19a4e86d262ea8714bffebd255b83bd4a3386d99a1c658",
"logsBloom":"0x00000000000000000000000000000000000800000000000000040000000000000000000000000000000000100000000000000000000000000000000000000000000000000000000000000008000004000000000010000000000000000000000400000040000000000000000100008000000000800000000000000010000000000000000000000000000000000000000000000000002000000000000200000000000000000000000000000000000000000000000000000000000000000000000000000042000000200000000000000000000000002000000000000000000000000000000000000000000000000000000000000000000000000000000000000000",
"miner":"0x0000000000000000000000000000000000000000",
"mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000",
"nonce":"0x0000000000000000",
"number":"0x3",
"parentHash":"0x65b172d3cde34eb0abb1462d4ef35b7db2771cc76c10c97d3c84fc593fd12b32",
"receiptsRoot":"0x6ac57aa4c7e87e5761883bc21497d77e21a838ab12760c7b5257b6b171eb9f63",
"sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347",
"size":"0x30e",
"stateRoot":"0x707d07837867afb72216bbc9cb4da9c34cf00f7d6be3443c456253b3304fe8d1",
"timestamp":"0x60ac4d24",
"totalDifficulty":"0x7",
"transactions":[
{
"blockHash":"0xbbc30e5a401df7dd8a19a4e86d262ea8714bffebd255b83bd4a3386d99a1c658",
"blockNumber":"0x3",
"from":"0x517ab03785a2cc5379429d537c203f69b97b7630",
"gas":"0x5208",
"gasPrice":"0x3b9aca00",
"hash":"0x6098ac748b225c61bc98e85cbe75429c7d73b81dd656e5831eb19958bea78481",
"input":"0xa9059cbb000000000000000000000000050acfa1e62f6fb0e6de77246a5b41aa41d4a1e70000000000000000000000000000000000000000000000000000000000000064",
"nonce":"0x1",
"to":"0x5fbdb2315678afecb367f032d93f642f64180aa3",
"transactionIndex":"0x0",
"value":"0x0",
"v":"0x3d7",
"r":"0x871a64832d904e716b84c9a530651a908ad69d0534ed51bc3a882b95fb91c223",
"s":"0x3923ddee29d88351394951d2a342ff50de48b4dd9e67ce79689c20a26aff8478",
"queueOrigin":"sequencer",
"txType":"EIP155",
"l1TxOrigin":null,
"l1BlockNumber":"0x2b",
"l1Timestamp":"0x60ac4d24",
"index":"0x2",
"queueIndex":null,
"rawTransaction":"0xf8aa01843b9aca00825208945fbdb2315678afecb367f032d93f642f64180aa380b844a9059cbb000000000000000000000000050acfa1e62f6fb0e6de77246a5b41aa41d4a1e7000000000000000000000000000000000000000000000000000000000000006482036ca0871a64832d904e716b84c9a530651a908ad69d0534ed51bc3a882b95fb91c223a03923ddee29d88351394951d2a342ff50de48b4dd9e67ce79689c20a26aff8478"
}
],
"transactionsRoot":"0x48e9415c1e084c98f091a9d37e79a83c41206b5d734f02545ae9a88887933807",
"uncles":[
]
}
}
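When two blocks disagree like this, it can help to diff them field by field rather than eyeballing the JSON. A minimal sketch in plain TypeScript (the helper name is made up; this is a debugging aid, not part of this PR):

```typescript
// Return the top-level keys whose values differ between two block objects.
// Hypothetical debugging helper, not part of the codebase.
function diffBlocks(
  a: Record<string, unknown>,
  b: Record<string, unknown>
): string[] {
  const keys = new Set([...Object.keys(a), ...Object.keys(b)])
  return [...keys].filter(
    (k) => JSON.stringify(a[k]) !== JSON.stringify(b[k])
  )
}
```

Run against the two blocks above, this should single out extraData, hash, parentHash, stateRoot, transactions, and transactionsRoot, while gasUsed and receiptsRoot match.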
Force-pushed from 59b368f to ef72978.
await verifier.rm()
})

it('should sync dummy transaction', async () => {
this test currently passes!
This test halts for me at "waiting for new verifier blocks, current: 1 latest 2" when run with it.only.
@gakonst did you quit a previous test, or have a verifier running locally before you started? I just remembered I ran into that before I removed the verifier image after every test; instead of restarting to sync, the verifier just sits at the previous block.
Force-pushed from 258e78b to 63fa4af.
A good way to debug is to make sure that the genesis state roots match like so:

Another way to debug is to look at the gasUsed: sometimes the gas used will greatly differ, and this can give a hint as to what went wrong, i.e. how far each execution got relative to the other. Make sure that the
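As a concrete example of the gasUsed check, the hex fields can be compared numerically; a tiny sketch (hypothetical helper, not from the test suite):

```typescript
// Compare the hex gasUsed of two blocks; a large delta suggests the two
// executions diverged partway through. Hypothetical helper.
function gasUsedDelta(
  sequencerBlock: { gasUsed: string },
  verifierBlock: { gasUsed: string }
): number {
  return Math.abs(
    parseInt(sequencerBlock.gasUsed, 16) - parseInt(verifierBlock.gasUsed, 16)
  )
}
```

In the blocks posted earlier in this thread both sides report gasUsed of 0x8e1e1, so the delta is 0 and the divergence must be elsewhere (the stateRoot).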
)

// Wait until verifier has caught up to the sequencer
let latestVerifierBlock = (await verifierProvider.getBlock('latest')) as any
Is there a type in core-utils that represents the L2 block? At one point I think there was one in the batch submitter.
There is not; we could add it in a separate PR, as it probably warrants a bigger refactor.
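For what it's worth, the block JSON earlier in this thread suggests a shape along these lines. This is only a sketch derived from those responses, not an existing core-utils export:

```typescript
// Sketch of the L2 block/transaction shape, derived from the JSON-RPC
// responses pasted in this PR; not an actual core-utils type.
interface L2Transaction {
  hash: string
  from: string
  to: string | null
  queueOrigin: string // e.g. 'sequencer'
  txType: string // e.g. 'EIP155'
  l1TxOrigin: string | null
  l1BlockNumber: string
  l1Timestamp: string
  index: string
  queueIndex: string | null
  rawTransaction: string
}

interface L2Block {
  number: string
  hash: string
  parentHash: string
  stateRoot: string
  transactionsRoot: string
  gasUsed: string
  transactions: L2Transaction[]
}
```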
/* Helper functions */

const waitForBatchSubmission = async (totalElementsBefore: BigNumber) => {
In an ideal world, we would not spin up the batch submitter, and would instead submit the batches manually using JS. This would cut out the extra dependency of needing a whole other codepath and the complexity that comes with it, like this function.
Given that these are integration tests checking the relation between the sequencer's sync service, the DTL, the verifier, and the batch submitter, I think it's better as is, actually.
I'm fine with leaving as is, but I think the more controlled the test, the better. This isn't a batch-submitter test, this is a verifier test. It should test the bare minimal stuff to make sure the verifier works.
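For context, helpers like waitForBatchSubmission are essentially poll loops, and the pattern can be factored into a generic utility. A self-contained sketch (names are placeholders, not this PR's actual code):

```typescript
// Generic poll-until helper: re-evaluate `check` until it returns true or
// the timeout elapses. Placeholder sketch, not this PR's implementation.
const sleep = (ms: number) =>
  new Promise<void>((resolve) => setTimeout(resolve, ms))

async function waitFor(
  check: () => Promise<boolean> | boolean,
  timeoutMs = 30_000,
  intervalMs = 500
): Promise<void> {
  const deadline = Date.now() + timeoutMs
  while (!(await check())) {
    if (Date.now() > deadline) {
      throw new Error('timed out waiting for condition')
    }
    await sleep(intervalMs)
  }
}
```

A call site could then read, for example, await waitFor(async () => (await ctc.getTotalElements()).gt(totalElementsBefore)), where ctc is a hypothetical CanonicalTransactionChain contract instance.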
.github/workflows/sync-tests.yml (Outdated)
working-directory: ./integration-tests
run: |
  yarn
  yarn test:sync
Is there any reason that this is a different workflow than the other integration tests? Ideally the images are built once and then used twice in parallel
^hm i can look into this! just know we need a fresh sequencer for this set of tests
I had started working on that here; agreed with Mark, but I think it's better to do this in another PR, in the interest of saving Annie's time.
Tl;dr: you'd have one job which builds the services, saves the docker images to a tar, and then uploads the artifacts. Then, in the tests job, you'd download the artifacts, load the docker images, and run the tests.
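Sketched as a workflow, the tl;dr above could look roughly like this; job names, the image name, and action versions are illustrative assumptions, not a tested configuration:

```yaml
# Hypothetical GitHub Actions layout: build images once, share via artifacts.
jobs:
  build-images:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: docker-compose build
        working-directory: ./ops
      # Image name below is an assumption for illustration
      - run: docker save -o images.tar ethereumoptimism/builder
      - uses: actions/upload-artifact@v2
        with:
          name: docker-images
          path: images.tar

  sync-tests:
    needs: build-images
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/download-artifact@v2
        with:
          name: docker-images
      - run: docker load -i images.tar
      - run: |
          yarn
          yarn test:sync
        working-directory: ./integration-tests
```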
'latest'
)) as any

const matchingVerifierBlock = (await startAndSyncVerifier(
I think it's best to split up starting the verifier and syncing the verifier; that will make this code more flexible longer term.
same reasoning as above^
logs = await verifier.logs()
}

const provider = injectL2Context(verifierProvider)
The verifier provider can be decoupled and instantiated at the very top of the file just like the sequencerProvider
Same rationale behind the "new synced verifier" per test as described in the issue! I realize this does slow tests down, but before we have a bunch of those, and if they can be parallelized, I think integration tests will still be the bottleneck here. I'd rather have this running and gradually iterate instead of getting it final in the first round.
What I am suggesting is along the lines of:
async function startVerifier() {
  verifier = new DockerComposeNetwork(['verifier'])
  await verifier.up({ commandOptions: ['--scale', 'verifier=1'] })

  // Wait for verifier to be looping
  let logs = await verifier.logs()
  while (!logs.out.includes('Starting Sequencer Loop')) {
    await sleep(500)
    logs = await verifier.logs()
  }
}

async function syncVerifier(provider) {
  // Wait until verifier has caught up to the sequencer
  let latestVerifierBlock = (await provider.getBlock('latest')) as any
  while (latestVerifierBlock.number < sequencerBlockNumber) {
    await sleep(500)
    latestVerifierBlock = (await provider.getBlock('latest')) as any
  }
  return provider.getBlock(sequencerBlockNumber)
}
You can still spin up the verifier after making transactions with this abstraction. This will make the code easier to maintain longer term because the abstractions are cleaner. You no longer need to sync the verifier when bringing it up. This abstraction split will eventually need to happen as more tests are added to this test suite
Looks good to me - I cannot seem to find evidence of the tests running in CI, just making sure that the tests are running as expected
Force-pushed from b6d0cbb to 473f525.
@tynes i'm planning to add this test to CI in a follow-up PR per this suggestion #913 (review); will continue the work here #891, if that sounds good!
Description
Still rough, but marking this as ready for review since there is one passing test. Would love feedback on the test structure, and on whether we should merge the dummy test while debugging the ERC20 test.
The structure of the new test:
This means each test takes quite long: the dummy tx test took 14.5s and the ERC20 one took 24.5s. Feedback on how to speed this up is welcome!
Additional context
For the failing ERC20 test, the test still fails even with only the contract deployment and without the transfer.