docs(consensus): update decentralization guide #3343

Merged Dec 5, 2024 (3 commits)
9 changes: 4 additions & 5 deletions core/bin/snapshots_creator/README.md
@@ -43,9 +43,8 @@ repository root. The storage location can be configured using the object store c
filesystem, or Google Cloud Storage (GCS). Beware that for end-to-end testing of snapshot recovery, changes applied to
the main node configuration must be reflected in the external node configuration.

Creating a snapshot is a part of the [snapshot recovery integration test]. You can run the test using `yarn recovery-test snapshot-recovery-test`.
It requires the main node to be launched with a command like `zk server --components api,tree,eth,state_keeper,commitment_generator`.

## Snapshots format

@@ -59,8 +58,8 @@ Each snapshot consists of three types of data (see [`snapshots.rs`] for exact de
enumeration index; both are used to restore the contents of the `initial_writes` table. Chunking storage logs is
motivated by their parallel generation; each chunk corresponds to a distinct non-overlapping range of hashed storage
keys. (This should be considered an implementation detail for the purposes of snapshot recovery; recovery must not
rely on any particular key distribution among chunks.) Stored as gzipped Protobuf messages in an [object store]; each chunk
is a separate object.
- **Factory dependencies:** All bytecodes deployed on L2 at the time the snapshot is made. Stored as a single gzipped
Protobuf message in an object store.
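The chunked layout above can be sketched in a few lines. This is a toy illustration only: the chunk count and the range arithmetic here are hypothetical, not the actual snapshots creator logic (which, as noted, recovery must not depend on).

```typescript
// Toy sketch: assign each hashed storage key to one of several
// contiguous, non-overlapping chunks of the 256-bit key space.
// Hypothetical values — not the real snapshots_creator scheme.
const CHUNK_COUNT = 4n; // illustrative chunk count
const KEY_SPACE = 1n << 256n; // hashed keys are 32-byte values
const CHUNK_SIZE = KEY_SPACE / CHUNK_COUNT;

function chunkIndex(hashedKey: bigint): bigint {
  // Integer division maps each key to exactly one chunk;
  // adjacent chunks cover adjacent, non-overlapping ranges.
  return hashedKey / CHUNK_SIZE;
}

// Every key lands in exactly one chunk.
const keys = [0n, KEY_SPACE / 4n, KEY_SPACE / 2n, KEY_SPACE - 1n];
const indices = keys.map(chunkIndex); // → [0n, 1n, 2n, 3n]
```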

6 changes: 3 additions & 3 deletions core/lib/merkle_tree/README.md
@@ -1,8 +1,8 @@
# Merkle Tree

Binary Merkle tree implementation based on the amortized radix-16 Merkle tree (AR16MT) described in the [Jellyfish
Merkle tree] white paper. Unlike the Jellyfish Merkle tree, our construction uses a vanilla binary tree hashing algorithm to
make circuit creation easier. The depth of the tree is 256, and Blake2 is used as the hashing function.
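As a toy illustration of the vanilla binary hashing pattern (not the actual AR16MT code): pairs of nodes are hashed upward until a single root remains. This sketch assumes Node's OpenSSL-backed `blake2b512` digest and a power-of-two leaf count; the real tree is sparse with depth 256 and may use a different Blake2 variant.

```typescript
import { createHash } from "crypto";

// Illustrative Blake2 hash (64-byte blake2b512 via Node's crypto module).
function blake2(data: Buffer): Buffer {
  return createHash("blake2b512").update(data).digest();
}

// Vanilla binary Merkle root: hash leaves, then hash left/right pairs
// level by level until one node is left. Assumes leaves.length is a
// power of two; a real implementation handles padding and sparseness.
function merkleRoot(leaves: Buffer[]): Buffer {
  let level = leaves.map(blake2);
  while (level.length > 1) {
    const next: Buffer[] = [];
    for (let i = 0; i < level.length; i += 2) {
      // Each parent commits to the concatenation of its two children.
      next.push(blake2(Buffer.concat([level[i], level[i + 1]])));
    }
    level = next;
  }
  return level[0];
}

const root = merkleRoot(["a", "b", "c", "d"].map((s) => Buffer.from(s)));
```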

## Snapshot tests

5 changes: 4 additions & 1 deletion core/tests/recovery-test/src/index.ts
@@ -193,7 +193,10 @@ export class NodeProcess {
return new NodeProcess(childProcess, logs);
}

private constructor(
    private childProcess: ChildProcess,
    readonly logs: FileHandle
) {}

exitCode() {
return this.childProcess.exitCode;
5 changes: 4 additions & 1 deletion core/tests/ts-integration/src/l1-provider.ts
@@ -28,7 +28,10 @@ class L1TransactionResponse extends ethers.TransactionResponse implements Augmen
private isWaitingReported: boolean = false;
private isReceiptReported: boolean = false;

constructor(
    base: ethers.TransactionResponse,
    public readonly reporter: Reporter
) {
super(base, base.provider);
}

5 changes: 4 additions & 1 deletion core/tests/ts-integration/src/retry-provider.ts
@@ -81,7 +81,10 @@ class L2TransactionResponse extends zksync.types.TransactionResponse implements
private isWaitingReported: boolean = false;
private isReceiptReported: boolean = false;

constructor(
    base: zksync.types.TransactionResponse,
    public readonly reporter: Reporter
) {
super(base, base.provider);
}

6 changes: 5 additions & 1 deletion core/tests/ts-integration/src/utils.ts
@@ -57,7 +57,11 @@ export enum NodeType {
}

export class Node<TYPE extends NodeType> {
constructor(
    public proc: ChildProcessWithoutNullStreams,
    public l2NodeUrl: string,
    private readonly type: TYPE
) {}

public async terminate() {
try {
5 changes: 4 additions & 1 deletion core/tests/ts-integration/tests/api/web3.test.ts
@@ -1232,7 +1232,10 @@ export class MockMetamask {
readonly isMetaMask: boolean = true;
readonly chainId: string;

constructor(
    readonly wallet: zksync.Wallet,
    readonly networkVersion: bigint
) {
this.chainId = ethers.toBeHex(networkVersion);
}

8 changes: 5 additions & 3 deletions core/tests/ts-integration/tests/system.test.ts
@@ -371,9 +371,11 @@ describe('System behavior checks', () => {
function bootloaderUtilsContract() {
const BOOTLOADER_UTILS_ADDRESS = '0x000000000000000000000000000000000000800c';
const BOOTLOADER_UTILS = new ethers.Interface(
require(
    `${
        testMaster.environment().pathToHome
    }/contracts/system-contracts/zkout/BootloaderUtilities.sol/BootloaderUtilities.json`
).abi
);

return new ethers.Contract(BOOTLOADER_UTILS_ADDRESS, BOOTLOADER_UTILS, alice);
28 changes: 18 additions & 10 deletions core/tests/upgrade-test/tests/utils.ts
@@ -93,11 +93,13 @@ export function initContracts(pathToHome: string, zkStack: boolean): Contracts {
complexUpgraderAbi: new ethers.Interface(
require(`${CONTRACTS_FOLDER}/system-contracts/zkout/ComplexUpgrader.sol/ComplexUpgrader.json`).abi
),
counterBytecode: require(
    `${pathToHome}/core/tests/ts-integration/artifacts-zk/contracts/counter/counter.sol/Counter.json`
).deployedBytecode,
stateTransitonManager: new ethers.Interface(
require(
    `${CONTRACTS_FOLDER}/l1-contracts/out/StateTransitionManager.sol/StateTransitionManager.json`
).abi
)
};
} else {
@@ -116,16 +118,22 @@ export function initContracts(pathToHome: string, zkStack: boolean): Contracts {
require(`${L1_CONTRACTS_FOLDER}/governance/ChainAdmin.sol/ChainAdmin.json`).abi
),
l2ForceDeployUpgraderAbi: new ethers.Interface(
require(
    `${pathToHome}/contracts/l2-contracts/artifacts-zk/contracts/ForceDeployUpgrader.sol/ForceDeployUpgrader.json`
).abi
),
complexUpgraderAbi: new ethers.Interface(
require(
    `${pathToHome}/contracts/system-contracts/artifacts-zk/contracts-preprocessed/ComplexUpgrader.sol/ComplexUpgrader.json`
).abi
),
counterBytecode: require(
    `${pathToHome}/core/tests/ts-integration/artifacts-zk/contracts/counter/counter.sol/Counter.json`
).deployedBytecode,
stateTransitonManager: new ethers.Interface(
require(
    `${L1_CONTRACTS_FOLDER}/state-transition/StateTransitionManager.sol/StateTransitionManager.json`
).abi
)
};
}
16 changes: 8 additions & 8 deletions docs/src/guides/advanced/13_zk_intuition.md
@@ -85,8 +85,8 @@ located in a module [zksync core witness]. However, for the new proof system, th
new location called [separate witness binary].

Inside this new location, after the necessary data is fetched from storage, the witness generator calls another piece of
code from [zkevm_test_harness witness] named `run_with_fixed_params`. This code is responsible for creating the witnesses
themselves (which can get really HUGE).

## Generating the Proof

@@ -96,9 +96,9 @@ The main goal of this step is to take an operation (for example, a calculation c
into smaller pieces. Then, we represent this information as a special mathematical expression called a polynomial.

To construct these polynomials, we use something called a `ConstraintSystem`. The specific type that we use is called
zkSNARK, and our custom version of it is named bellman. You can find our code for this in the [bellman repo]. Additionally,
we have an optimized version that's designed to run faster on certain types of hardware (using CUDA technology), which you
can find in the [bellman cuda repo].

An [example ecrecover circuit] might give you a clearer picture of what this looks like in practice.
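To make "breaking an operation into constraints" concrete, here is a toy sketch of what a constraint system checks. It is unrelated to bellman's actual API: the tiny field modulus, the `Constraint` shape, and `checkConstraints` are all invented for illustration.

```typescript
// Toy constraint system: each constraint asserts that the product of two
// witness values equals a third, over a small prime field. Real SNARK
// constraint systems (like bellman's) use a large field and richer gates.
const P = 97n; // tiny illustrative prime, not a real SNARK field modulus

type Constraint = { a: number; b: number; c: number }; // witness indices

function checkConstraints(witness: bigint[], constraints: Constraint[]): boolean {
  return constraints.every(
    ({ a, b, c }) => (witness[a] * witness[b]) % P === witness[c] % P
  );
}

// "Prove" knowledge of x with x * x = 49 by exhibiting witness [x, x*x].
const constraints: Constraint[] = [{ a: 0, b: 0, c: 1 }];
const ok = checkConstraints([7n, 49n], constraints); // → true
const bad = checkConstraints([7n, 50n], constraints); // → false
```

A real prover then encodes such satisfied constraints as polynomial identities rather than checking them directly.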

@@ -107,9 +107,9 @@ heavy calculations, we use GPUs to speed things up.

### Where is the Code

The main code that utilizes the GPUs to create proofs is located in a repository named [heavy_ops_service repo]. This code
combines elements from the [bellman cuda repo] that we mentioned earlier, along with a huge amount of data generated by the
witness, to produce the final proofs.

## What Does "Verify Proof on L1" Mean
