fix: address issue with spellchecker not checking against prover workspace (#855)

## What ❔
- Updates the `zk spellcheck` command to explicitly run cargo-spellcheck in
`prover/` (see the sketch after this list)
- Updates the cspell glob pattern to cover all `.md` files so that READMEs
are picked up
- Updates the cspell config to exclude `node_modules`, `target`, and
`CHANGELOG.md` files
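Roughly, the change boils down to the two invocations sketched below (a minimal illustration using Node's `child_process` directly; the actual implementation goes through the repo's `utils.spawn` helper in `infrastructure/zk/src/spellcheck.ts`, shown in the diff further down):

```typescript
import { execSync } from 'child_process';

// cspell now runs over every Markdown file, so prover READMEs are included.
execSync('cspell "**/*.md" --config=./spellcheck/cspell.json', { stdio: 'inherit' });

// The prover workspace is not part of the root Cargo.toml, so cargo-spellcheck
// has to be invoked inside prover/ with the config path adjusted one level up.
execSync('cargo spellcheck --cfg=../spellcheck/era.cfg', { cwd: 'prover', stdio: 'inherit' });
```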

## Why ❔
- Since the prover packages are not included in the root Cargo.toml,
`cargo-spellcheck` was ignoring them. Running it explicitly during `zk
spellcheck` ensures these files are also checked
- To ensure all READMEs are checked by cspell, the glob pattern was changed
accordingly
- Since changelogs are auto-generated, checking them for spelling issues is
not pragmatic

## Checklist


- [x] PR title corresponds to the body of PR (we generate changelog
entries from PRs).
- [ ] Tests for the changes have been added / updated.
- [x] Documentation comments have been added / updated.
- [x] Code has been formatted via `zk fmt` and `zk lint`.
- [x] Spellcheck has been run via `zk spellcheck`.
dutterbutter authored Jan 10, 2024
1 parent 88fd724 commit 4f55926
Showing 14 changed files with 44 additions and 37 deletions.
29 changes: 15 additions & 14 deletions infrastructure/zk/src/spellcheck.ts
@@ -2,23 +2,25 @@ import { Command } from 'commander';
import * as utils from './utils';

export async function runSpellCheck(pattern: string, useCargo: boolean, useCSpell: boolean) {
const cSpellCommand = `cspell ${pattern} --config=./spellcheck/cspell.json`;
// Default commands for cSpell and cargo spellcheck
const cSpellCommand = `cspell "${pattern}" --config=./spellcheck/cspell.json`;
const cargoCommand = `cargo spellcheck --cfg=./spellcheck/era.cfg`;
// Necessary to run cargo spellcheck in the prover directory explicitly as
// it is not included in the root cargo.toml file
const cargoCommandForProver = `cargo spellcheck --cfg=../spellcheck/era.cfg`;

try {
let results = [];
if (!useCargo && !useCSpell) {
results = await Promise.all([utils.spawn(cSpellCommand), utils.spawn(cargoCommand)]);
} else {
// Run cspell if specified
if (useCSpell) {
results.push(await utils.spawn(cSpellCommand));
}

// Run cargo spellcheck if specified
if (useCargo) {
results.push(await utils.spawn(cargoCommand));
}
// Run cspell over all **/*.md files
if (useCSpell || (!useCargo && !useCSpell)) {
results.push(await utils.spawn(cSpellCommand));
}

// Run cargo spellcheck in core and prover directories
if (useCargo || (!useCargo && !useCSpell)) {
results.push(await utils.spawn(cargoCommand));
results.push(await utils.spawn('cd prover && ' + cargoCommandForProver));
}

// Check results and exit with error code if any command failed
@@ -33,8 +35,7 @@ export async function runSpellCheck(pattern: string, useCargo: boolean, useCSpel
}

export const command = new Command('spellcheck')
.option('--pattern <pattern>', 'Glob pattern for files to check', 'docs/**/*')
.option('--config <config>', 'Path to configuration file', './spellcheck/cspell.json')
.option('--pattern <pattern>', 'Glob pattern for files to check', '**/*.md')
.option('--use-cargo', 'Use cargo spellcheck')
.option('--use-cspell', 'Use cspell')
.description('Run spell check on specified files')
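For reviewers, here is a quick, hypothetical driver showing how the three modes of the updated `runSpellCheck` behave; the function and its `(pattern, useCargo, useCSpell)` signature come from the diff above, while the import path is assumed to be within `infrastructure/zk/src`:

```typescript
import { runSpellCheck } from './spellcheck';

async function demo() {
    // Default (`zk spellcheck`): cspell over **/*.md plus cargo spellcheck in
    // both the root workspace and prover/.
    await runSpellCheck('**/*.md', false, false);

    // `zk spellcheck --use-cspell`: Markdown files only.
    await runSpellCheck('**/*.md', false, true);

    // `zk spellcheck --use-cargo`: Rust sources in core and prover only.
    await runSpellCheck('**/*.md', true, false);
}

demo().catch((err) => {
    console.error(err);
    process.exitCode = 1;
});
```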
10 changes: 5 additions & 5 deletions prover/prover_fri/README.md
@@ -27,7 +27,7 @@ from the database and do their part of the pipeline.
```mermaid
flowchart LR
A["Operator"] --> |Produces block| F[Prover Gateway]
F --> |Inserts into DB| B["Postgress DB"]
F --> |Inserts into DB| B["Postgres DB"]
B --> |Retrieves proven block \nafter compression| F
B --> C["Witness"]
C --- C1["Basic Circuits"]
@@ -108,8 +108,8 @@ Machine specs:
API_PROMETHEUS_LISTENER_PORT=3116 zk f cargo run --release --bin zksync_witness_generator -- --all_rounds
```

Note that this will automatically open the three ports after the one specified in enviromental variable, in this case
3117, 3118 and 3119.
Note that this will automatically open the three ports after the one specified in environmental variable, in this
case 3117, 3118 and 3119.

7. Run prover to perform actual proving:

@@ -201,7 +201,7 @@ zk status prover
```

This might take a while (around an hour and a half on my machine using the CPU prover), you can check on it once in a
while. A succesful flow should output something like
while. A successful flow should output something like

```
==== FRI Prover status ====
@@ -211,7 +211,7 @@ Verification key hash on contract is 0x4be443afd605a782b6e56d199df2460a025c81b3d
Verification key in database is 0x4be443afd605a782b6e56d199df2460a025c81b3dea144e135bece83612563f2
Verifier hash matches.
Verifier params on contract are 0x5a3ef282b21e12fe1f4438e5bb158fc5060b160559c5158c6389d62d9fe3d080, 0x72167c43a46cf38875b267d67716edc4563861364a3c03ab7aee73498421e828, 0x0000000000000000000000000000000000000000000000000000000000000000
Verifcation params match.
Verification params match.
Next block that should be verified is: 2
Checking status of the proofs...
Proof progress for 1 : 111 successful, 0 failed, 0 in progress, 0 queued. Compression job status: successful
2 changes: 1 addition & 1 deletion prover/prover_fri/src/socket_listener.rs
@@ -136,7 +136,7 @@ pub mod gpu_socket_listener {
witness_vector_artifacts: witness_vector,
assembly,
};
// acquiring lock from queue and updating db must be done atomically otherwise it results in TOCTTOU
// acquiring lock from queue and updating db must be done atomically otherwise it results in `TOCTTOU`
// Time-of-Check to Time-of-Use
let mut queue = self.queue.lock().await;

2 changes: 1 addition & 1 deletion prover/prover_fri/src/utils.rs
@@ -192,7 +192,7 @@ mod tests {

let result = get_setup_data_key(key);

// Check if the circuit_id has been changed to NodeLayerCircuit's id
// Check if the `circuit_id` has been changed to `NodeLayerCircuit's` id
assert_eq!(expected, result);
}

2 changes: 1 addition & 1 deletion prover/vk_setup_data_generator_server_fri/src/lib.rs
@@ -372,7 +372,7 @@ pub fn get_finalization_hints(
key: ProverServiceDataKey,
) -> anyhow::Result<FinalizationHintsForProver> {
let mut key = key;
// For NodeAggregation round we have only 1 finalization hints for all circuit type.
// For `NodeAggregation` round we have only 1 finalization hints for all circuit type.
if key.round == AggregationRound::NodeAggregation {
key.circuit_id = ZkSyncRecursionLayerStorageType::NodeLayerCircuit as u8;
}
8 changes: 4 additions & 4 deletions prover/vk_setup_data_generator_server_fri/src/tests.rs
@@ -65,22 +65,22 @@

}

// Test get_base_path method
// Test `get_base_path` method
#[test]
fn test_get_base_path() {
let base_path = get_base_path();
assert!(!base_path.is_empty(), "Base path should not be empty");
}

// Test get_file_path method
// Test `get_file_path` method
#[test]
fn test_get_file_path() {
let key = ProverServiceDataKey::new(1, AggregationRound::BasicCircuits);
let file_path = get_file_path(key, ProverServiceDataType::VerificationKey).unwrap();
assert!(!file_path.is_empty(), "File path should not be empty");
}

// Test ProverServiceDataKey::new method
// Test `ProverServiceDataKey::new` method
#[test]
fn test_proverservicedatakey_new() {
let key = ProverServiceDataKey::new(1, AggregationRound::BasicCircuits);
@@ -95,7 +95,7 @@ fn test_proverservicedatakey_new() {
);
}

// Test get_round_for_recursive_circuit_type method
// Test `get_round_for_recursive_circuit_type` method
#[test]
fn test_get_round_for_recursive_circuit_type() {
let round = get_round_for_recursive_circuit_type(
2 changes: 1 addition & 1 deletion prover/vk_setup_data_generator_server_fri/src/utils.rs
@@ -296,7 +296,7 @@ fn get_circuits(

let previous_enumeration_index = tree.next_enumeration_index();
let previous_root = tree.root();
// simualate content hash
// simulate content hash

let mut hasher = Keccak256::new();
hasher.update(previous_enumeration_index.to_be_bytes());
2 changes: 1 addition & 1 deletion prover/witness_generator/src/basic_circuits.rs
@@ -558,7 +558,7 @@ async fn generate_witness(
let hashes: HashSet<H256> = input
.used_bytecodes_hashes
.iter()
// SMA-1555: remove this hack once updated to the latest version of zkevm_test_harness
// SMA-1555: remove this hack once updated to the latest version of `zkevm_test_harness`
.filter(|&&hash| hash != h256_to_u256(header.base_system_contracts_hashes.bootloader))
.map(|hash| u256_to_h256(*hash))
.collect();
6 changes: 3 additions & 3 deletions prover/witness_generator/src/main.rs
@@ -110,8 +110,8 @@ async fn main() -> anyhow::Result<()> {
.protocol_version_for(&vk_commitments)
.await;

// If batch_size is none, it means that the job is 'looping forever' (this is the usual setup in local network).
// At the same time, we're reading the protocol_version only once at startup - so if there is no protocol version
// If `batch_size` is none, it means that the job is 'looping forever' (this is the usual setup in local network).
// At the same time, we're reading the `protocol_version` only once at startup - so if there is no protocol version
// read (this is often due to the fact, that the gateway was started too late, and it didn't put the updated protocol
// versions into the database) - then the job will simply 'hang forever' and not pick any tasks.
if opt.batch_size.is_none() && protocol_versions.is_empty() {
@@ -157,7 +157,7 @@ async fn main() -> anyhow::Result<()> {
prometheus_config.push_interval(),
)
} else {
// u16 cast is safe since i is in range [0, 4)
// `u16` cast is safe since i is in range [0, 4)
PrometheusExporterConfig::pull(prometheus_config.listener_port + i as u16)
};
let prometheus_task = prometheus_config.run(stop_receiver.clone());
@@ -137,7 +137,7 @@ impl BinarySparseStorageTree<256, 32, 32, 8, 32, Blake2s256, ZkSyncStorageLeaf>
"insert_leaf enumeration index mismatch",
);

// reset is_get_leaf_invoked for the next get/insert invocation
// reset `is_get_leaf_invoked` for the next get/insert invocation
self.is_get_leaf_invoked = false;

// if this insert was in fact the very first insert, it should bump the `next_enumeration_index`
@@ -219,7 +219,7 @@ impl BinarySparseStorageTree<256, 32, 32, 8, 32, Blake2s256, ZkSyncStorageLeaf>
root: &[u8; 32],
query: &LeafQuery<256, 32, 32, 32, ZkSyncStorageLeaf>,
) -> bool {
//copied from zkevm_test_harness/src/witness/tree/mod.rs with minor changes
//copied from `zkevm_test_harness/src/witness/tree/mod.rs` with minor changes
tracing::trace!(
"invoked verify_inclusion. Index: {:?}, root: {:?})",
query.index,
2 changes: 1 addition & 1 deletion prover/witness_vector_generator/src/generator.rs
@@ -245,7 +245,7 @@ async fn handle_send_result(
reason: {err}"
);

// mark prover instance in gpu_prover_queue dead
// mark prover instance in `gpu_prover_queue` dead
pool.access_storage()
.await
.unwrap()
2 changes: 1 addition & 1 deletion prover/witness_vector_generator/src/main.rs
@@ -27,7 +27,7 @@ mod metrics;
about = "Tool for generating witness vectors for circuits"
)]
struct Opt {
/// Number of times witness_vector_generator should be run.
/// Number of times `witness_vector_generator` should be run.
#[structopt(short = "n", long = "n_iterations")]
number_of_iterations: Option<usize>,
}
7 changes: 5 additions & 2 deletions spellcheck/cspell.json
@@ -1,11 +1,14 @@
{
"language": "en",
"ignorePaths": [
"node_modules/**",
"**/CHANGELOG.md",
"**/node_modules/**",
".github/**",
".firebase/**",
".yarn/**",
"dist/**"
"dist/**",
"**/contracts/**",
"**/target/**"
],
"dictionaries": [
"typescript",
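To sanity-check the new ignore list locally, a one-off script along these lines could be used (hypothetical; it simply shells out to the same cspell invocation used by `zk spellcheck` and relies on cspell exiting non-zero when unknown words are found):

```typescript
import { execSync } from 'child_process';

try {
    // Same invocation as in infrastructure/zk/src/spellcheck.ts; CHANGELOGs,
    // node_modules, target and contracts directories are now skipped via ignorePaths.
    execSync('cspell "**/*.md" --config=./spellcheck/cspell.json', { stdio: 'inherit' });
    console.log('cspell: no spelling issues found');
} catch {
    process.exitCode = 1;
}
```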
3 changes: 3 additions & 0 deletions spellcheck/era.dic
@@ -520,6 +520,9 @@ uadd
nocallback
nosync
swrite
Devs
insta
NFTF

// ETC
gitter
