feat: bb.js tests of ClientIVC #9412

Merged: 53 commits from cg/browser-civc into master, Nov 8, 2024
Commits (53): changes shown from all commits
cab6cbb
Squash
codygunton Oct 30, 2024
da8a13a
Unfortunately this does not detect any issue
codygunton Oct 30, 2024
4a12e5a
This makes the wasm test pass
codygunton Oct 31, 2024
8f8f371
Copy to browser tests
codygunton Oct 31, 2024
637c214
Use as libary
codygunton Oct 31, 2024
51441ff
This works (but no logs still?)
codygunton Nov 3, 2024
8b55e07
Adjust logging to be more useful
codygunton Nov 4, 2024
0bf4a58
Merge remote-tracking branch 'origin/master' into cg/browser-civc
codygunton Nov 4, 2024
5368fb4
Fix compilation
codygunton Nov 4, 2024
7f89016
Reuse code and make more complex browser test
codygunton Nov 4, 2024
4d15a4f
Make button app
codygunton Nov 4, 2024
291f521
Try reverting + loader
codygunton Nov 4, 2024
982c12a
Revert-ish yarn.lock
codygunton Nov 4, 2024
8ae873a
Revert headless test
codygunton Nov 4, 2024
19eef28
Actually who cares just revert all of this
codygunton Nov 4, 2024
cf314d3
Let's just remember we can use raw-loader
codygunton Nov 4, 2024
d6785af
more cleanup
codygunton Nov 4, 2024
5fcae5f
Revert a bit more
codygunton Nov 4, 2024
762996d
Try to revert huge yarn.lock change
codygunton Nov 4, 2024
0de79bb
More diff tidying
codygunton Nov 4, 2024
ae2c36d
Make sure test still pass
codygunton Nov 4, 2024
f866895
Merge remote-tracking branch 'origin/master' into cg/browser-civc
codygunton Nov 4, 2024
d4b3ecd
Do some more yarn
codygunton Nov 4, 2024
d61dc26
Lock update..?
codygunton Nov 4, 2024
24c1fd0
Fix formatting
codygunton Nov 4, 2024
3931cc1
Again??
codygunton Nov 4, 2024
104ead0
Reinstate missing flow
codygunton Nov 4, 2024
1990e56
Holy crap
codygunton Nov 5, 2024
18982a0
Fix Grumpkin CRS dl
codygunton Nov 5, 2024
6969d3a
Revert accidental change to run_acir_tests
codygunton Nov 5, 2024
d684f74
How do I still have formatting to fix?
codygunton Nov 5, 2024
551b01c
Revert to non-asserts build
codygunton Nov 5, 2024
a974756
How yet again
codygunton Nov 5, 2024
8771a9f
Try adding this dep that didn't break the build?
codygunton Nov 5, 2024
fc9c912
Had to update lock file
codygunton Nov 5, 2024
3422e30
Delete unused (redundant with benchmarking script).
codygunton Nov 5, 2024
66ced97
Reinstate use of options
codygunton Nov 5, 2024
b4e2245
Try this to build playwright
codygunton Nov 5, 2024
1bdeec0
Try extra build step after bootstrap
codygunton Nov 5, 2024
afcbdb4
Merge branch 'master' into cg/browser-civc
codygunton Nov 5, 2024
537ebb2
Merge remote-tracking branch 'origin/master' into cg/browser-civc
codygunton Nov 5, 2024
7ed8b0c
Make it use a real browser and fix CRS issue
codygunton Nov 6, 2024
71d33ba
Local package.json
codygunton Nov 6, 2024
6426f87
Rename script
codygunton Nov 6, 2024
cd3b72c
Continue script rename & try chmod with &&
codygunton Nov 6, 2024
f7d81d9
Merge branch 'master' into cg/browser-civc
codygunton Nov 6, 2024
556247f
Update local package.json
codygunton Nov 6, 2024
f90a4b4
Merge branch 'master' into cg/browser-civc
codygunton Nov 6, 2024
54b3993
Revert "chore: more descriptive slack alert (#9739)"
ludamad Nov 6, 2024
f321564
Merge remote-tracking branch 'origin/revert-9739-ad/ci/yaml' into cg/…
codygunton Nov 6, 2024
b4bab19
Merge remote-tracking branch 'origin/master' into cg/browser-civc
codygunton Nov 7, 2024
d495f5b
Only run the WASM tests
codygunton Nov 8, 2024
13577a0
Merge remote-tracking branch 'origin/master' into cg/browser-civc
codygunton Nov 8, 2024
79 changes: 45 additions & 34 deletions .github/workflows/ci.yml
@@ -948,6 +948,7 @@ jobs:
merge-check:
runs-on: ubuntu-20.04
needs:
# must be kept in sync with rerun-check
- setup
- configure
- build
@@ -985,30 +986,15 @@ jobs:
- boxes-test
# - protocol-circuits-gates-report # non-blocking
if: always()
outputs:
failure: ${{ steps.set_failed_jobs.outputs.failure }}
failed_jobs: ${{ steps.set_failed_jobs.outputs.failed_jobs }}
steps:
- name: Check for Failures and Set Output
id: set_failed_jobs
env:
# Collect needed jobs
NEEDS_JOBS_JSON: ${{ toJson(needs) }}
run: |
echo "Processing failed jobs..."
failed_jobs=$(echo "$NEEDS_JOBS_JSON" | jq -r 'to_entries[] | select(.value.result == "failure") | .key' | paste -sd "," -)
echo "$failed_jobs" > .failed
echo "failure=${{contains(needs.*.result, 'failure')}}" >> $GITHUB_OUTPUT
echo "failed_jobs=$failed_jobs" >> $GITHUB_OUTPUT

- name: Report overall success (non-draft)
if: github.event.pull_request.draft == false
env:
# We treat any skipped or failing jobs as a failure for the workflow as a whole.
FAIL: ${{ contains(needs.*.result, 'failure') || contains(needs.*.result, 'cancelled') }}
run: |
if [[ $FAIL == true ]]; then
echo "Jobs failed: $(cat .failed), merging not allowed."
echo "At least one job failed (or cancelled), merging not allowed."
exit 1
else
echo "All jobs succeeded, merge allowed."
@@ -1023,44 +1009,69 @@ jobs:
permissions:
actions: write
needs:
- merge-check
# must be kept in sync with merge-check
- setup
- configure
- build
- e2e
# - bench-e2e # non-blocking
# - acir-bench # non-blocking
# - bench-summary # non-blocking
- bb-gcc
- bb-native-tests
- bb-js-test
- noir-build-acir-tests
- bb-acir-tests-bb
- bb-acir-tests-bb-ultra-plonk
- bb-acir-tests-bb-ultra-honk
- bb-acir-tests-bb-mega-honk
- bb-acir-tests-sol
- bb-acir-tests-sol-honk
- bb-acir-tests-bb-js
- noir-format
- noir-test
- noir-examples
- noir-packages-test
- noir-projects
- avm-format
- yarn-project-formatting
- yarn-project-test
- prover-client-test
- network-test
- kind-network-test
- l1-contracts-test
- docs-preview
# - bb-bench # non-blocking
- boxes
- boxes-test
# - protocol-circuits-gates-report # non-blocking
if: github.event.pull_request.draft == false && !cancelled()
steps:
- name: Check for Rerun
env:
# We treat any skipped or failing jobs as a failure for the workflow as a whole.
HAD_FAILURE: ${{ contains(needs.*.result, 'failure') }}
GH_REPO: ${{ github.repository }}
GH_TOKEN: ${{ github.token }}
run: |
if [[ ${{ needs.merge-check.outputs.failure }} == true ]] && [[ $RUN_ATTEMPT -lt 2 ]] ; then
if [[ $HAD_FAILURE == true ]] && [[ $RUN_ATTEMPT -lt 2 ]] ; then
echo "Retrying first workflow failure. This is a stop-gap until things are more stable."
gh workflow run rerun.yml -F run_id=${{ github.run_id }}
fi

# NOTE: we only notify failures after a rerun has occurred
notify:
runs-on: ubuntu-20.04
needs:
- merge-check
runs-on: ubuntu-20.04
if: github.event.pull_request.draft == false && github.ref == 'refs/heads/master' && failure() && github.run_attempt >= 2
steps:
- name: Checkout code
uses: actions/checkout@v3

- name: Get Authors of Recent Commit
id: get_authors
run: |
git fetch --depth=1 origin ${{ github.sha }}
authors=$(git log -1 --pretty=format:'%an <%ae>' ${{ github.sha }})
echo "authors=${authors}" >> $GITHUB_OUTPUT

- name: Send notification to aztec3-ci channel if workflow failed on master
uses: slackapi/[email protected]
with:
payload: |
{
"text": "Master Github Actions failure",
"url": "https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }}",
"authors": "${{ steps.get_authors.outputs.authors }}",
"failed_jobs": "${{ needs.merge-check.outputs.failed_jobs }}"
"url": "https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }}"
}
env:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_NOTIFY_WORKFLOW_TRIGGER_URL2 }}
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_NOTIFY_WORKFLOW_TRIGGER_URL }}
44 changes: 32 additions & 12 deletions barretenberg/cpp/CMakePresets.json
@@ -399,16 +399,6 @@
"MULTITHREADING": "OFF"
}
},
{
"name": "wasm-dbg",
"displayName": "Build for debug WASM",
"binaryDir": "build-wasm-dbg",
"description": "Build with wasi-sdk to create debug wasm",
"inherits": "wasm",
"environment": {
"CMAKE_BUILD_TYPE": "Debug"
}
},
{
"name": "wasm-threads",
"displayName": "Build for pthread enabled WASM",
@@ -422,6 +412,29 @@
"MULTITHREADING": "ON"
}
},
{
"name": "wasm-threads-dbg",
"displayName": "Build for debug WASM",
"binaryDir": "build-wasm-threads-dbg",
"description": "Build with wasi-sdk to create debug wasm",
"inherits": "wasm",
"environment": {
"CMAKE_BUILD_TYPE": "Debug"
},
"cacheVariables": {
"MULTITHREADING": "ON"
}
},
{
"name": "wasm-threads-assert",
Review comment from the PR author (codygunton): Added an asserts build for wasm; it is much faster to build than dbg. Considered making it the default, but it might affect performance if we use heavy asserts (see the short sketch after this file's diff for why).

"displayName": "Build for WASM with multithreading and and asserts",
"binaryDir": "build-wasm-threads-assert",
"description": "Build with wasi-sdk with asserts",
"inherits": "wasm-threads",
"environment": {
"CMAKE_BUILD_TYPE": "RelWithAssert"
}
},
{
"name": "xray",
"displayName": "Build with multi-threaded XRay Profiling",
@@ -618,8 +631,15 @@
"targets": ["barretenberg.wasm", "barretenberg", "wasi", "env"]
},
{
"name": "wasm-dbg",
"configurePreset": "wasm-dbg",
"name": "wasm-threads-dbg",
"configurePreset": "wasm-threads-dbg",
"inheritConfigureEnvironment": true,
"jobs": 0,
"targets": ["barretenberg.wasm"]
},
{
"name": "wasm-threads-assert",
"configurePreset": "wasm-threads-assert",
"inheritConfigureEnvironment": true,
"jobs": 0,
"targets": ["barretenberg.wasm"]
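Why heavy asserts can matter for performance, per the review comment above: with a plain Release build CMake defines NDEBUG and assert() compiles away, while a RelWithAssert-style build keeps the checks in optimized code, which is noticeable if they guard hot loops. The following is a minimal illustrative C++ sketch, not barretenberg code (barretenberg's own ASSERT macro and build flags may behave differently).

#include <cassert>
#include <cstddef>
#include <vector>

// Sums the first n elements; the bounds check survives an asserts build
// (no NDEBUG) but is compiled out entirely in a standard Release build.
int sum_checked(const std::vector<int>& values, std::size_t n)
{
    assert(n <= values.size()); // kept in RelWithAssert, elided when NDEBUG is defined
    int total = 0;
    for (std::size_t i = 0; i < n; ++i) {
        total += values[i];
    }
    return total;
}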
22 changes: 17 additions & 5 deletions barretenberg/cpp/src/barretenberg/client_ivc/client_ivc.cpp
@@ -180,6 +180,7 @@ void ClientIVC::accumulate(ClientCircuit& circuit, const std::shared_ptr<Verific
circuit, trace_structure, fold_output.accumulator->proving_key.commitment_key);
}

vinfo("getting honk vk... precomputed?: ", precomputed_vk);
// Update the accumulator trace usage based on the present circuit
trace_usage_tracker.update(circuit);

@@ -188,11 +189,14 @@ void ClientIVC::accumulate(ClientCircuit& circuit, const std::shared_ptr<Verific
if (mock_vk) {
honk_vk->set_metadata(proving_key->proving_key);
}
vinfo("set honk vk metadata");

// If this is the first circuit in the IVC, use oink to complete the decider proving key and generate an oink proof
if (!initialized) {
OinkProver<Flavor> oink_prover{ proving_key };
vinfo("computing oink proof...");
oink_prover.prove();
vinfo("oink proof constructed");
proving_key->is_accumulator = true; // indicate to PG that it should not run oink on this key
// Initialize the gate challenges to zero for use in first round of folding
proving_key->gate_challenges = std::vector<FF>(CONST_PG_LOG_N, 0);
@@ -206,7 +210,9 @@ void ClientIVC::accumulate(ClientCircuit& circuit, const std::shared_ptr<Verific
initialized = true;
} else { // Otherwise, fold the new key into the accumulator
FoldingProver folding_prover({ fold_output.accumulator, proving_key }, trace_usage_tracker);
vinfo("constructed folding prover");
fold_output = folding_prover.prove();
vinfo("constructed folding proof");

// Add fold proof and corresponding verification key to the verification queue
verification_queue.push_back(bb::ClientIVC::VerifierInputs{ fold_output.proof, honk_vk, QUEUE_TYPE::PG });
@@ -295,20 +301,20 @@ ClientIVC::Proof ClientIVC::prove()
};

bool ClientIVC::verify(const Proof& proof,
const std::shared_ptr<VerificationKey>& ultra_vk,
const std::shared_ptr<VerificationKey>& mega_vk,
const std::shared_ptr<ClientIVC::ECCVMVerificationKey>& eccvm_vk,
const std::shared_ptr<ClientIVC::TranslatorVerificationKey>& translator_vk)
{

// Verify the hiding circuit proof
MegaVerifier verifer{ ultra_vk };
bool ultra_verified = verifer.verify_proof(proof.mega_proof);
vinfo("Mega verified: ", ultra_verified);
MegaVerifier verifer{ mega_vk };
bool mega_verified = verifer.verify_proof(proof.mega_proof);
vinfo("Mega verified: ", mega_verified);
// Goblin verification (final merge, eccvm, translator)
GoblinVerifier goblin_verifier{ eccvm_vk, translator_vk };
bool goblin_verified = goblin_verifier.verify(proof.goblin_proof);
vinfo("Goblin verified: ", goblin_verified);
return goblin_verified && ultra_verified;
return goblin_verified && mega_verified;
}

/**
@@ -331,8 +337,10 @@ bool ClientIVC::verify(const Proof& proof)
*/
HonkProof ClientIVC::decider_prove() const
{
vinfo("prove decider...");
MegaDeciderProver decider_prover(fold_output.accumulator);
HonkProof proof = decider_prover.construct_proof();
vinfo("finished decider proving.");
return proof;
}

/**
@@ -343,7 +351,11 @@ HonkProof ClientIVC::decider_prove() const
*/
bool ClientIVC::prove_and_verify()
{
auto start = std::chrono::steady_clock::now();
auto proof = prove();
auto end = std::chrono::steady_clock::now();
auto diff = std::chrono::duration_cast<std::chrono::milliseconds>(end - start);
vinfo("time to call ClientIVC::prove: ", diff, " ms.");
return verify(proof);
}

@@ -155,7 +155,7 @@ class ClientIVC {
HonkProof construct_and_prove_hiding_circuit();

static bool verify(const Proof& proof,
const std::shared_ptr<VerificationKey>& ultra_vk,
const std::shared_ptr<VerificationKey>& mega_vk,
const std::shared_ptr<ClientIVC::ECCVMVerificationKey>& eccvm_vk,
const std::shared_ptr<ClientIVC::TranslatorVerificationKey>& translator_vk);

@@ -86,7 +86,6 @@ void create_block_constraints(MegaCircuitBuilder& builder,
process_call_data_operations(builder, constraint, has_valid_witness_assignments, init);
// The presence of calldata is used to indicate that the present circuit is a kernel. This is needed in the
// databus consistency checks to indicate that the corresponding return data belongs to a kernel (else an app).
info("ACIR: Setting is_kernel to TRUE.");
builder.databus_propagation_data.is_kernel = true;
} break;
case BlockType::ReturnData: {
55 changes: 55 additions & 0 deletions barretenberg/cpp/src/barretenberg/dsl/acir_proofs/c_bind.cpp
@@ -9,6 +9,7 @@
#include "barretenberg/dsl/acir_format/acir_format.hpp"
#include "barretenberg/plonk/proof_system/proving_key/serialize.hpp"
#include "barretenberg/plonk/proof_system/verification_key/verification_key.hpp"
#include "barretenberg/serialize/msgpack.hpp"
#include "barretenberg/srs/global_crs.hpp"
#include <cstdint>
#include <memory>
@@ -218,6 +219,60 @@ WASM_EXPORT void acir_serialize_verification_key_into_fields(in_ptr acir_compose
write(out_key_hash, vk_hash);
}

WASM_EXPORT void acir_prove_and_verify_aztec_client(uint8_t const* acir_stack,
uint8_t const* witness_stack,
bool* verified)
{
using Program = acir_format::AcirProgram;

std::vector<std::vector<uint8_t>> witnesses = from_buffer<std::vector<std::vector<uint8_t>>>(witness_stack);
std::vector<std::vector<uint8_t>> acirs = from_buffer<std::vector<std::vector<uint8_t>>>(acir_stack);
std::vector<Program> folding_stack;

for (auto [bincode, wit] : zip_view(acirs, witnesses)) {
acir_format::WitnessVector witness = acir_format::witness_buf_to_witness_data(wit);
acir_format::AcirFormat constraints =
acir_format::circuit_buf_to_acir_format(bincode, /*honk_recursion=*/false);
folding_stack.push_back(Program{ constraints, witness });
}
// TODO(#7371) dedupe this with the rest of the similar code
// TODO(https://github.com/AztecProtocol/barretenberg/issues/1101): remove use of auto_verify_mode
ClientIVC ivc;
ivc.auto_verify_mode = true;
ivc.trace_structure = TraceStructure::E2E_FULL_TEST;

// Accumulate the entire program stack into the IVC
Review comment from the PR author (codygunton): Just carrying along this TODO, because this section was basically a copy-and-paste from main.cpp.

// TODO(https://github.com/AztecProtocol/barretenberg/issues/1116): remove manual setting of is_kernel once databus
// has been integrated into noir kernel programs
bool is_kernel = false;
auto start = std::chrono::steady_clock::now();
for (Program& program : folding_stack) {
// Construct a bberg circuit from the acir representation then accumulate it into the IVC
vinfo("constructing circuit...");
auto circuit = acir_format::create_circuit<MegaCircuitBuilder>(
program.constraints, false, 0, program.witness, false, ivc.goblin.op_queue);

// Set the internal is_kernel flag based on the local mechanism only if it has not already been set to true
if (!circuit.databus_propagation_data.is_kernel) {
circuit.databus_propagation_data.is_kernel = is_kernel;
}
is_kernel = !is_kernel;

vinfo("done constructing circuit. calling ivc.accumulate...");
ivc.accumulate(circuit);
vinfo("done accumulating.");
}
auto end = std::chrono::steady_clock::now();
auto diff = std::chrono::duration_cast<std::chrono::milliseconds>(end - start);
vinfo("time to construct and accumulate all circuits: ", diff);

vinfo("calling ivc.prove_and_verify...");
bool result = ivc.prove_and_verify();
info("verified?: ", result);

*verified = result;
}
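For orientation, here is a minimal sketch of how a C++ caller could drive this export directly. The helper name run_aztec_client_ivc is hypothetical, the include path for to_buffer is an assumption, and to_buffer is assumed to be the serialization counterpart of the from_buffer calls above; the actual bb.js tests go through the generated WASM bindings instead.

#include <cstdint>
#include <vector>
#include "barretenberg/common/serialize.hpp"        // to_buffer (assumed header location)
#include "barretenberg/dsl/acir_proofs/c_bind.hpp"  // acir_prove_and_verify_aztec_client

// Hypothetical helper: packs a stack of bincode-encoded ACIR programs and their
// witnesses (app/kernel interleaved, as the binding expects) and returns the result.
bool run_aztec_client_ivc(const std::vector<std::vector<uint8_t>>& acir_stack,
                          const std::vector<std::vector<uint8_t>>& witness_stack)
{
    // Serialize each stack into one flat buffer, mirroring the
    // from_buffer<std::vector<std::vector<uint8_t>>> calls inside the binding.
    std::vector<uint8_t> acir_buf = to_buffer(acir_stack);
    std::vector<uint8_t> witness_buf = to_buffer(witness_stack);

    bool verified = false;
    acir_prove_and_verify_aztec_client(acir_buf.data(), witness_buf.data(), &verified);
    return verified; // true iff the full ClientIVC prove-and-verify flow succeeds
}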

WASM_EXPORT void acir_prove_ultra_honk(uint8_t const* acir_vec,
bool const* recursive,
uint8_t const* witness_vec,
4 changes: 4 additions & 0 deletions barretenberg/cpp/src/barretenberg/dsl/acir_proofs/c_bind.hpp
@@ -49,6 +49,10 @@ WASM_EXPORT void acir_prove_and_verify_mega_honk(uint8_t const* constraint_syste
uint8_t const* witness_buf,
bool* result);

WASM_EXPORT void acir_prove_and_verify_aztec_client(uint8_t const* constraint_system_buf,
uint8_t const* witness_buf,
bool* result);

/**
* @brief Fold and verify a set of circuits using ClientIvc
*
2 changes: 2 additions & 0 deletions barretenberg/cpp/src/barretenberg/eccvm/eccvm_prover.cpp
@@ -187,6 +187,8 @@ void ECCVMProver::execute_pcs_rounds()

// Produce another challenge passed as input to the translator verifier
translation_batching_challenge_v = transcript->template get_challenge<FF>("Translation:batching_challenge");

vinfo("computed opening proof");
}

HonkProof ECCVMProver::export_proof()
4 changes: 3 additions & 1 deletion barretenberg/cpp/src/barretenberg/eccvm/eccvm_verifier.cpp
@@ -65,6 +65,7 @@ bool ECCVMVerifier::verify_proof(const HonkProof& proof)
sumcheck.verify(relation_parameters, alpha, gate_challenges);
// If Sumcheck did not verify, return false
if (sumcheck_verified.has_value() && !sumcheck_verified.value()) {
vinfo("eccvm sumcheck failed");
return false;
}
// Compute the Shplemini accumulator consisting of the Shplonk evaluation and the commitments and scalars vector
@@ -127,7 +128,8 @@ bool ECCVMVerifier::verify_proof(const HonkProof& proof)

const bool batched_opening_verified =
PCS::reduce_verify(key->pcs_verification_key, batch_opening_claim, transcript);

vinfo("eccvm sumcheck verified?: ", sumcheck_verified.value());
vinfo("batch opening verified?: ", batched_opening_verified);
return sumcheck_verified.value() && batched_opening_verified;
}
} // namespace bb