Slice generated from a generic array is accessed outside of its capacity #4722
This might be related: if I change the `hash_args` implementation to iterate over the length of the slice:

```rust
pub fn hash_args(args: [Field]) -> Field {
    if args.len() == 0 {
        0
    } else {
        assert(args.len() < ARGS_HASH_CHUNK_COUNT * ARGS_HASH_CHUNK_LENGTH);
        let mut chunks_hashes = [0; ARGS_HASH_CHUNK_COUNT];
        let mut current_chunk = [0; ARGS_HASH_CHUNK_LENGTH];
        let mut hash_index = 0;
        let mut chunk_index = 0;
        for i in 0..args.len() {
            current_chunk[chunk_index] = args[i];
            chunk_index += 1;
            if chunk_index == ARGS_HASH_CHUNK_LENGTH {
                chunks_hashes[hash_index] = pedersen_hash(current_chunk, 44);
                current_chunk = [0; ARGS_HASH_CHUNK_LENGTH];
                hash_index += 1;
                chunk_index = 0;
            }
        }
        if chunk_index > 0 {
            chunks_hashes[hash_index] = pedersen_hash(current_chunk, 44);
        }
        pedersen_hash(chunks_hashes, 44)
    }
}
```

I get a `could not determine loop bound at compile time` error.
However, the loop bound should be known, since the slice length should be the original array length from this function:

```rust
pub fn hash_args_array<N>(args: [Field; N]) -> Field {
    hash_args(args.as_slice())
}
```

It seems like we are losing information at some point.
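To sanity-check the chunking logic above, here is a small Python model of `hash_args` (a sketch only: `pedersen_hash` is replaced by a deterministic placeholder, and the chunk constants are illustrative values, since the real ones aren't shown in this thread):

```python
# Illustrative values; the real ARGS_HASH_CHUNK_COUNT /
# ARGS_HASH_CHUNK_LENGTH constants are not shown in the thread.
ARGS_HASH_CHUNK_COUNT = 4
ARGS_HASH_CHUNK_LENGTH = 3

def pedersen_hash(inputs, separator):
    # Placeholder for the real Pedersen hash: any deterministic function
    # is enough to check the chunking structure.
    return hash((tuple(inputs), separator))

def hash_args(args):
    if len(args) == 0:
        return 0
    assert len(args) < ARGS_HASH_CHUNK_COUNT * ARGS_HASH_CHUNK_LENGTH
    chunks_hashes = [0] * ARGS_HASH_CHUNK_COUNT
    current_chunk = [0] * ARGS_HASH_CHUNK_LENGTH
    hash_index = 0
    chunk_index = 0
    for a in args:
        current_chunk[chunk_index] = a
        chunk_index += 1
        if chunk_index == ARGS_HASH_CHUNK_LENGTH:
            chunks_hashes[hash_index] = pedersen_hash(current_chunk, 44)
            current_chunk = [0] * ARGS_HASH_CHUNK_LENGTH
            hash_index += 1
            chunk_index = 0
    if chunk_index > 0:
        chunks_hashes[hash_index] = pedersen_hash(current_chunk, 44)
    return pedersen_hash(chunks_hashes, 44)

# A single argument reduces to the shape described in the issue:
# pedersen_hash([pedersen_hash([x, 0, 0]), 0, 0, 0])
assert hash_args([5]) == pedersen_hash(
    [pedersen_hash([5, 0, 0], 44), 0, 0, 0], 44
)
```

This mirrors the Noir code one-to-one and confirms the simplified single-element form that the issue below expects.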
Deleted my initial comment, as I was just compiling rather than executing. I'd expect that the second example would fail to compile.

Ah, we do actually as

Yes, the length of the array is known at the SSA level AFAICT. I put a log in the capacity tracker, and it seems to know that the capacity of the slice returned from `as_slice()` is one.

I think the issue is that `as_slice()` returns an unknown variable length and a slice with capacity N. Since the length is unknown, the compiler cannot figure out that this evaluates to false.

So it's kind of related to the second issue: if the length returned from the `as_slice()` intrinsic were known at the SSA level, then both programs would compile and execute.
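The capacity-versus-length distinction above can be sketched with a toy model (purely illustrative, not compiler code): a slice whose capacity is known but whose length is a runtime value cannot have `len()` comparisons constant-folded.

```python
class Slice:
    """Toy model of a slice value: the capacity is known statically,
    but the length may only be known at runtime."""
    def __init__(self, capacity, length=None):
        self.capacity = capacity
        self.length = length  # None means "unknown until runtime"

def fold_len_lt(slice_value, bound):
    """Try to constant-fold `slice.len() < bound`. Returns True/False when
    the length is statically known, or None when it cannot be determined."""
    if slice_value.length is None:
        return None
    return slice_value.length < bound

# What as_slice() on a [Field; 1] SHOULD produce: length 1 is known.
known = Slice(capacity=1, length=1)
# What the compiler currently sees: capacity 1, but length unknown.
unknown = Slice(capacity=1)

assert fold_len_lt(known, 12) is True
assert fold_len_lt(unknown, 12) is None  # "could not determine loop bound"
```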
Minimal reproduction:

```rust
unconstrained fn return_array(val: Field) -> [Field; 1] {
    [val; 1]
}

fn main(val: Field) {
    let array = return_array(val);
    assert_constant(array.as_slice().len());
}
```
I can add an SSA pass to make the compiler aware of slice lengths returned from `as_slice()`. I'm testing it now and it seems to work fine; I'll open a PR to see what you think :D. I don't know if it can be done in SSA generation instead — I tried there at first, but it doesn't seem easy to transform the `as_slice` call to return a constant along with the slice. Maybe I'm missing something.

Thanks, yeah, that would be helpful.
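As a rough illustration of the idea (over a hypothetical toy SSA representation, not the actual Noir compiler internals): such a pass can walk the instructions and, whenever `as_slice` is applied to an array whose length is statically known, record the returned length value as that constant so later simplifications can use it.

```python
def propagate_slice_lengths(instructions, array_lengths):
    """Toy SSA pass: instructions are (opcode, dest_ids, operand_ids) tuples.
    `as_slice` yields two results: the slice's length and the slice itself.
    When the source array's length is statically known, the returned length
    value is recorded as that constant."""
    known_lengths = {}
    for opcode, dests, operands in instructions:
        if opcode == "as_slice" and operands[0] in array_lengths:
            length_id, _slice_id = dests
            known_lengths[length_id] = array_lengths[operands[0]]
    return known_lengths

# Mirrors the minimal reproduction: v0 is a [Field; 1] of known length 1.
ssa = [
    ("call", ("v0",), ("return_array", "val")),
    ("as_slice", ("v1", "v2"), ("v0",)),  # v1 = length, v2 = slice
    ("assert_constant", (), ("v1",)),
]
assert propagate_slice_lengths(ssa, {"v0": 1}) == {"v1": 1}
```

With `v1` known to be the constant 1, the `assert_constant(array.as_slice().len())` check in the reproduction would succeed.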
# Description

## Problem\*

Resolves #4722

## Documentation\*

Check one:

- [x] No documentation needed.
- [ ] Documentation included in this PR.
- [ ] **[For Experimental Features]** Documentation to be submitted in a separate PR.

# PR Checklist\*

- [ ] I have tested the changes locally.
- [ ] I have formatted the changes with [Prettier](https://prettier.io/) and/or `cargo fmt` on default settings.

Co-authored-by: Tom French <[email protected]>
Aim
Be able to run code like this:
Prover.toml
Expected Behavior

The program should execute successfully, and should be simplified taking the array length and slice capacity into account (`hash_args` should be simplified to `pedersen_hash([pedersen_hash([single_return_value, 0, 0, ...]), 0, 0, 0, ...])`).
Bug

It appears that the slice is accessed outside of its capacity (which should be 1).
To Reproduce