&max-size has an effect on how much data is consumed #1668

Closed
bbannier opened this issue Feb 5, 2024 · 0 comments
Assignees: bbannier
Labels: Bug (Something isn't working)

bbannier (Member) commented Feb 5, 2024

I would expect `&max-size` to inject a post-condition into parsing, but not to have an effect on how much data is consumed (unless there is an error). This is currently not the case: in the reproducer below, `rest` should always be `b"BC"`, regardless of whether `&max-size` was used or not.

# @TEST-EXEC: printf '\x01_BC' | spicy-dump -d %INPUT >>output2 2>&1
# @TEST-EXEC: printf '\x00_BC' | spicy-dump -d %INPUT >>output2 2>&1

module foo;

public type X = unit {
    use_max_size: uint8 &convert=cast<bool>($$);

    switch (self.use_max_size) {
        True -> xs: bytes &until=b"_" &max-size=1;
        False -> xs: bytes &until=b"_";
    };

    rest: bytes &eod;
};

Instead, the following output is produced:

foo::X {
  use_max_size: True
  xs:
  rest: C
}
foo::X {
  use_max_size: False
  xs:
  rest: BC
}
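
For comparison, the expected output (assuming only `rest` changes, per the description above) would be:

foo::X {
  use_max_size: True
  xs:
  rest: BC
}
foo::X {
  use_max_size: False
  xs:
  rest: BC
}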

The error is in how `&max-size` sets the next position:

pstate.ncur = builder()->addTmp("ncur", builder::memberCall(state().cur, "advance", {*length}));

Contrary to what we currently do, `&max-size` is fundamentally different from `&size` with respect to `ncur`: for a `&max-size` field, `ncur` depends on what was actually parsed and cannot be known or set before the field has been parsed.
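
To illustrate the difference, here is a minimal sketch in plain C++ (hypothetical types and function names, not the Spicy codegen or runtime API): with `&size` the resume position follows directly from the fixed length and is known up front, while with `&max-size` it has to be derived from how much of the limited view was actually consumed, which is only known after the field has been parsed.

#include <cstddef>

// Simplified stand-in for a stream view; not the actual Spicy runtime type.
struct View {
    const char* data;
    std::size_t size;

    View advance(std::size_t n) const { return {data + n, size - n}; }
};

// `&size=N`: the field occupies exactly N bytes, so the resume position can
// be computed before the field is parsed.
View nextAfterSize(View cur, std::size_t n) {
    return cur.advance(n);
}

// `&max-size=N`: N is only an upper bound. The resume position depends on how
// many bytes the field's own parsing consumed inside the limited view, which
// is only available afterwards (passed in here as `consumed`).
View nextAfterMaxSize(View cur, std::size_t consumed) {
    return cur.advance(consumed);
}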

I came across this while working on #1652.

@bbannier bbannier added the Bug Something isn't working label Feb 5, 2024
@bbannier bbannier self-assigned this Feb 5, 2024
bbannier added a commit that referenced this issue Feb 5, 2024
We would previously handle `&size` and `&max-size` almost identically,
with the only difference that `&max-size` sets up a slightly larger view
to accommodate a sentinel. In particular, we also used identical code to
set up the position where parsing should resume after such a field.

This was incorrect, as it is in general impossible to tell where parsing
continues after a field with `&max-size`, since it does not signify a
fixed view like `&size` does. With this patch we now compute the next
position for a `&max-size` field by inspecting the limited view to detect
how much data was extracted.

Closes #1668.
bbannier added a commit that referenced this issue Feb 5, 2024
bbannier added a commit that referenced this issue Feb 7, 2024