
fix: Prevent infinite loop #90

Closed
wants to merge 2 commits into from

Conversation

@kdy1 (Contributor) commented Jun 14, 2024

Context: swc-project/swc#9050

  • As proof of the fix, I made the SWC PR a draft and pointed it at kdy1/rust-sourcemap.git#skip-range

I used adjust_mappings, but found that the sourcemap crate panics with an integer overflow. The cause was

            while token.get_dst_line() != prev_dst_line {
                rv.push(';');
                prev_dst_line += 1;
            }

so I modified it to

            while token.get_dst_line() > prev_dst_line {
                rv.push(';');
                prev_dst_line += 1;
            }

and it worked.
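For illustration, here is a minimal standalone sketch (hypothetical helper and values, not the crate's actual types) of why the original `!=` comparison misbehaves when a token's destination line is lower than the previous one, and why `>` avoids it:

```rust
// Counts how many ';' line separators would be emitted when moving from
// prev_dst_line to token_dst_line, using the fixed `>` comparison.
// (Hypothetical helper for illustration; not part of the sourcemap crate.)
fn count_line_separators(token_dst_line: u32, mut prev_dst_line: u32) -> usize {
    let mut rv = String::new();
    while token_dst_line > prev_dst_line {
        rv.push(';');
        prev_dst_line += 1;
    }
    rv.len()
}

fn main() {
    // In-order token: two separators advance us from line 3 to line 5.
    assert_eq!(count_line_separators(5, 3), 2);
    // Out-of-order token (destination line went backwards): with `!=` the
    // loop would increment until the u32 wrapped around (panicking on the
    // overflow in debug builds); with `>` it simply emits nothing.
    assert_eq!(count_line_separators(3, 5), 0);
    println!("ok");
}
```

Note that the `>` version silently emits zero separators for the out-of-order token, which hints at why the fix hides, rather than solves, the underlying ordering problem.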

@kdy1 kdy1 marked this pull request as ready for review June 14, 2024 13:41
@loewenheim (Contributor) commented:

Hi, can you post an example sourcemap for which this change makes a difference? Intuitively it shouldn't, now I'm wondering if there's something wrong with our logic.

@Swatinem (Member) left a comment:

The change itself looks good to me, though I believe it only papers over a bug that exists elsewhere.

I remember a similar case where serialization would iterate forever (or rather, until the integer wrapped around and caught up) when tokens were added out of order. The underlying problem is that the serialization step assumes the API user adds tokens in order, and out-of-order tokens produce garbled serialization.

The better solution would be to ensure tokens are properly sorted. To remove the API footgun of assuming the user does the right thing, one option is to sort the token list prior to serialization.
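The sort-before-serialization idea can be sketched as follows, using a hypothetical minimal `RawToken` struct (not the crate's real type): order tokens by `(dst_line, dst_col)` so the `;`-emission loop never sees a destination line go backwards.

```rust
// Hypothetical minimal token type for illustration only.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
struct RawToken {
    dst_line: u32,
    dst_col: u32,
}

// Sorts tokens into serialization order so that dst_line (and, within a
// line, dst_col) never decreases between consecutive tokens.
fn sort_for_serialization(tokens: &mut Vec<RawToken>) {
    tokens.sort_by_key(|t| (t.dst_line, t.dst_col));
}

fn main() {
    let mut tokens = vec![
        RawToken { dst_line: 2, dst_col: 0 },
        RawToken { dst_line: 0, dst_col: 4 },
        RawToken { dst_line: 0, dst_col: 1 },
    ];
    sort_for_serialization(&mut tokens);
    // After sorting, destination positions are monotonically non-decreasing,
    // so the separator-emission loop can never overflow.
    assert!(tokens
        .windows(2)
        .all(|w| (w[0].dst_line, w[0].dst_col) <= (w[1].dst_line, w[1].dst_col)));
    println!("ok");
}
```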

@loewenheim (Contributor) commented:

@Swatinem Yeah, I assumed it was something like this. Out-of-order tokens have never made sense and just shouldn't exist.

@kdy1 (Contributor, Author) commented Jun 14, 2024

I did orig_js_map.adjust_mappings(transform_js_map). Both maps are valid on their own (I could print them), but the resulting sourcemap panics on .to_writer.

Source map files are uploaded at https://gist.github.com/kdy1/000301fc6091b3e8d5aec7cef8e89b80

Exact code:

        let mut map = builder.into_sourcemap();

        let mut map_s = vec![];
        map.to_writer(&mut map_s).unwrap();
        std::fs::write("transform.js.map", map_s).unwrap();

        if let Some(orig) = orig {
            let mut map_s = vec![];
            orig.to_writer(&mut map_s).unwrap();
            std::fs::write("orig.js.map", map_s).unwrap();

            let mut copied = orig.clone();
            copied.adjust_mappings(&map);
            map = copied;
        }

        map.to_writer(&mut vec![]).unwrap(); // Panics with an integer overflow

        map

@loewenheim (Contributor) commented:

@kdy1 Right. I can confirm that #91 would fix your issue. As @Swatinem says, your PR would fix the infinite loop, but likely not solve the problem—the sourcemap you get out is probably going to be nonsense.

@kdy1 (Contributor, Author) commented Jun 15, 2024

Thank you! I agree that my fix is wrong. I tried reproducing the issue with a single source map, but it seems my fix simply removes the erroneous tokens from the output.

@kdy1 kdy1 closed this Jun 15, 2024
@kdy1 kdy1 deleted the skip-range branch June 15, 2024 01:24