pre-compute size of serialized data #46

Closed
x37v opened this issue Oct 9, 2021 · 6 comments · Fixed by #86

Comments

@x37v

x37v commented Oct 9, 2021

I'm using the slice flavor, and I'm wondering if it is possible to compute the size of the serialized data without actually serializing?
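For reference, a minimal workaround with the slice flavor (which still performs a real serialization) is to serialize into a scratch buffer and take the length of the returned sub-slice. The Reading struct here is a made-up placeholder:

use postcard::to_slice;
use serde::Serialize;

#[derive(Serialize)]
struct Reading {
    timestamp: u32,
    values: [f32; 4],
}

fn main() -> Result<(), postcard::Error> {
    let reading = Reading { timestamp: 1234, values: [0.1, 0.2, 0.3, 0.4] };

    // Serialize into a scratch buffer; the returned sub-slice covers exactly
    // the bytes that were written, so its length is the serialized size.
    let mut scratch = [0u8; 64];
    let used = to_slice(&reading, &mut scratch)?;
    println!("serialized size: {} bytes", used.len());
    Ok(())
}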

@gauteh

gauteh commented May 29, 2022

Hi,

I am trying out postcard as a format for storing data on an SD card. There will be a large number of data packages, so they will exhaust the number of files allowed on FAT32. embedded-sdmmc does not (yet) support creating directories, so I can't solve this by putting the files in directories. I am now considering putting many packages in each file. I will use COBS between the packages, but I need to know the serialized size of the packages. The data package has a fixed size (no variable-length fields): https://github.com/gauteh/sfy/blob/main/sfy-buoy/src/axl.rs#L10

  • Can I compute the size of a serialized package?
  • If I determine it experimentally, is it stable? Will it be the exact same size regardless of the struct contents?

Regards, Gaute
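One way to answer the "is it stable?" question experimentally is to serialize a couple of representative instances and compare the lengths; whether they match depends on whether the wire format encodes the integer fields with a fixed width or as varints, which can differ between postcard versions. A hypothetical stand-in for the fixed-layout package:

use postcard::to_slice;
use serde::Serialize;

// Hypothetical stand-in for the fixed-layout data package in axl.rs.
#[derive(Serialize)]
struct AxlPacket {
    timestamp: u32,
    lon: f64,
    lat: f64,
    data: [f32; 8], // fixed-length array: no slices or Strings
}

fn serialized_len(p: &AxlPacket) -> usize {
    let mut buf = [0u8; 256];
    to_slice(p, &mut buf).map(|s| s.len()).unwrap_or(0)
}

fn main() {
    let a = AxlPacket { timestamp: 0, lon: 0.0, lat: 0.0, data: [0.0; 8] };
    let b = AxlPacket { timestamp: u32::MAX, lon: 123.4, lat: -56.7, data: [1.0; 8] };

    // If the integer fields are varint-encoded, the two lengths can differ;
    // if every field has a fixed-width encoding, they will match.
    println!("len(a) = {}, len(b) = {}", serialized_len(&a), serialized_len(&b));
}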

@ryankurte

Heya, I have a very similar problem, and I'm not sure the macros cover the ability to pre-compute the size when serializing (or I may have missed how this can be achieved).

I'm looking for something like postcard::encoded_len<T: Serialize>(t: &T) -> usize, equivalent to prost's Message::encoded_len(): basically a no-op encode that outputs the buffer size required.
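Nothing like that existed in postcard at the time of this comment, but a rough sketch of such a helper, assuming postcard 1.x's ser_flavors::Flavor trait and serialize_with_flavor (names and exact signatures may differ across versions), is a flavor that counts bytes instead of storing them:

use postcard::ser_flavors::Flavor;
use serde::Serialize;

// Hypothetical flavor that counts output bytes instead of storing them.
struct SizeCounter {
    count: usize,
}

impl Flavor for SizeCounter {
    type Output = usize;

    fn try_push(&mut self, _data: u8) -> postcard::Result<()> {
        self.count += 1;
        Ok(())
    }

    fn try_extend(&mut self, data: &[u8]) -> postcard::Result<()> {
        self.count += data.len();
        Ok(())
    }

    fn finalize(self) -> postcard::Result<usize> {
        Ok(self.count)
    }
}

// Hypothetical encoded_len-style helper built on the counting flavor.
fn encoded_len<T: Serialize + ?Sized>(value: &T) -> postcard::Result<usize> {
    postcard::serialize_with_flavor(value, SizeCounter { count: 0 })
}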

@dignifiedquire
Contributor

@jamesmunns
Any updates here? I am running into this when preallocating buffers for serialization. The reason MAX_SIZE doesn't work is that I have &[u8] and String in my data. Maybe there could be a way to calculate the size based on MAX_SIZE plus .len() dynamically?
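A hand-rolled version of that idea, assuming the experimental MaxSize derive (behind postcard's experimental-derive feature) and its POSTCARD_MAX_SIZE constant, could bound the fixed part statically and add the dynamic lengths plus a small allowance for the length prefixes; the types and field names below are made up:

use postcard::experimental::max_size::MaxSize;
use serde::Serialize;

// Fixed-size part: a static bound via MaxSize works fine here.
#[derive(Serialize, MaxSize)]
struct Header {
    id: u32,
    flags: u8,
}

#[derive(Serialize)]
struct Message<'a> {
    header: Header,
    payload: &'a [u8], // dynamic: not coverable by a static bound
    name: String,      // dynamic as well
}

// Rough upper bound for preallocating a buffer: static bound for the fixed
// part, plus the dynamic lengths, plus a generous per-field allowance for
// the length prefix written in front of slices and strings.
fn buffer_upper_bound(msg: &Message<'_>) -> usize {
    const LEN_PREFIX_ALLOWANCE: usize = 10;
    Header::POSTCARD_MAX_SIZE
        + msg.payload.len() + LEN_PREFIX_ALLOWANCE
        + msg.name.len() + LEN_PREFIX_ALLOWANCE
}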

@jamesmunns
Owner

Reopening this, as there are actually a few different things going on here:

  1. Statically computing the MAX size for a given type (for Ser/Deser)
    • This is possible already with the MAX_SIZE derive, BUT
    • This DOESN'T work for !Sized items, like slices or heap allocations. More below.
  2. Dynamically computing the SPECIFIC serialized size of an object (only for Ser).

I think you are asking for item 2 above, @dignifiedquire. Do you want to see if that branch works for your needs, and if so, could you help get a clean PR opened?

@jamesmunns
Owner

jamesmunns commented Jan 25, 2023

Oops, forgot my "more below" part:

We COULD extend the MAX derive to take attributes on certain items, like:

#[derive(Serialize, Deserialize, MaxSize)]
struct Demo<'a> {
    arr: [Foo; 5],
    #[postcard(max_hint = 128)]
    sli: &'a [Bar],
    #[postcard(max_hint = 64)]
    hst: String
}

This covers the !Sized case I described above as 1b.

@dignifiedquire
Contributor

@jamesmunns just created #86, which should fix things up.
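For anyone landing on this issue later: assuming the helper that landed is exposed as postcard::experimental::serialized_size in your postcard version, usage looks roughly like this:

use postcard::experimental::serialized_size;
use serde::Serialize;

#[derive(Serialize)]
struct Demo<'a> {
    id: u32,
    data: &'a [u8],
}

fn main() -> Result<(), postcard::Error> {
    let demo = Demo { id: 7, data: &[1, 2, 3, 4] };

    // Computes how many bytes to_slice/to_vec would produce, without
    // needing a destination buffer up front.
    let size = serialized_size(&demo)?;
    println!("would serialize to {size} bytes");
    Ok(())
}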
