When reading DEP-0007 I noticed that the `dataStructureType` can be any string, with a varint size prefix that can be arbitrarily large. However, most file signatures are fixed in size, which makes me wonder whether it would be a good idea to amend the spec with a limit on how big that size can be, to avoid downloading a lot of data before checking whether the data even fits the structure.
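To illustrate the idea, here is a minimal sketch of checking a varint (LEB128) size prefix against a cap before fetching the message body. `MAX_MESSAGE_SIZE` is a hypothetical limit, not one DEP-0007 specifies:

```javascript
// Hypothetical cap on the declared message size (1 MiB, illustrative only).
const MAX_MESSAGE_SIZE = 1024 * 1024

// Decode an unsigned LEB128 varint from the start of a buffer.
function decodeVarint (buf) {
  let value = 0
  let shift = 0
  for (let i = 0; i < buf.length; i++) {
    const byte = buf[i]
    value += (byte & 0x7f) * 2 ** shift // multiply instead of << to avoid 32-bit overflow
    shift += 7
    if ((byte & 0x80) === 0) return { value, bytesRead: i + 1 }
  }
  throw new Error('truncated varint')
}

// Reject a message whose declared size exceeds the cap,
// before any of the body is downloaded.
function checkSizePrefix (buf) {
  const { value: size } = decodeVarint(buf)
  if (size > MAX_MESSAGE_SIZE) {
    throw new Error(`declared size ${size} exceeds limit ${MAX_MESSAGE_SIZE}`)
  }
  return size
}
```

The point is that the prefix alone tells the reader how much data is coming, so an oversized value can be rejected up front.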
As a weakly held, somewhat subjective opinion, I think we should err on the side of keeping it simple by not specifying maximum lengths for every string and field at this level of the format/protocol (i.e., fields inside the protobuf). We could add a statement like "use good judgement".
A couple unstructured thoughts:
- when reading from disk, there are already fixed-size headers with metadata about the format (just not the hypercore content type)
- if we wanted a limit, it would probably make more sense to apply it to the entire protobuf message rather than to individual fields inside it. AFAIK there's an implied size limit of 2GB on protobuf messages in many implementations, and in hypercore we recommend message sizes under a megabyte (or 10? reference needed)
- I might feel differently if protobuf schemas themselves allowed setting size limits on fields (which IIRC they do not). Since they don't, every implementation would need to add its own verification/validation, which is extra work and complexity and needs testing.
- to me at least, the closer analog to `dataStructureType` is MIME types, especially as embedded in the HTTP `Content-Type` header, which as far as I know has no maximum length.
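To make the "extra validation in every implementation" point concrete: since protobuf schemas cannot cap string lengths, any limit on `dataStructureType` would have to be enforced in application code after decoding, along these lines. `MAX_TYPE_LENGTH` and the `header` shape are assumptions for illustration, not part of DEP-0007:

```javascript
// Hypothetical length cap, illustrative only.
const MAX_TYPE_LENGTH = 256

// Validate a decoded header object of the assumed shape
// { dataStructureType: string, ... }.
function validateHeader (header) {
  const t = header.dataStructureType
  if (typeof t !== 'string' || t.length === 0) {
    throw new Error('dataStructureType must be a non-empty string')
  }
  // Measure in UTF-8 bytes, since that is what goes on the wire.
  if (Buffer.byteLength(t, 'utf8') > MAX_TYPE_LENGTH) {
    throw new Error('dataStructureType exceeds the assumed length limit')
  }
  return header
}
```

Each implementation would need this check (and tests for it), which is the complexity cost being weighed above.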