Improper interpretation for big token amounts #1900
Comments
@kushti I'm not quite sure what I'm supposed to do here. Do we want a lower max limit for token amounts so that they don't cause problems in JavaScript?
@pragmaxim No, we cannot lower the limits. If fixing the JSON decoders is not possible, maybe let the decoder accept either a number or a string, and change the openapi.yaml description to suggest strings by default. The article linked in the issue description suggests fixing the codecs.
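A minimal sketch of what such a lenient decoder could look like, assuming Circe is used for the assets decoder as discussed below; the decoder name and error message are illustrative, not the node's actual code:

```scala
import io.circe.Decoder

// Illustrative only: a Circe decoder for token amounts that accepts either a
// JSON number or a JSON string, so clients that must send strings to avoid
// JavaScript precision loss can still be decoded.
val tokenAmountDecoder: Decoder[Long] =
  Decoder.decodeLong.or(
    Decoder.decodeString.emap { s =>
      try Right(s.toLong)
      catch { case _: NumberFormatException => Left(s"invalid token amount: $s") }
    }
  )
```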
@kushti Where does this happen? When sending a tx? We use Circe for ser/deser and it does not have this problem; I just tried having tokens with …
@pragmaxim That was input for /wallet/transactions/sign, but I guess other methods using the assets decoder have the same issue. As a test, have you tried sending Long.MaxValue? Was exactly the same value sent?
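For reference, a small sketch (not from the thread) showing that Circe itself round-trips Long.MaxValue without loss, which suggests the corruption happens in JavaScript clients rather than in the node's ser/deser:

```scala
import io.circe.syntax._
import io.circe.parser.decode

// Circe does not force numbers through Double, so a Long survives a JSON
// round trip intact on the Scala side.
val amount: Long = Long.MaxValue                 // 9223372036854775807
val json: String = amount.asJson.noSpaces        // "9223372036854775807"
val decoded = decode[Long](json)
assert(decoded == Right(Long.MaxValue))
```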
@kushti I changed the Token generators in …
Do you send the tx as a string to be parsed as JSON, or as ready-made JSON? Maybe you can share the test?
@kushti Yeah, see d64a246#diff-693aa4b3e942c2e0a731d61a70b822ef9477d0e0229d413cdedfd6ae88ce6b70R107. I modified Generators to return …
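A hypothetical sketch of that kind of generator change (the exact bounds used in the linked diff are not reproduced here), assuming ScalaCheck generators as in the node's property tests:

```scala
import org.scalacheck.Gen

// Illustrative only: bias token-amount generation toward values near
// Long.MaxValue, far above JavaScript's 2^53 - 1 safe-integer range, so
// round-trip tests exercise amounts that a JS JSON parser would corrupt.
val largeTokenAmountGen: Gen[Long] =
  Gen.choose(Long.MaxValue - 1000L, Long.MaxValue)
```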
@pragmaxim Maybe a swagger issue then: swagger-api/swagger-ui#7478. An update to 3.0.3 should help, it seems.
@kushti Never done this, something like …
@abebeos Please go on, just remember that this is a swagger issue, not a Node issue.
I would like to work on this.
Original issue description:
Example: instead of 9223371997709984332 for asset 303f39026572bcb4060b51fafc93787a236bb243744babaa99fceb833d61e198 in the first output, the parser returns another value. This looks like a JSON big-integer issue; see e.g. https://jsoneditoronline.org/indepth/parse/why-does-json-parse-corrupt-large-numbers/
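For illustration (not part of the original report), the loss can be reproduced with a plain IEEE 754 double, which is what JavaScript's JSON.parse uses for all numbers:

```scala
// 9223371997709984332 is not exactly representable as a 64-bit double
// (the gap between adjacent doubles just below 2^63 is 1024), so converting
// through Double yields a different, nearby integer, just as JSON.parse does
// in a JavaScript client.
val original: Long = 9223371997709984332L
val roundTripped: Long = original.toDouble.toLong
println(original)                   // 9223371997709984332
println(roundTripped)               // a nearby value, not the original
println(original == roundTripped)   // false
```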