Proto representation of types.Dec unintuitive and inconsistent #10863
Oh no 😱. So I think this should basically be classified as a bug. It appears that the proto JSON is supposed to actually use a decimal place (i.e. …), so I'm not really sure what to do. Fixing this on master would be state breaking and require migrations, and also break clients. At the same time, the current encoding is just simply wrong... We are planning to switch decimal implementations over to something that is GDA-compliant, so an alternative would be to consider this as part of that switch.
The struct approach may be a good way to go, but I'm not really sure it's preferable to a string. An alternative could be an int64 mantissa with an int32 exponent, which should provide plenty of precision for most use cases.
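For illustration, a minimal sketch of that mantissa/exponent idea (type and field names are my own, not an actual SDK or proto definition):

```go
package main

import (
	"fmt"
	"math/big"
)

// Dec is a hypothetical mantissa/exponent decimal: value = Mantissa * 10^Exponent.
// Field names are illustrative only, not the SDK's actual proto definition.
type Dec struct {
	Mantissa int64
	Exponent int32
}

// Rat converts the decimal to an exact rational number for math or display.
func (d Dec) Rat() *big.Rat {
	exp := int64(d.Exponent)
	if exp < 0 {
		exp = -exp
	}
	pow := new(big.Int).Exp(big.NewInt(10), big.NewInt(exp), nil)
	r := new(big.Rat).SetInt64(d.Mantissa)
	if d.Exponent >= 0 {
		return r.Mul(r, new(big.Rat).SetInt(pow))
	}
	return r.Quo(r, new(big.Rat).SetInt(pow))
}

func main() {
	gasPrice := Dec{Mantissa: 5, Exponent: -3}  // 0.005
	fmt.Println(gasPrice.Rat().FloatString(18)) // 0.005000000000000000
}
```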
Thank you for your understanding and support on that issue, Aaron!
Same here
Sounds plausible
The string is more beginner-friendly for sure. Also, in many cases devs don't care about the difference between decimals and floats, and that is probably okay for a lot of use cases, such as displaying a rounded inflation rate. I was kind of leaning towards the struct to create a hard type break that hurts at compile time rather than silently changing string content from one encoding to another. But there are probably better migration strategies.
Going from fixed point (18 decimals) to floating point would heavily increase the complexity of the type, especially if you want to do math between values with different exponents. Not sure if that is needed. The feedback we got in CosmWasm is that everybody is happy with a fixed 18 decimal places, but uint128 is not sufficient for some Ethereum-related use cases. A uint256 integer with a fixed decimal point at 18 seems to make everyone happy.
In https://github.com/cosmos/cosmjs/pull/969/files you can see …
For the new SDK decimal type, my thinking was to allow users to choose what precision they do calculations with and store the result as an arbitrary-precision string. In what use cases do people actually need a decimal value that can't fit into a uint128? That would be a very big number, and I'm not sure what real-world number would be that large and also require 18 decimal digits. One hacky solution could be to parse any decimal strings that contain a decimal point as the new format.
Okay, so I guess the important question is whether you ever want to support math with mixed exponents, e.g. if you do …
I understand very little about DeFi stuff. See e.g. Decimal256 in TerraSwap. Other sane community members like @marbar3778 confirmed Decimal256 is something that is needed.
You are talking about client code consuming those things? Yeah, it would be nice if the new string representation was guaranteed to have a decimal point, such that an integer cannot be the old representation and a decimal with zero fractional part at the same time.
Hmm... I think we would just do the normal thing that GDA specifies: work with different exponents and specify a rounding precision and flags. We should require that inputs adhere to a precision limit (we do this in the regen-ledger credits module using a GDA implementation), but intermediate calculations may mix exponents depending on how the decimal library represents things.
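To make the mixed-exponent point concrete, here is a minimal sketch (my own illustration, not the regen-ledger or SDK implementation) of adding two decimals with different exponents by aligning them before doing integer arithmetic:

```go
package main

import (
	"fmt"
	"math/big"
)

// dec is an illustrative arbitrary-precision decimal: value = coeff * 10^exp.
type dec struct {
	coeff *big.Int
	exp   int32
}

// add aligns both operands to the smaller exponent and adds the coefficients,
// which is roughly how GDA-style decimal arithmetic handles mixed exponents.
func add(a, b dec) dec {
	for a.exp > b.exp {
		a.coeff = new(big.Int).Mul(a.coeff, big.NewInt(10))
		a.exp--
	}
	for b.exp > a.exp {
		b.coeff = new(big.Int).Mul(b.coeff, big.NewInt(10))
		b.exp--
	}
	return dec{coeff: new(big.Int).Add(a.coeff, b.coeff), exp: a.exp}
}

func main() {
	x := dec{coeff: big.NewInt(5), exp: -3}  // 0.005
	y := dec{coeff: big.NewInt(12), exp: -1} // 1.2
	z := add(x, y)                           // 1205 * 10^-3 == 1.205
	fmt.Printf("%s * 10^%d\n", z.coeff.String(), z.exp)
}
```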
This is super important for clients that support backwards compatibility: being able to maintain the difference between the old type and the new type.
@tac0turtle @aaronc Is it realistic for 0.48 to do an SDK-wide state-machine-breaking migration to update all decimals to a string representation with a decimal point?
We're facing the same issue on the Cosmos Hub (affected versions: …). In our case the behaviour could lead to overspends and similar types of issues. Here is our scenario: the Hub now uses the feemarket module, which specifies gas prices using a decimal type.

```
# RPC
gaiad q feemarket gas-price uatom --node "<RPC>" -o json | jq .
{
  "price": {
    "denom": "uatom",
    "amount": "0.005000000000000000"
  }
}

# gRPC
grpcurl -plaintext cosmos-grpc.polkachu.com:14990 feemarket.feemarket.v1.Query.GasPrices
{
  "prices": [
    {
      "denom": "uatom",
      "amount": "5000000000000000"
    }
  ]
}
```

This could lead to unforeseen issues when estimating gas, as developers are not aware that they should convert the raw integer amount. The same issue exists in all places where `types.Dec` values are returned, for example the community pool. In this case, the RPC returns:

```
gaiad q distribution community-pool --node <RPC> -o json | jq .
{
  "pool": [
    {
      "denom": "uatom",
      "amount": "5632244209811.957774374296404803"
    }
  ]
}
```

The suggestion to divide by 10^18 is not obvious to API consumers. Is there a solution to this?
ref: #20803
What you describe, @MSalopek, is expected given the issue in this ticket. The protobuf storage and API store those values as integers, which is caused by the underlying decimal implementation. Changing this would not only break APIs but also all state where the data type is used. The only possible solution I am aware of is to create a different decimal type with a strictly specified string representation and gradually migrate from the old one to the new one. The Amino JSON encoding of this is a separate issue (#18546 and friends). In Amino JSON the dot used to be added, but then the JSON representation changed somehow.
What I did in CosmJS is string mutation, since you know this is 18-digit fixed point and the values can easily exceed the 64-bit range.
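A rough Go analogue of that string-mutation approach (a sketch only, not the actual CosmJS code): because the value is known to be an integer scaled by 10^18, the decimal point can be inserted by pure string manipulation, without any big-number arithmetic:

```go
package main

import (
	"fmt"
	"strings"
)

// insertDecimalPoint turns an 18-decimal fixed-point integer string such as
// "5000000000000000" into "0.005000000000000000" using only string mutation.
func insertDecimalPoint(s string) string {
	neg := ""
	if strings.HasPrefix(s, "-") {
		neg, s = "-", s[1:]
	}
	// Left-pad with zeros so at least one digit remains before the point.
	for len(s) <= 18 {
		s = "0" + s
	}
	cut := len(s) - 18
	return neg + s[:cut] + "." + s[cut:]
}

func main() {
	fmt.Println(insertDecimalPoint("5000000000000000"))                // 0.005000000000000000
	fmt.Println(insertDecimalPoint("5632244209811957774374296404803")) // 5632244209811.957774374296404803
}
```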
I did some digging; this difference between the JSON wire format and the proto format has been in place since v0.45:

```
❯ curl -s https://cosmos-api.polkachu.com/cosmos/mint/v1beta1/params | jq
{
  "params": {
    "mint_denom": "uatom",
    "inflation_rate_change": "1.000000000000000000",
    "inflation_max": "0.100000000000000000",
    "inflation_min": "0.070000000000000000",
    "goal_bonded": "0.670000000000000000",
    "blocks_per_year": "4360000"
  }
}

❯ grpcurl -plaintext cosmos-grpc.polkachu.com:14990 cosmos.mint.v1beta1.Query.Params
{
  "params": {
    "mintDenom": "uatom",
    "inflationRateChange": "1000000000000000000",
    "inflationMax": "100000000000000000",
    "inflationMin": "70000000000000000",
    "goalBonded": "670000000000000000",
    "blocksPerYear": "4360000"
  }
}
```

The proto format (on disk and on the network) omits the radix point, while the JSON wire format includes it. Clearly this is super confusing. The only reasonable fix I can imagine from the SDK side would be to offer a feature flag which (if set) marshals LegacyDec to bytes containing a radix point, and unmarshals both formats identically. Functionally it would be a lazy migration of decimals in state, and it is state breaking / requires a coordinated chain upgrade, so it would be off by default. It's a lot of machinery for a decimal point. On the other hand, Amino JSON rendering could also just be dropped from the JSON RPC, but this would be pretty breaking in its own right, so I'm inclined to leave it as is. We're close to reaching consensus on a wire format for a GDA-based decimal type which we'll recommend new proto specs use.
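For illustration, a minimal sketch of what "unmarshal both formats identically" could look like (hypothetical helper, not an actual SDK feature flag or API): detect whether the serialized string contains a radix point and scale accordingly:

```go
package main

import (
	"fmt"
	"math/big"
	"strings"
)

// parseDec accepts both encodings: the legacy integer form scaled by 10^18
// ("5000000000000000") and a plain decimal string ("0.005"), and returns an
// exact rational value either way.
func parseDec(s string) (*big.Rat, error) {
	if strings.Contains(s, ".") {
		// New-style representation with an explicit radix point.
		r, ok := new(big.Rat).SetString(s)
		if !ok {
			return nil, fmt.Errorf("invalid decimal %q", s)
		}
		return r, nil
	}
	// Legacy representation: integer scaled by 10^18.
	n, ok := new(big.Int).SetString(s, 10)
	if !ok {
		return nil, fmt.Errorf("invalid legacy decimal %q", s)
	}
	scale := new(big.Int).Exp(big.NewInt(10), big.NewInt(18), nil)
	return new(big.Rat).SetFrac(n, scale), nil
}

func main() {
	a, _ := parseDec("5000000000000000")
	b, _ := parseDec("0.005")
	fmt.Println(a.Cmp(b) == 0, a.FloatString(18)) // true 0.005000000000000000
}
```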
Right now, a decimal of type `types.Dec` is encoded in the protobuf API as `string(int(decimal * 10**18))` or `ascii(string(int(decimal * 10**18)))`. While this is straightforward and efficient once you know how decimals work, it's very unintuitive if you look at the output the API gives you, such as … or …

In both cases the example values as well as the non-Go proto definitions do not provide hints that those are encoded decimals. Then those decimals are sometimes encoded as strings, sometimes as bytes, e.g. … and …

This hits all users that interact with the proto interface directly and do not use a higher-level wrapper. Given the amount of deeply nested fields that would need wrapping, the need for custom module code generation and the desire to use reflection, this is a severe burden on DevX that will affect a lot of newcomers.

I think there needs to be a better strategy for how to encode `types.Dec` at the protobuf API level. One option is using a string representation with decimal points. The other option that avoids string parsing is a struct, similar to how Timestamp is defined.
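For illustration, a sketch of what such a Timestamp-like struct could map to in Go (hypothetical field names and 10^-18 scale, not an actual SDK definition):

```go
package main

import "fmt"

// Dec is a hypothetical struct-based decimal, analogous to how
// google.protobuf.Timestamp splits a value into seconds and nanos.
// Field names and the 10^-18 scale are illustrative assumptions only;
// sign handling is omitted for brevity.
type Dec struct {
	Units int64  // whole part
	Attos uint64 // fractional part in units of 10^-18 (0 <= Attos < 10^18)
}

// String renders the value with an explicit radix point, so clients never
// have to guess whether a bare integer is a scaled decimal.
func (d Dec) String() string {
	return fmt.Sprintf("%d.%018d", d.Units, d.Attos)
}

func main() {
	fmt.Println(Dec{Units: 0, Attos: 5_000_000_000_000_000}) // 0.005000000000000000
}
```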