[SPARK-8359] [SQL] Fix incorrect decimal precision after multiplication
JIRA: https://issues.apache.org/jira/browse/SPARK-8359

Author: Liang-Chi Hsieh <[email protected]>

Closes #6814 from viirya/fix_decimal2 and squashes the following commits:

071a757 [Liang-Chi Hsieh] Remove maximum precision and use MathContext.UNLIMITED.
df217d4 [Liang-Chi Hsieh] Merge remote-tracking branch 'upstream/master' into fix_decimal2
a43bfc3 [Liang-Chi Hsieh] Add MathContext with maximum supported precision.
72eeb3f [Liang-Chi Hsieh] Merge remote-tracking branch 'upstream/master' into fix_decimal2
44c9348 [Liang-Chi Hsieh] Fix incorrect decimal precision after multiplication.
viirya authored and Davies Liu committed Jun 23, 2015
1 parent d4f6335 commit 31bd306
Showing 2 changed files with 9 additions and 2 deletions.
Decimal.scala
@@ -17,6 +17,8 @@
 
 package org.apache.spark.sql.types
 
+import java.math.{MathContext, RoundingMode}
+
 import org.apache.spark.annotation.DeveloperApi
 
 /**
@@ -137,9 +139,9 @@ final class Decimal extends Ordered[Decimal] with Serializable {
 
   def toBigDecimal: BigDecimal = {
     if (decimalVal.ne(null)) {
-      decimalVal
+      decimalVal(MathContext.UNLIMITED)
     } else {
-      BigDecimal(longVal, _scale)
+      BigDecimal(longVal, _scale)(MathContext.UNLIMITED)
     }
   }
DecimalSuite.scala
@@ -162,4 +162,9 @@ class DecimalSuite extends SparkFunSuite with PrivateMethodTester {
     assert(new Decimal().set(100L, 10, 0).toUnscaledLong === 100L)
     assert(Decimal(Long.MaxValue, 100, 0).toUnscaledLong === Long.MaxValue)
   }
+
+  test("accurate precision after multiplication") {
+    val decimal = (Decimal(Long.MaxValue, 38, 0) * Decimal(Long.MaxValue, 38, 0)).toJavaBigDecimal
+    assert(decimal.unscaledValue.toString === "85070591730234615847396907784232501249")
+  }
 }
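The expected value in the new test can be checked directly against `java.math.BigDecimal`. A minimal standalone Java sketch (not Spark code) of why the explicit `MathContext` matters: `scala.math.BigDecimal` defaults to `MathContext.DECIMAL128` (34 significant digits), which silently rounds the 38-digit product of two `Long.MaxValue`-sized operands, while `MathContext.UNLIMITED` keeps the multiplication exact.

```java
import java.math.BigDecimal;
import java.math.MathContext;

public class DecimalPrecision {
    public static void main(String[] args) {
        // Long.MAX_VALUE has 19 digits; its square has 38 digits.
        BigDecimal v = BigDecimal.valueOf(Long.MAX_VALUE);

        // DECIMAL128 keeps only 34 significant digits, so the low-order
        // digits of the 38-digit product are rounded away.
        BigDecimal rounded = v.multiply(v, MathContext.DECIMAL128);

        // UNLIMITED performs the multiplication exactly.
        BigDecimal exact = v.multiply(v, MathContext.UNLIMITED);

        System.out.println(rounded.toPlainString()); // trailing digits rounded
        System.out.println(exact.toPlainString());
        // exact: 85070591730234615847396907784232501249
    }
}
```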

5 comments on commit 31bd306

@JihongMA
Contributor

This fix causes an issue with division over the Decimal.Unlimited type when precision and scale are not defined, as shown below. Please revisit this fix.

java.lang.ArithmeticException: Non-terminating decimal expansion; no exact representable decimal result.
at java.math.BigDecimal.divide(BigDecimal.java:1616)
at java.math.BigDecimal.divide(BigDecimal.java:1650)
at scala.math.BigDecimal.$div(BigDecimal.scala:256)
at org.apache.spark.sql.types.Decimal.$div(Decimal.scala:269)
at org.apache.spark.sql.types.Decimal$DecimalIsFractional$.div(Decimal.scala:333)
at org.apache.spark.sql.types.Decimal$DecimalIsFractional$.div(Decimal.scala:332)
at org.apache.spark.sql.catalyst.expressions.Divide$$anonfun$div$1.apply(arithmetic.scala:193)
at org.apache.spark.sql.catalyst.expressions.Divide.eval(arithmetic.scala:206)
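The exception above is inherent to `MathContext.UNLIMITED`: `java.math.BigDecimal.divide` throws whenever the exact quotient has a non-terminating decimal expansion, since unlimited precision means no rounding is permitted. A minimal standalone Java sketch (plain `BigDecimal`, not Spark's `Decimal`) reproducing the failure mode:

```java
import java.math.BigDecimal;
import java.math.MathContext;

public class NonTerminatingDivide {
    public static void main(String[] args) {
        BigDecimal one = BigDecimal.ONE;
        BigDecimal three = BigDecimal.valueOf(3);

        try {
            // 1/3 = 0.333... has no finite decimal representation, so an
            // exact (unlimited-precision) divide cannot produce a result.
            one.divide(three, MathContext.UNLIMITED);
        } catch (ArithmeticException e) {
            System.out.println(e.getMessage());
        }

        // A bounded context rounds instead of throwing.
        System.out.println(one.divide(three, MathContext.DECIMAL128));
    }
}
```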

@sujkh85
@sujkh85 sujkh85 commented on 31bd306 Jun 27, 2015 via email

@davies
Contributor

@davies davies commented on 31bd306 Jun 27, 2015

@JihongMA How to reproduce this?

@JihongMA
Contributor

Here is a simple repro script:

case class DecimalData(a: BigDecimal, b: BigDecimal)
val decimalData = sc.parallelize(DecimalData(1, 1) :: DecimalData(1, 2) :: DecimalData(9, 1) :: Nil).toDF()
decimalData.agg(avg('a)).collect().foreach(println)

@davies
Contributor

@davies davies commented on 31bd306 Jun 29, 2015

This should be fixed by 24fda73; could you help confirm it?
