
[SPARK-3359][DOCS] Make javadoc8 working for unidoc/genjavadoc compatibility in Java API documentation #16013

Closed
wants to merge 17 commits

Conversation

HyukjinKwon (Member) commented Nov 25, 2016

What changes were proposed in this pull request?

This PR makes `sbt unidoc` complete with Java 8.

This PR roughly includes several fixes as below:

  • Fix unrecognisable class and method links in javadoc by changing them from `[[..]]` to `` `...` ``

    - * A column that will be computed based on the data in a [[DataFrame]].
    + * A column that will be computed based on the data in a `DataFrame`.
  • Fix throws annotations so that they are recognisable in javadoc

  • Fix URL links to <a href="http..."></a>.

    - * [[http://en.wikipedia.org/wiki/Decision_tree_learning Decision tree]] model for regression.
    + * <a href="http://en.wikipedia.org/wiki/Decision_tree_learning">
    + * Decision tree (Wikipedia)</a> model for regression.
    -   * @see http://en.wikipedia.org/wiki/Receiver_operating_characteristic
    +   * @see <a href="http://en.wikipedia.org/wiki/Receiver_operating_characteristic">
    +   * Receiver operating characteristic (Wikipedia)</a>
  • Fix `<` and `>`: rewrite them as `greater than`/`less than` prose where applicable, or wrap them in `{{{...}}}`, `{@code ...}`, or `{@literal ..}` so they print correctly in javadoc

  • Fix </p> complaint

How was this patch tested?

Manually tested by jekyll build with Java 7 and 8

java version "1.7.0_80"
Java(TM) SE Runtime Environment (build 1.7.0_80-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.80-b11, mixed mode)
java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)

HyukjinKwon (Member, Author) commented Nov 25, 2016

cc @srowen, this PR finally resolves the issue and makes `sbt unidoc` work fine with Java 8! Let me try to double-check and leave some comments tomorrow.

@HyukjinKwon HyukjinKwon changed the title [WIP][SPARK-3359][DOCS] Make javadoc8 working for unidoc/genjavadoc compatibility in API Java documentation [WIP][SPARK-3359][DOCS] Make javadoc8 working for unidoc/genjavadoc compatibility in Java API documentation Nov 25, 2016
@@ -262,7 +262,7 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging with Seria
/**
* Get a time parameter as seconds; throws a NoSuchElementException if it's not set. If no
* suffix is provided then seconds are assumed.
* @throws NoSuchElementException
* @note Throws `NoSuchElementException`
Member:

Oh, I think this may be resolved if java.util.NoSuchElementException is imported, or if the fully qualified name is used here. I favor the latter. That would be better; does it work?

SparkQA commented Nov 25, 2016

Test build #69169 has finished for PR 16013 at commit 3660317.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

SparkQA commented Nov 25, 2016

Test build #69171 has finished for PR 16013 at commit 22bfb68.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@@ -26,7 +26,7 @@ package org.apache.spark
*
* An accumulator is created from an initial value `v` by calling
* [[SparkContext#accumulator SparkContext.accumulator]].
* Tasks running on the cluster can then add to it using the [[Accumulable#+= +=]] operator.
* Tasks running on the cluster can then add to it using the `+=` operator.
Member Author:

I just decided to keep the original format rather than trying to make this pretty.

The original was as below:

  • Scala (screenshot)

  • Java (screenshot)

Member Author:

After this PR it still prints the same.

  • Scala (screenshot)

  • Java (screenshot)

@@ -262,8 +262,9 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging with Seria
/**
* Get a time parameter as seconds; throws a NoSuchElementException if it's not set. If no
* suffix is provided then seconds are assumed.
* @throws NoSuchElementException
* @throws java.util.NoSuchElementException
Member Author:

This is interesting. Using @throws NoSuchElementException complains as below:

[error]   location: class VectorIndexerModel
[error] .../java/org/apache/spark/SparkConf.java:226: error: reference not found
[error]    * @throws NoSuchElementException
[error]      ^
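A minimal sketch of the pattern (a hypothetical config map and method name standing in for `SparkConf.getTimeAsSeconds`): fully qualifying the exception in the scaladoc `@throws` tag gives genjavadoc a reference javadoc 8 can resolve:

```scala
// Hypothetical, simplified stand-in for SparkConf.getTimeAsSeconds;
// only the doc-comment style mirrors the actual fix in this PR.
object ConfExample {
  private val settings = Map("spark.network.timeout" -> "120s")

  /**
   * Get a time parameter; throws an exception if it's not set.
   *
   * @throws java.util.NoSuchElementException if the key is not set
   */
  @throws(classOf[NoSuchElementException])
  def getTimeAsSeconds(key: String): String =
    settings.getOrElse(key, throw new NoSuchElementException(key))
}
```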

@@ -2063,6 +2063,7 @@ class SparkContext(config: SparkConf) extends Logging {
* @param jobId the job ID to cancel
* @throws InterruptedException if the cancel message cannot be sent
HyukjinKwon (Member, Author), Nov 26, 2016:

This is interesting too. This does not throw an error in javadoc 8, unlike https://github.com/apache/spark/pull/16013/files#r89664921

Member Author:

It seems fine

  • Scala (screenshot)

  • Java (screenshot)

*/
@throws(classOf[NoSuchElementException])
Member Author:

Before

  • Scala (screenshot)
  • Java (screenshot)

After

  • Scala (screenshot)
  • Java (screenshot)

@@ -155,7 +155,7 @@ class DoubleRDDFunctions(self: RDD[Double]) extends Logging with Serializable {
* to the right except for the last which is closed
* e.g. for the array
* [1, 10, 20, 50] the buckets are [1, 10) [10, 20) [20, 50]
* e.g 1<=x<10 , 10<=x<20, 20<=x<=50
* e.g 1&lt;=x&lt;10 , 10&lt;=x&lt;20, 20&lt;=x&lt;=50
HyukjinKwon (Member, Author), Nov 26, 2016:

This originally gives an error as below

[error] .../java/org/apache/spark/rdd/DoubleRDDFunctions.java:73: error: malformed HTML
[error]    *  e.g 1<=x<10, 10<=x<20, 20<=x<=50
[error]            ^
[error] .../java/org/apache/spark/rdd/DoubleRDDFunctions.java:73: error: malformed HTML
[error]    *  e.g 1<=x<10, 10<=x<20, 20<=x<=50
[error]               ^
[error] .../java/org/apache/spark/rdd/DoubleRDDFunctions.java:73: error: malformed HTML
[error]    *  e.g 1<=x<10, 10<=x<20, 20<=x<=50
[error]                      ^
...

However, after fixing it as above, the escaped entities are printed as-is in javadoc (though not in scaladoc):

(screenshot)

It seems we should find another approach to deal with this. `&#60;` and `&#62;` also do not work; it seems `&` is always converted into `&amp;`.

HyukjinKwon (Member, Author), Nov 26, 2016:

In generated javadoc,

  • these throw errors

    <
    <code>.. < .. </code>
    <blockquote> ... < .. </blockquote>
    `... < ..`
    {{{... < ..}}}

    as below:

    [error] .../java/org/apache/spark/rdd/DoubleRDDFunctions.java:71: error: malformed HTML
    [error]    * < 
    
    [error] .../java/org/apache/spark/rdd/DoubleRDDFunctions.java:72: error: malformed HTML
    [error]    * <code>.. < .. </code>
    
    [error] .../java/org/apache/spark/rdd/DoubleRDDFunctions.java:73: error: malformed HTML
    [error]    * <blockquote> ... < .. </blockquote>
    
    [error] .../java/org/apache/spark/rdd/DoubleRDDFunctions.java:74: error: malformed HTML
    [error]    * <code>... < ..</code>
    
    [error] .../java/org/apache/spark/rdd/DoubleRDDFunctions.java:75: error: malformed HTML
    [error]    * <pre><code>... < ..</code></pre>
    
  • These do not print < but &lt;.

    &lt;
    <code>.. &lt; .. </code>
    <blockquote> ... &lt; .. </blockquote>
    `... &lt; ..`
    {{{... &lt; ..}}}

    as below:

    (screenshot)
  • The below one is fine

    {{{
    1<=x<10 , 10<=x<20, 20<=x<=50
    }}}
    

    but newlines are inserted as below:

    (screenshot)

HyukjinKwon (Member, Author), Nov 26, 2016:

Note to myself: it seems inline tags such as

{@code ... < ...}

and

{@literal >}

also work okay for both, but they are only valid for javadoc. Scaladoc treats them as monospace text (like `<` or `... < ...`). Since genjavadoc does not seem to replace them, they happen to render okay. I guess we should avoid them, though.
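The observations above can be condensed into a sketch (a hypothetical helper; the histogram wording comes from `DoubleRDDFunctions`): a `{{{...}}}` block with newlines, or an inline `{@code ...}` tag, lets the inequalities survive both scaladoc and javadoc 8, at the cost of monospace rendering:

```scala
// Hypothetical helper illustrating only the doc-comment escaping styles;
// the real logic lives in DoubleRDDFunctions.histogram.
object BucketDoc {
  /**
   * Buckets are closed on the left and open on the right, except the last.
   * For example, for the array
   *
   * {{{
   * [1, 10, 20, 50] the buckets are [1, 10) [10, 20) [20, 50]
   * 1<=x<10, 10<=x<20, 20<=x<=50
   * }}}
   *
   * Inline alternative: the value must be {@code >= 0}.
   */
  def bucketIndex(x: Double, buckets: Array[Double]): Int =
    math.min(buckets.lastIndexWhere(_ <= x), buckets.length - 2)
}
```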

SparkQA commented Nov 26, 2016

Test build #69179 has finished for PR 16013 at commit 73fcd35.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@@ -103,7 +103,8 @@ class JavaRDD[T](val rdd: RDD[T])(implicit val classTag: ClassTag[T])
* @param withReplacement can elements be sampled multiple times (replaced when sampled out)
* @param fraction expected size of the sample as a fraction of this RDD's size
* without replacement: probability that each element is chosen; fraction must be [0, 1]
* with replacement: expected number of times each element is chosen; fraction must be &gt;= 0
* with replacement: expected number of times each element is chosen; fraction must be greater
* than or equal to 0
Member Author:

I can work around this with `{@code >=}` if this looks too verbose.

Member:

Prose is fine too, but was this an error? Is the point that the escape itself gets escaped? I might have missed this in your various comments.

Member Author:

Oh, yes, exactly. It was not an error, but it was printed as-is (`&gt;` as `&amp;gt;`). I had a hard time figuring this out in #16013 (comment)

I haven't looked into this in depth, since the output is incorrect anyway, but I suspect this replacement happens in genjavadoc.

Member Author:

In short, the only workarounds I found to print them were inline tags such as `{@code ...}` and `{{{...}}}` blocks with newlines.

* e.g. for the array
* [1, 10, 20, 50] the buckets are [1, 10) [10, 20) [20, 50]
* e.g 1<=x<10 , 10<=x<20, 20<=x<=50
* }}}
Member Author:

  • Scala (screenshot)

  • Java (screenshot)

Member:

Why put the whole section as code? It's just the portion with several inequalities. Can those be back-tick-escaped? Or is the point that the back-ticks don't work? Does `{@code ...}` work with `<` / `>`?

BTW, very minor, but the second "e.g" is missing a period. "e.g." isn't really great either. You could expand the first to "For example" and the second to "that is", which is really what was meant.

Member Author:

I just wanted to avoid using inline tags, simply because my IDE shows some warnings:
(screenshot)

This seems to be valid javadoc markup, but it seems any inline tags are treated as backticks in scaladoc.

(I am sorry for messing around with many comments, but this is also related to #16013 (comment).)

Member Author:

I will address the comments below too, but let me try to reduce the usage of inline tags where possible.

Member:

`{@code}` is valid in Javadoc; is it saying that basically any `{@foo }` syntax is treated as back-tick-quoted by scaladoc? As it happens, that's fine here, right? Can you just disable the warning in your IDE? If it produces the correct output, then that's going to be a nicer rendering than breaking out new code blocks.

Member Author:

Yeap. I tested some cases (if I remember correctly) and it seems any `{@foo }` syntax is treated as back-tick-quoted by scaladoc (I will test more just in case and come back to fix this comment if I was wrong). Sure, sounds great!

* {{{
* def printRDDElement(record:(String, Seq[String]), f:String=>Unit) =
* for (e <- record._2) {f(e)}
* }}}
HyukjinKwon (Member, Author), Nov 26, 2016:

  • Scala (screenshot)

  • Java (screenshot)

@@ -162,6 +163,7 @@ object InputFormatInfo {
go to (a) until required nodes are allocated.

If a node 'dies', follow same procedure.
}}}
Member Author:

  • Java (screenshot)

Scaladoc not found.

Member:

If this is just to get around the `->`, I would instead change that to "to" and leave this. Or, if you like, make it an HTML ordered list.

* <script>
* }}}
*
* will cause the whole description to be treated as plain text.
Member Author:

  • Java (screenshot)

Scaladoc not found

* according to semantics where
*
* {{{
* NaN == NaN and NaN > any non-NaN double
Member Author:

  • Java (screenshot)

Scaladoc not found

Member:

Just change this to "is greater than". It's not really code we're describing here, but the semantics of numeric ordering.

SparkQA commented Nov 28, 2016

Test build #69219 has finished for PR 16013 at commit 29d65cc.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

srowen (Member) left a comment:

Just a handful of additional changes, looking good

@@ -2063,6 +2063,7 @@ class SparkContext(config: SparkConf) extends Logging {
* @param jobId the job ID to cancel
* @throws InterruptedException if the cancel message cannot be sent
*/
@throws(classOf[InterruptedException])
Member:

I think these need to be reverted too; we don't want to introduce checked exceptions

Member Author:

Oh, actually this complains as below:

[error] .../java/org/apache/spark/SparkContext.java:1150: error: exception not thrown: java.lang.InterruptedException
[error]    * @throws InterruptedException if the cancel message cannot be sent
[error]              ^

Let me move this to @note.
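A sketch of that rewrite (a hypothetical, much-simplified signature standing in for `SparkContext.cancelJob`): since javadoc 8 rejects a `@throws` tag for an exception that is not in the method's bytecode throws clause, the caveat moves into a scaladoc `@note` tag instead:

```scala
// Hypothetical stand-in for SparkContext.cancelJob; the body is
// placeholder logic, only the doc-comment style mirrors the fix.
class JobCanceller {
  /**
   * Cancel a given job if it's scheduled or running.
   *
   * @param jobId the job ID to cancel
   * @note Throws `InterruptedException` if the cancel message cannot be sent
   */
  def cancelJob(jobId: Int): Boolean = jobId >= 0
}
```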

@@ -42,8 +42,8 @@ class SlidingRDDPartition[T](val idx: Int, val prev: Partition, val tail: Seq[T]
* @param windowSize the window size, must be greater than 1
* @param step step size for windows
*
* @see [[org.apache.spark.mllib.rdd.RDDFunctions.sliding(Int, Int)*]]
* @see [[scala.collection.IterableLike.sliding(Int, Int)*]]
* @see `org.apache.spark.mllib.rdd.RDDFunctions.sliding(Int, Int)*`
Member:

Is the trailing `*` intentional or a typo? No big deal anyway.

Member Author:

Please let me leave these as they are; I am worried about getting blamed in the future. I will keep in mind to leave a comment about this when someone tries to change something around here.

Member:

Heh, OK. I don't think it has any meaning in a hyperlink or javadoc syntax, and this isn't either one anyway, but it's OK to leave it.

Member Author:

Thank you for bearing with me.

* of validation error is compared to relative tolerance which is
* validationTol * (current loss on the validation set).
* If the current loss on the validation set is <= 0.01, the diff
* of validation error is compared to absolute tolerance which is
* If the current loss on the validation set is less than or euqal to 0.01,
Member:

typo: euqal -> equal

SparkQA commented Nov 28, 2016

Test build #69241 has finished for PR 16013 at commit 7d44dc5.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@@ -2061,9 +2061,8 @@ class SparkContext(config: SparkConf) extends Logging {
* Cancel a given job if it's scheduled or running.
*
* @param jobId the job ID to cancel
* @throws InterruptedException if the cancel message cannot be sent
* @note Throws `InterruptedException` if the cancel message cannot be sent
Member:

Hm, InterruptedException is in java.lang, so I am surprised it isn't found. Does it help if you write `@throws java.lang.InterruptedException`? That's better if it works.

Member Author:

Sure, I will try!

Member Author:

Hm, interesting. I haven't looked into this deeply, but it seems to fail anyway.

[error] .../java/org/apache/spark/SparkContext.java:1150: error: exception not thrown: java.lang.InterruptedException
[error]    * @throws java.lang.InterruptedException if the cancel message cannot be sent
[error]              ^

Member:

Hm, so it's just complaining that it's documented as a checked exception but can't be thrown according to the bytecode. It has a point there, but I am also kind of surprised it's an error. OK, leave it the way you have it, as it seems like the only way that works.

srowen (Member) left a comment:

Great, looks like this gets us to compatibility with javadoc 8 and we should get this in 2.1.

HyukjinKwon (Member, Author):

Thank you so much @srowen.

asfgit pushed a commit that referenced this pull request Nov 29, 2016
…ibility in Java API documentation

## What changes were proposed in this pull request?

This PR makes `sbt unidoc` complete with Java 8.

This PR roughly includes several fixes as below:

- Fix unrecognisable class and method links in javadoc by changing it from `[[..]]` to `` `...` ``

  ```diff
  - * A column that will be computed based on the data in a [[DataFrame]].
  + * A column that will be computed based on the data in a `DataFrame`.
  ```

- Fix throws annotations so that they are recognisable in javadoc

- Fix URL links to `<a href="http..."></a>`.

  ```diff
  - * [[http://en.wikipedia.org/wiki/Decision_tree_learning Decision tree]] model for regression.
  + * <a href="http://en.wikipedia.org/wiki/Decision_tree_learning">
  + * Decision tree (Wikipedia)</a> model for regression.
  ```

  ```diff
  -   * see http://en.wikipedia.org/wiki/Receiver_operating_characteristic
  +   * see <a href="http://en.wikipedia.org/wiki/Receiver_operating_characteristic">
  +   * Receiver operating characteristic (Wikipedia)</a>
  ```

- Fix < to > to

  - `greater than`/`greater than or equal to` or `less than`/`less than or equal to` where applicable.

  - Wrap it with `{{{...}}}` to print them in javadoc, or use `{@code ...}` or `{@literal ..}`. Please refer to #16013 (comment)

- Fix `</p>` complaint

## How was this patch tested?

Manually tested by `jekyll build` with Java 7 and 8

```
java version "1.7.0_80"
Java(TM) SE Runtime Environment (build 1.7.0_80-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.80-b11, mixed mode)
```

```
java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)
```

Author: hyukjinkwon <[email protected]>

Closes #16013 from HyukjinKwon/SPARK-3359-errors-more.

(cherry picked from commit f830bb9)
Signed-off-by: Sean Owen <[email protected]>

srowen commented Nov 29, 2016

Merged to master/2.1

@asfgit asfgit closed this in f830bb9 Nov 29, 2016
robert3005 pushed a commit to palantir/spark that referenced this pull request Dec 2, 2016
uzadude pushed a commit to uzadude/spark that referenced this pull request Jan 27, 2017
asfgit pushed a commit that referenced this pull request Apr 12, 2017
## What changes were proposed in this pull request?

This PR proposes to run Spark unidoc to test Javadoc 8 build as Javadoc 8 is easily re-breakable.

There are several problems with it:

- It introduces little extra bit of time to run the tests. In my case, it took 1.5 mins more (`Elapsed :[94.8746569157]`). How it was tested is described in "How was this patch tested?".

- > One problem that I noticed was that Unidoc appeared to be processing test sources: if we can find a way to exclude those from being processed in the first place then that might significantly speed things up.

  (see  joshrosen's [comment](https://issues.apache.org/jira/browse/SPARK-18692?focusedCommentId=15947627&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-15947627))

To keep this automated build working, it also fixes existing Javadoc breaks and ones introduced by test code, as described above.

These fixes are similar to instances previously fixed. Please refer to #15999 and #16013.

Note that this only fixes **errors**, not **warnings**. Please see my observation in #17389 (comment) for spurious errors caused by warnings.

## How was this patch tested?

Manually via `jekyll build` for building tests. Also, tested via running `./dev/run-tests`.

This was tested via manually adding `time.time()` as below:

```diff
     profiles_and_goals = build_profiles + sbt_goals

     print("[info] Building Spark unidoc (w/Hive 1.2.1) using SBT with these arguments: ",
           " ".join(profiles_and_goals))

+    import time
+    st = time.time()
     exec_sbt(profiles_and_goals)
+    print("Elapsed :[%s]" % str(time.time() - st))
```

produces

```
...
========================================================================
Building Unidoc API Documentation
========================================================================
...
[info] Main Java API documentation successful.
...
Elapsed :[94.8746569157]
...
```

Author: hyukjinkwon <[email protected]>

Closes #17477 from HyukjinKwon/SPARK-18692.
@HyukjinKwon HyukjinKwon deleted the SPARK-3359-errors-more branch January 2, 2018 03:39