
Commit a85fb6c

hvanhovell authored and srowen committed Aug 15, 2015
[SPARK-9980] [BUILD] Fix SBT publishLocal error due to invalid characters in doc
Tiny modification to a few comments so that ```sbt publishLocal``` works again.

Author: Herman van Hovell <[email protected]>

Closes apache#8209 from hvanhovell/SPARK-9980.
1 parent 7c1e568 commit a85fb6c
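
The change is the same throughout: Javadoc comment text is parsed as HTML, so the literal shift operator `<<` is escaped as `&lt;&lt;`, and the self-closing `<p/>` tag is replaced with a plain `<p>` (Java 8's doclint rejects bare `<` and self-closing elements, which is presumably what made doc generation fail during `sbt publishLocal`). A minimal sketch of the resulting convention, using a hypothetical `CapacityExample` class that is not part of this commit:

```java
/**
 * Maximum supported capacity. The backing array can contain at most
 * (1 &lt;&lt; 30) elements, since that is the largest power of 2 representable
 * as a positive Java int.
 * <p>
 * Note the escaping: inside the comment text the shift is written with HTML
 * entities, and paragraph breaks use {@code <p>} rather than the self-closing
 * {@code <p/>} form that strict javadoc rejects.
 */
public final class CapacityExample {
  // In code (as opposed to comment text) the shift operator needs no escaping.
  public static final int MAX_CAPACITY = (1 << 30);

  private CapacityExample() { }
}
```

With comments written this way, the javadoc run triggered by `sbt publishLocal` should no longer abort on these files.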

9 files changed: +19 -19 lines


core/src/main/java/org/apache/spark/unsafe/map/BytesToBytesMap.java

+3 -3

@@ -92,9 +92,9 @@ public final class BytesToBytesMap {

  /**
  * The maximum number of keys that BytesToBytesMap supports. The hash table has to be
- * power-of-2-sized and its backing Java array can contain at most (1 << 30) elements, since
- * that's the largest power-of-2 that's less than Integer.MAX_VALUE. We need two long array
- * entries per key, giving us a maximum capacity of (1 << 29).
+ * power-of-2-sized and its backing Java array can contain at most (1 &lt;&lt; 30) elements,
+ * since that's the largest power-of-2 that's less than Integer.MAX_VALUE. We need two long array
+ * entries per key, giving us a maximum capacity of (1 &lt;&lt; 29).
  */
  @VisibleForTesting
  static final int MAX_CAPACITY = (1 << 29);

examples/src/main/java/org/apache/spark/examples/ml/JavaDeveloperApiExample.java

+2 -2

@@ -124,7 +124,7 @@ public String uid() {

  /**
  * Param for max number of iterations
- * <p/>
+ * <p>
  * NOTE: The usual way to add a parameter to a model or algorithm is to include:
  * - val myParamName: ParamType
  * - def getMyParamName

@@ -222,7 +222,7 @@ public Vector predictRaw(Vector features) {
  /**
  * Create a copy of the model.
  * The copy is shallow, except for the embedded paramMap, which gets a deep copy.
- * <p/>
+ * <p>
  * This is used for the defaul implementation of [[transform()]].
  *
  * In Java, we have to make this method public since Java does not understand Scala's protected

examples/src/main/java/org/apache/spark/examples/streaming/JavaStatefulNetworkWordCount.java

+1 -1

@@ -45,7 +45,7 @@
  * Usage: JavaStatefulNetworkWordCount <hostname> <port>
  * <hostname> and <port> describe the TCP server that Spark Streaming would connect to receive
  * data.
- * <p/>
+ * <p>
  * To run this on your local machine, you need to first run a Netcat server
  * `$ nc -lk 9999`
  * and then run the example

launcher/src/main/java/org/apache/spark/launcher/Main.java

+2 -2

@@ -32,7 +32,7 @@ class Main {

  /**
  * Usage: Main [class] [class args]
- * <p/>
+ * <p>
  * This CLI works in two different modes:
  * <ul>
  * <li>"spark-submit": if <i>class</i> is "org.apache.spark.deploy.SparkSubmit", the

@@ -42,7 +42,7 @@ class Main {
  *
  * This class works in tandem with the "bin/spark-class" script on Unix-like systems, and
  * "bin/spark-class2.cmd" batch script on Windows to execute the final command.
- * <p/>
+ * <p>
  * On Unix-like systems, the output is a list of command arguments, separated by the NULL
  * character. On Windows, the output is a command line suitable for direct execution from the
  * script.

launcher/src/main/java/org/apache/spark/launcher/SparkClassCommandBuilder.java

+1 -1

@@ -28,7 +28,7 @@

  /**
  * Command builder for internal Spark classes.
- * <p/>
+ * <p>
  * This class handles building the command to launch all internal Spark classes except for
  * SparkSubmit (which is handled by {@link SparkSubmitCommandBuilder} class.
  */

launcher/src/main/java/org/apache/spark/launcher/SparkLauncher.java

+3 -3

@@ -193,7 +193,7 @@ public SparkLauncher setMainClass(String mainClass) {
  * Adds a no-value argument to the Spark invocation. If the argument is known, this method
  * validates whether the argument is indeed a no-value argument, and throws an exception
  * otherwise.
- * <p/>
+ * <p>
  * Use this method with caution. It is possible to create an invalid Spark command by passing
  * unknown arguments to this method, since those are allowed for forward compatibility.
  *

@@ -211,10 +211,10 @@ public SparkLauncher addSparkArg(String arg) {
  * Adds an argument with a value to the Spark invocation. If the argument name corresponds to
  * a known argument, the code validates that the argument actually expects a value, and throws
  * an exception otherwise.
- * <p/>
+ * <p>
  * It is safe to add arguments modified by other methods in this class (such as
  * {@link #setMaster(String)} - the last invocation will be the one to take effect.
- * <p/>
+ * <p>
  * Use this method with caution. It is possible to create an invalid Spark command by passing
  * unknown arguments to this method, since those are allowed for forward compatibility.
  *

launcher/src/main/java/org/apache/spark/launcher/SparkSubmitCommandBuilder.java

+2 -2

@@ -25,11 +25,11 @@

  /**
  * Special command builder for handling a CLI invocation of SparkSubmit.
- * <p/>
+ * <p>
  * This builder adds command line parsing compatible with SparkSubmit. It handles setting
  * driver-side options and special parsing behavior needed for the special-casing certain internal
  * Spark applications.
- * <p/>
+ * <p>
  * This class has also some special features to aid launching pyspark.
  */
  class SparkSubmitCommandBuilder extends AbstractCommandBuilder {

launcher/src/main/java/org/apache/spark/launcher/SparkSubmitOptionParser.java

+4 -4

@@ -23,7 +23,7 @@

  /**
  * Parser for spark-submit command line options.
- * <p/>
+ * <p>
  * This class encapsulates the parsing code for spark-submit command line options, so that there
  * is a single list of options that needs to be maintained (well, sort of, but it makes it harder
  * to break things).

@@ -80,10 +80,10 @@ class SparkSubmitOptionParser {
  * This is the canonical list of spark-submit options. Each entry in the array contains the
  * different aliases for the same option; the first element of each entry is the "official"
  * name of the option, passed to {@link #handle(String, String)}.
- * <p/>
+ * <p>
  * Options not listed here nor in the "switch" list below will result in a call to
  * {@link #handleUnknown(String)}.
- * <p/>
+ * <p>
  * These two arrays are visible for tests.
  */
  final String[][] opts = {

@@ -130,7 +130,7 @@ class SparkSubmitOptionParser {

  /**
  * Parse a list of spark-submit command line options.
- * <p/>
+ * <p>
  * See SparkSubmitArguments.scala for a more formal description of available options.
  *
  * @throws IllegalArgumentException If an error is found during parsing.

unsafe/src/main/java/org/apache/spark/unsafe/memory/TaskMemoryManager.java

+1 -1

@@ -60,7 +60,7 @@ public class TaskMemoryManager {

  /**
  * Maximum supported data page size (in bytes). In principle, the maximum addressable page size is
- * (1L << OFFSET_BITS) bytes, which is 2+ petabytes. However, the on-heap allocator's maximum page
+ * (1L &lt;&lt; OFFSET_BITS) bytes, which is 2+ petabytes. However, the on-heap allocator's maximum page
  * size is limited by the maximum amount of data that can be stored in a long[] array, which is
  * (2^32 - 1) * 8 bytes (or 16 gigabytes). Therefore, we cap this at 16 gigabytes.
  */
