To build SparkR on Windows, the following steps are required

- Install R (>= 3.1) and Rtools. Make sure to include Rtools and R in `PATH`.
- Install JDK7 and set `JAVA_HOME` in the system environment variables.
- Download and install Maven. Also include the `bin` directory in Maven in `PATH`.
- Set `MAVEN_OPTS` as described in Building Spark.
- Open a command shell (`cmd`) in the Spark directory and run `mvn -DskipTests -Psparkr package`
To run the SparkR unit tests on Windows, the following steps are required, assuming you are in the Spark root directory and do not have Apache Hadoop installed already:
- Create a folder to download Hadoop related files for Windows. For example, `cd ..` and `mkdir hadoop`.
- Download the relevant Hadoop bin package from steveloughran/winutils. While these are not official ASF artifacts, they are built from the ASF release git hashes by a Hadoop PMC member on a dedicated Windows VM. For further reading, consult Windows Problems on the Hadoop wiki.
- Install the files into `hadoop\bin`; make sure that `winutils.exe` and `hadoop.dll` are present.
- Set the environment variable `HADOOP_HOME` to the full path to the newly created `hadoop` directory.
- Run unit tests for SparkR by running the commands below. You need to install the testthat package first:

      R -e "install.packages('testthat', repos='http://cran.us.r-project.org')"
      .\bin\spark-submit2.cmd --conf spark.hadoop.fs.default.name="file:///" R\pkg\tests\run-all.R
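Before launching the tests, it can help to sanity-check the Hadoop layout the steps above produce. The following POSIX-shell sketch mirrors the expected Windows structure (`<hadoop dir>\bin` containing `winutils.exe` and `hadoop.dll`); on Windows the equivalent checks would use `cmd` and backslash paths, and the files would be the real downloads rather than the stand-ins created here.

```shell
# Recreate the expected layout (stand-in files, for illustration only)
# and verify it: HADOOP_HOME must point at a directory whose bin\
# subdirectory holds winutils.exe and hadoop.dll.
mkdir -p hadoop/bin
touch hadoop/bin/winutils.exe hadoop/bin/hadoop.dll  # stand-ins for the real downloads

# Must be the full path to the hadoop directory, as in the steps above:
export HADOOP_HOME="$PWD/hadoop"

# Check that the files spark-submit will look for are present:
for f in winutils.exe hadoop.dll; do
  if [ -f "$HADOOP_HOME/bin/$f" ]; then
    echo "found: $f"
  else
    echo "missing: $f"
  fi
done
```

If either file is reported missing, spark-submit will fail with native-library errors on Windows, so fix the layout before running `run-all.R`.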