You can install directly from a GitHub repository:
if (!require('devtools')) install.packages('devtools')
devtools::install_github('apache/spark@v2.x.x', subdir='R/pkg')
You should choose the tag (v2.x.x above) corresponding to the version of Spark you use. You can find a full list of tags on the project page or directly from R using the GitHub API:
jsonlite::fromJSON("https://api.github.com/repos/apache/spark/tags")$name
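If you only want tags for a particular release line, you can filter the result. A minimal sketch (the pattern, and relying on the first page of API results, are assumptions on my part):
# Keep only the v2.x release tags (pattern is just an example; the API returns paged results)
tags <- jsonlite::fromJSON("https://api.github.com/repos/apache/spark/tags")$name
grep("^v2\\.", tags, value = TRUE)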
If you've downloaded the binary package from the downloads page, the R library is in the R/lib/SparkR subdirectory. It can be used to install SparkR directly. For example:
$ export SPARK_HOME=/path/to/spark/directory
$ cd $SPARK_HOME/R/pkg/
$ R -e "devtools::install('.')"
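As a quick sanity check (my own suggestion, not a required step), you can confirm the package is visible from R:
# Load SparkR and print the installed version; it should match the Spark release you installed from
library(SparkR)
packageVersion("SparkR")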
You can also add the R lib to .libPaths (taken from here):
Sys.setenv(SPARK_HOME='/path/to/spark/directory')
.libPaths(c(file.path(Sys.getenv('SPARK_HOME'), 'R', 'lib'), .libPaths()))
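After adjusting .libPaths you should be able to load SparkR and start a local session. A minimal sketch (master and other options depend on your setup):
# Load SparkR from $SPARK_HOME/R/lib and start a local session
library(SparkR)
sparkR.session(master = "local[*]", sparkHome = Sys.getenv("SPARK_HOME"))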
Finally, you can use the sparkR shell without any additional steps:
$ /path/to/spark/directory/bin/sparkR
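The sparkR shell initializes a SparkSession for you on startup (Spark 2.x behavior), so DataFrame operations work right away; for example (faithful is just a built-in R dataset):
# Run inside the sparkR shell; a session is already available
df <- createDataFrame(faithful)
head(df)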
Edit
According to the Spark 2.1.0 Release Notes, SparkR should be available on CRAN in the future:
Standalone installable package built with the Apache Spark release. We will be submitting this to CRAN soon.
You can follow SPARK-15799 to check the progress.
Edit 2
While SPARK-15799 has been merged, satisfying CRAN requirements proved to be challenging (see for example the discussions about 2.2.2, 2.3.1, 2.4.0), and the package has subsequently been removed (see for example SparkR was removed from CRAN on 2018-05-01 and CRAN SparkR package removed?). As a result, the methods listed in the original post are still the most reliable solutions.
Edit 3
OK, SparkR is back up on CRAN again as of v2.4.1. install.packages('SparkR') should work again (it may take a couple of days for the mirrors to reflect this).
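For completeness, a CRAN-based setup would then look roughly like this (a sketch only; as far as I can tell SparkR::install.spark() downloads and caches a matching Spark distribution if SPARK_HOME is not set):
install.packages("SparkR")
library(SparkR)
install.spark()                      # only needed if you don't already have a local Spark
sparkR.session(master = "local[*]")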