Installing Spark on YARN

This topic includes instructions for using package managers to download and install Spark on YARN from the MEP repository.

For instructions on setting up the MEP repository, see Step 8: Install Ecosystem Components Manually.

Spark is distributed as two separate packages:

Package                     Description
mapr-spark                  Install this package on each node where you want to install Spark. It depends on the mapr-client package.
mapr-spark-historyserver    Install this optional package on Spark History Server nodes. It depends on the mapr-spark and mapr-core packages.

To install Spark on YARN (Hadoop 2), execute the following commands as root or using sudo:

  1. Verify that JDK 1.7 or later is installed on the node where you want to install Spark.
  2. Create the /apps/spark directory on MapR-FS, and set the correct permissions on the directory (a verification sketch follows these steps):
    hadoop fs -mkdir /apps/spark
    hadoop fs -chmod 777 /apps/spark
  3. Install the packages:
    On Ubuntu
    apt-get install mapr-spark mapr-spark-historyserver
    On RedHat / CentOS
    yum install mapr-spark mapr-spark-historyserver
    On SUSE
    zypper install mapr-spark mapr-spark-historyserver
    Note: The mapr-spark-historyserver package is optional. A sketch for verifying the installed packages follows these steps.
  4. If you want to integrate Spark with MapR Streams, install the Streams Client on each Spark node:
    • On Ubuntu:
      apt-get install mapr-kafka
    • On RedHat/CentOS:
      yum install mapr-kafka
  5. If you want to use a Streaming Producer, add the spark-streaming-kafka-producer_2.11.jar from the MapR Maven repository to the Spark classpath (/opt/mapr/spark/spark-<version>/jars/), as sketched after these steps.
  6. To test the installation, run the following command as the mapr user:
    • On Spark 2.0.1:
      /opt/mapr/spark/spark-<version>/bin/run-example --master yarn --deploy-mode client SparkPi 10
    • On Spark 1.6.1:
      MASTER=yarn-client /opt/mapr/spark/spark-<version>/bin/run-example org.apache.spark.examples.SparkPi 10

    This command will fail if it is run as the root user. To confirm that the example actually ran on YARN, see the sketch below.
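
To verify step 2, you can list the directory and check its mode. This uses the standard hadoop fs -ls command; the expected drwxrwxrwx mode reflects the chmod 777 shown above.

  # Should show /apps/spark with mode drwxrwxrwx
  hadoop fs -ls -d /apps/spark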
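
To verify step 3, you can query the package manager for the installed Spark packages. These are standard package-manager queries.

  # On Ubuntu
  dpkg -l | grep mapr-spark
  # On RedHat / CentOS / SUSE
  rpm -qa | grep mapr-spark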
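
The following is a hypothetical sketch of step 5, assuming the producer jar has already been downloaded from the MapR Maven repository. The jar file name and the <version> placeholders are assumptions; substitute the values that match your installation.

  # Example only: replace <version> with the actual version strings for your installation
  cp spark-streaming-kafka-producer_2.11-<version>.jar /opt/mapr/spark/spark-<version>/jars/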
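
As a follow-up to step 6, you can confirm that the example ran on YARN by listing completed applications with the standard yarn CLI; the Spark Pi example typically appears in the output by that name.

  # Run as the mapr user
  yarn application -list -appStates FINISHED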