Run Spark from the Spark Shell

In yarn-client mode, complete the following steps to run Spark from the Spark shell:
  1. Navigate to the Spark-on-YARN installation directory, substituting your Spark version (for example, 2.0.1) into the path:
    cd /opt/mapr/spark/spark-<version>/
  2. Launch the Spark shell with the command that matches your Spark version:
    • On Spark 2.0.1 and later:
      ./bin/spark-shell --master yarn --deploy-mode client
    • On Spark 1.6.1:
      MASTER=yarn-client ./bin/spark-shell
    Note: The Spark shell runs only in yarn-client mode; yarn-cluster mode is not supported.
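
Once the shell starts, you can confirm that the YARN connection is working by running a small job. The sketch below is illustrative: it uses the `sc` SparkContext that the Spark shell creates automatically, and the numbers chosen are arbitrary.

```scala
// Inside the Spark shell: a trivial job to verify that executors
// launched on YARN can run tasks end to end.
val total = sc.parallelize(1 to 100).map(_ * 2).reduce(_ + _)
println(total)  // prints 10100 (twice the sum of 1..100)
```

If the job hangs or fails, check the YARN ResourceManager UI to verify that the application was accepted and that executors were allocated.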