As of Hue 3.8.1-1507 and Spark 1.3.1, you can configure Hue to use the Spark Notebook UI. This allows users to submit Spark jobs from Hue.
Complete the following steps as the root user or by using sudo:
- Install the mapr-hue-livy package on the node where you have installed the mapr-spark package and configured Spark. On RedHat/CentOS:
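The installation command itself is not shown in the text; on a RedHat/CentOS node it would typically be the following yum invocation (the package name is from the text; the repository setup is assumed to be in place):

```shell
# Assumes the MapR package repository is already configured on this node.
yum install -y mapr-hue-livy
```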
- For Spark 1.3.1: Copy javax.servlet-api-3.1.0.jar to the Spark lib directory.
- In the spark-env.sh file, set the SPARK_SUBMIT_CLASSPATH environment variable to include the classpath to the servlet JAR before MAPR_SPARK_CLASSPATH.
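The two Spark 1.3.1 steps above can be sketched as follows; the Spark install path and jar location are assumptions based on a typical MapR layout:

```shell
# Assumed locations; adjust to your installation.
SPARK_HOME=${SPARK_HOME:-/opt/mapr/spark/spark-1.3.1}
SERVLET_JAR="$SPARK_HOME/lib/javax.servlet-api-3.1.0.jar"

# In $SPARK_HOME/conf/spark-env.sh: the servlet jar must come BEFORE
# MAPR_SPARK_CLASSPATH on the submit classpath.
export SPARK_SUBMIT_CLASSPATH="$SERVLET_JAR:$MAPR_SPARK_CLASSPATH"
echo "$SPARK_SUBMIT_CLASSPATH"
```

The ordering matters: placing the servlet JAR first ensures its classes shadow any older servlet API versions already on the MapR classpath.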
- In the [spark] section of hue.ini, set the livy_server_host parameter to the host where the Livy server is running.
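In hue.ini this step looks roughly like the following; the hostname is an example, and the livy_server_port line is an illustrative assumption (8998 is Livy's conventional default):

```ini
[spark]
  # Host where the Livy (Spark REST Job) server runs -- example value
  livy_server_host=livy-node.example.com
  # Livy server port (assumption; 8998 is the conventional default)
  livy_server_port=8998
```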
- If Spark jobs run on YARN, configure the Livy session type as yarn on the node where the Livy server is running.
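The truncated YARN step most likely refers to switching Livy sessions from local processes to YARN. In Hue builds of this vintage that is typically controlled in the [spark] section of hue.ini; the property name below is an assumption:

```ini
[spark]
  # Run Livy sessions on YARN instead of local processes
  # (property name is an assumption for this Hue build)
  livy_server_session_kind=yarn
```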
- For Hue 3.9.0: Set the HUE_HOME and HADOOP_CONF_DIR environment variables in the hue.sh file (/opt/mapr/hue/hue-<version>/bin/hue.sh).
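For Hue 3.9.0, the hue.sh additions might look like this; both paths are assumptions about a typical MapR layout and should be adjusted to your installation:

```shell
# Added near the top of /opt/mapr/hue/hue-<version>/bin/hue.sh
export HUE_HOME=/opt/mapr/hue/hue-3.9.0                          # assumed path
export HADOOP_CONF_DIR=/opt/mapr/hadoop/hadoop-2.7.0/etc/hadoop  # assumed path
```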
- Restart the Spark REST Job Server (Livy).
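On a MapR cluster, one way to restart a managed service is through maprcli; the service name "livy" is an assumption for this package, and the node placeholder must be replaced with your hostname:

```shell
# Restart the Livy service on the node where it runs
# (service name "livy" is an assumption for the mapr-hue-livy package).
maprcli node services -nodes <livy-node-hostname> -name livy -action restart
```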