1. Configure the environment variables:

export SPARK_HOME=/usr/hdp/2.2.8.0-3150/spark-1.6.1-bin-hadoop2.6

PATH=$PATH:${SPARK_HOME}/bin

export PATH
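
To make the variables take effect and confirm Spark is on the PATH (this sketch assumes the exports above were added to /etc/profile; adjust accordingly if you used ~/.bashrc instead):

source /etc/profile
echo $SPARK_HOME
spark-submit --version    # should report Spark 1.6.1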


2. Configure the environment:

vi ./conf/spark-env.sh    # path is relative to $SPARK_HOME

export JAVA_HOME=/usr/java/jdk1.8.0_73

export SPARK_MASTER_IP=192.168.1.101

export SPARK_WORKER_CORES=2    # number of CPU cores the worker on this node may use

export SPARK_WORKER_MEMORY=1g   # amount of memory the worker on this node may use

export HADOOP_CONF_DIR=/usr/hdp/2.2.8.0-3150/hadoop/etc/hadoop
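
If conf/spark-env.sh does not exist yet, create it from the template that ships with the Spark binary distribution before adding the lines above:

cd ${SPARK_HOME}
cp conf/spark-env.sh.template conf/spark-env.sh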


3. Configure the worker node addresses:


vi ./conf/slaves


Add:


192.168.1.101

192.168.1.102

192.168.1.103
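
The same conf/spark-env.sh and conf/slaves must be present on every node. A minimal way to distribute them, assuming passwordless SSH is configured and Spark is installed at the same path on all three machines:

scp -r ${SPARK_HOME}/conf 192.168.1.102:${SPARK_HOME}/
scp -r ${SPARK_HOME}/conf 192.168.1.103:${SPARK_HOME}/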


4. Start the cluster and check:

sbin/start-all.sh    # run on the master node (192.168.1.101); it starts the workers over SSH
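
After the script finishes, jps on each node should show the expected JVM daemons. A rough sketch of what to expect (process names as printed by the JDK's jps tool):

jps
# on 192.168.1.101: Master and Worker (this node is both master and a worker per conf/slaves)
# on 192.168.1.102 and 192.168.1.103: Worker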


Cluster web UI (served by the master node, per SPARK_MASTER_IP above): 192.168.1.101:8080

Per-node worker web UIs: 192.168.1.101:8081, 192.168.1.102:8081, 192.168.1.103:8081
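
To confirm the cluster actually accepts jobs, you can submit the bundled SparkPi example against the standalone master URL (default port 7077). The examples jar path below is what the Spark 1.6.1 binary distribution typically ships; verify the exact filename under ${SPARK_HOME}/lib:

spark-submit --class org.apache.spark.examples.SparkPi \
  --master spark://192.168.1.101:7077 \
  ${SPARK_HOME}/lib/spark-examples-*.jar 10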