The environment variables currently configured in my /etc/profile:
export JAVA_HOME=/opt/java/jdk1.8.0_131
export SPARK_HOME=/opt/spark-2.4.4-bin-hadoop2.7
export HIVE_HOME=/usr/hdp/current/hive-client
export LIVY_HOME=/opt/livy/livy-0.5.0-incubating-bin
export HBASE_HOME=/opt/hbase-2.2.1
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export HADOOP_HOME=/opt/hadoop-2.7.7
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_USER_NAME=hdfs
export HADOOP_HDFS_HOME=$HADOOP_HOME
export PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin:/usr/local/go/bin
export PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$SPARK_HOME/bin:$LIVY_HOME/bin:$PATH
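Changes to /etc/profile only take effect in new login shells; the current shell has to re-read the file first. A minimal sketch of the check I use (the two key exports are inlined here, with the paths from above, so the snippet is self-contained rather than relying on `source /etc/profile`):

```shell
# In practice:  source /etc/profile   (re-reads the file in the current shell)
# Here the two key exports are inlined so the check stands on its own.
export JAVA_HOME=/opt/java/jdk1.8.0_131
export HADOOP_HOME=/opt/hadoop-2.7.7
export PATH="$JAVA_HOME/bin:$HADOOP_HOME/bin:$PATH"

# Spot-check: the first two PATH entries should now be the new bin directories.
echo "$PATH" | cut -d: -f1-2
```

If the spot-check shows the old PATH, the edit was not picked up by the current shell.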
With the environment variables in place, I went to test them, and running `hadoop fs -ls /` to list the HDFS root produced the following error:
# Running `hadoop classpath` shows which jar paths Hadoop currently puts on its classpath.
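The output of `hadoop classpath` is one long colon-separated string, which is hard to scan for a missing entry. Splitting it on `:` gives one entry per line; a sketch, with a sample string standing in for the real command output:

```shell
# A sample classpath string stands in for `hadoop classpath` output here.
sample_cp="/opt/hadoop-2.7.7/etc/hadoop:/opt/hadoop-2.7.7/share/hadoop/common/*"

# Split on ':' so each entry sits on its own line for easy reading/grepping.
echo "$sample_cp" | tr ':' '\n'

# Against a live cluster:  hadoop classpath | tr ':' '\n'
```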
The fix is as follows: at the end of the file libexec/hadoop-config.sh, append
export CLASSPATH=${CLASSPATH}:'/opt/hadoop-2.7.7/client/*'
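The quoting around the trailing `*` matters: it keeps the wildcard literal so the JVM (which understands classpath wildcards) expands it against the jar directory, rather than the shell expanding it at assignment time. A minimal sketch of the effect:

```shell
# Append a wildcard entry; the double quotes keep the * literal so the JVM,
# not the shell, expands it against the jar directory at startup.
export CLASSPATH="${CLASSPATH}:/opt/hadoop-2.7.7/client/*"

# The last classpath entry should still contain the literal asterisk.
echo "last entry: ${CLASSPATH##*:}"
```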
Run `hadoop fs -ls /` again, and the HDFS listing now works.