1. Starting YARN fails with "Error: JAVA_HOME is not set and could not be found."

Set JAVA_HOME in /etc/hadoop/hadoop-env.sh, using an absolute path:

export JAVA_HOME=$JAVA_HOME                   # wrong: this just re-reads the (unset) variable

export JAVA_HOME=/usr/java/jdk1.6.0_45        # right: point at the actual JDK directory
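If you are unsure of the JDK's absolute path, one way to find it is to follow the symlinks behind the `java` binary on the PATH (a sketch; assumes `java` is installed and reachable via PATH):

```shell
# Resolve the real JDK directory behind the `java` on PATH.
# The printed line is what belongs in hadoop-env.sh.
if command -v java >/dev/null 2>&1; then
  JAVA_BIN=$(readlink -f "$(command -v java)")   # follow symlinks, e.g. /etc/alternatives/java
  echo "export JAVA_HOME=${JAVA_BIN%/bin/java}"  # strip the trailing /bin/java
else
  echo "java not found on PATH" >&2
fi
```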


2. Starting Hadoop after installation produces warning messages like the following:

2016-05-20 18:45:11,022 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 50070
2016-05-20 18:45:11,027 INFO org.mortbay.log: jetty-6.1.26
2016-05-20 18:45:11,648 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50070

2016-05-20 18:45:11,833 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: No KeyProvider found.
2016-05-20 18:45:11,849 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair:true
2016-05-20 18:45:11,899 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
2016-05-20 18:45:11,900 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
2016-05-20 18:45:11,904 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
2016-05-20 18:45:11,907 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block deletion will start around 2016 五月 20 18:45:11
2016-05-20 18:45:11,914 INFO org.apache.hadoop.util.GSet: Computing capacity for map BlocksMap
2016-05-20 18:45:11,914 INFO org.apache.hadoop.util.GSet: VM type       = 64-bit
2016-05-20 18:45:11,919 INFO org.apache.hadoop.util.GSet: 2.0% max memory 966.7 MB = 19.3 MB

Cause and fix

This warning appears because the paths configured in hdfs-site.xml for dfs.namenode.name.dir (and dfs.datanode.data.dir) are missing the file:// URI prefix. Add it, for example:

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>2</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///hadoop/dfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///hadoop/dfs/data</value>
  </property>
</configuration>
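After editing hdfs-site.xml, the effective values can be inspected with `hdfs getconf` (part of stock Hadoop 2.x); a sketch, assuming the `hdfs` script is on the PATH:

```shell
# Print the values Hadoop will actually use for the two directory keys.
if command -v hdfs >/dev/null 2>&1; then
  hdfs getconf -confKey dfs.namenode.name.dir
  hdfs getconf -confKey dfs.datanode.data.dir
else
  echo "hdfs command not on PATH"
fi
```

Both values should come back with the file:// prefix; if they do not, the edited hdfs-site.xml is not the one on the active configuration path.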


3. Any hadoop command that touches HDFS prints the following warning:

WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Cause and fix:
This warning appears because libhadoop.so is missing from /hadoop/lib/native/ under the Hadoop install directory, or its word size (32-bit vs. 64-bit) does not match the operating system's. Download hadoop-native-64-2.6.0 and extract it into /hadoop/lib/native/, replacing the existing files.
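Before downloading, you can confirm the mismatch with standard tools; a sketch (/hadoop/lib/native is the install path from the text above; adjust to your layout):

```shell
# Compare the OS word size with the native library's ELF class.
uname -m                                  # x86_64 indicates a 64-bit OS
LIB=/hadoop/lib/native/libhadoop.so       # install path from the text; adjust as needed
if [ -f "$LIB" ]; then
  file "$LIB"                             # look for "ELF 64-bit" vs "ELF 32-bit"
else
  echo "libhadoop.so not found at $LIB"
fi
```

If the `hadoop` script itself is available, `hadoop checknative -a` (present in Hadoop 2.x) also reports whether the native library can be loaded.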