Required software and download links

1. MySQL or MariaDB
MySQL download: https://dev.mysql.com/downloads/mysql/5.5.html#downloads
MariaDB also works and can be installed directly with yum: yum -y install mariadb-server
2. Hive: http://mirrors.hust.edu.cn/apache/hive/
3. mysql-connector-java: https://dev.mysql.com/downloads/connector/j/

I. Install Hadoop and MySQL (steps omitted here)

Note that the MySQL character set must be latin1, because Hive's metastore schema does not work with utf8 (index keys would exceed the 767-byte limit, see section IV). Create the database and user in MySQL:

create database hive;
grant all on hive.* to hive@'%'  identified by 'hive';
grant all on hive.* to hive@'localhost'  identified by 'hive';
flush privileges; 
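
Before moving on, it is worth confirming that the new database really uses latin1 (a quick check, not part of the original steps); the output should include DEFAULT CHARACTER SET latin1, and if it does not, fix it with the alter database statement shown in section IV:

mysql> show create database hive;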

II. Install Hive

1. Unpack the archive

[root@master ~]# tar xf apache-hive-1.2.2-bin.tar.gz -C /opt/
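
The archive unpacks to /opt/apache-hive-1.2.2-bin, while the rest of this guide refers to /opt/hive, so a rename or symlink is needed; the directory name below assumes the default layout of the 1.2.2 binary tarball:

[root@master ~]# ln -s /opt/apache-hive-1.2.2-bin /opt/hive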

2. Set the environment variables

[root@master ]# vi /etc/profile
export HIVE_HOME=/opt/hive
export PATH=$PATH:$HIVE_HOME/bin
[root@master ]# source  /etc/profile
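
An optional sanity check confirms the variables took effect; the paths shown assume the /opt/hive symlink created above:

[root@master ~]# echo $HIVE_HOME
/opt/hive
[root@master ~]# which hive
/opt/hive/bin/hive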

3. Edit the hive-site.xml configuration; the main properties to change are the following:

[root@master ]# cd /opt/hive/conf
[root@master conf]# cp hive-default.xml.template hive-site.xml
[root@master conf]# vi hive-site.xml 
<property>
   <name>javax.jdo.option.ConnectionURL</name>
   <value>jdbc:mysql://localhost:3306/hive</value>
</property>

<property>
   <name>javax.jdo.option.ConnectionDriverName</name>
   <value>com.mysql.jdbc.Driver</value>
</property>

<property>
   <name>javax.jdo.option.ConnectionPassword</name>
   <value>hive</value>
</property>

<property>
   <name>hive.hwi.listen.port</name>
   <value>9999</value>
   <description>This is the port the Hive Web Interface will listen on</description>
</property>

<property>
   <name>datanucleus.autoCreateSchema</name>
   <value>true</value>
</property>

<property>
   <name>datanucleus.fixedDatastore</name>
   <value>false</value>
</property>

<property>
   <name>javax.jdo.option.ConnectionUserName</name>
   <value>hive</value>
   <description>Username to use against metastore database</description>
</property>

<property>
   <name>hive.exec.local.scratchdir</name>
   <value>/opt/hive/tmp</value>
   <description>Local scratch space for Hive jobs</description>
</property>

<property>
   <name>hive.downloaded.resources.dir</name>
   <value>/opt/hive/tmp</value>
   <description>Temporary local directory for added resources in the remote file system.</description>
</property>

<property>
   <name>hive.querylog.location</name>
   <value>/opt/hive/tmp</value>
   <description>Location of Hive run time structured log file</description>
</property>
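
Because hive-site.xml was copied from the full template, it may still contain ${system:java.io.tmpdir} placeholders in properties other than the ones listed above, which can cause a "Relative path in absolute URI" error when Hive 1.2.x starts; a quick grep (an extra check, not in the original steps) shows whether anything else needs to be pointed at a real directory such as /opt/hive/tmp:

[root@master conf]# grep -n 'system:java.io.tmpdir' hive-site.xml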

4. Copy mysql-connector-java-5.1.43-bin.jar into Hive's lib directory

[root@master ~]# mv mysql-connector-java-5.1.43-bin.jar  /opt/hive/lib/
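
Optionally verify that the driver jar is now where Hive expects it:

[root@master ~]# ls -l /opt/hive/lib/mysql-connector-java-*.jar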

5. Copy jline-2.12.jar into the matching Hadoop directory, otherwise Hive will fail to start

[root@master ~]# cd /opt/hive/lib
[root@master lib]# cp jline-2.12.jar /opt/hadoop-2.8.1/share/hadoop/yarn/lib/
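
Hadoop 2.x usually ships an older jline in the same directory (for example jline-0.9.94.jar); if it is present, move it aside so the newer jar is picked up first. The exact file name here is an assumption and may differ on your build:

[root@master lib]# cd /opt/hadoop-2.8.1/share/hadoop/yarn/lib/
[root@master lib]# ls jline-*.jar
[root@master lib]# mv jline-0.9.94.jar jline-0.9.94.jar.bak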

6. Create the Hive temporary directory

mkdir /opt/hive/tmp
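
With datanucleus.autoCreateSchema set to true, Hive creates the metastore tables in MySQL on first start. Alternatively, the schema can be initialized up front with the schematool that ships with Hive 1.2 (an optional step, not part of the original walkthrough):

[root@master ~]# schematool -dbType mysql -initSchema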

III. Start Hive

1. After Hadoop is running, launch the hive command line

[root@master ~]# hive

2. Test

hive> show databases;
OK
default
Time taken: 0.907 seconds, Fetched: 1 row(s)

If this output appears, the installation succeeded. The metastore can also be run as a standalone service:

[root@master ~]# hive --service metastore

3. Create a table

hive> CREATE TABLE csv_t1(index_code string, name string)
ROW FORMAT SERDE
'org.apache.hadoop.hive.serde2.OpenCSVSerde'
WITH SERDEPROPERTIES (
"separatorChar" = ",",
"quoteChar"     = "'",
"escapeChar"    = "\\")
STORED AS TEXTFILE;

Here separatorChar is the field delimiter (a comma), quoteChar is the quoting character (a single quote), and escapeChar is the escape character (a backslash).

4. Load the data

hive> load data local inpath '/usr/local/test.csv' into table csv_t1;
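
A quick query confirms the rows are visible through the OpenCSVSerde table (assuming /usr/local/test.csv already exists as a comma-separated file matching the index_code,name layout):

hive> select * from csv_t1 limit 10;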

IV. Errors

1. The MySQL encoding is not latin1

FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask. MetaException
(message:javax.jdo.JDODataStoreException: An exception was
thrown while adding/validating class(es) : Specified key was too
long; max key length is 767 bytes
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException:
Specified key was too long; max key length is 767 bytes

Solution

mysql> alter database hive character set latin1;

2. READ-COMMITTED requires the binlog to be written in MIXED format; otherwise, entering hive produces the following error

FAILED: Error in metadata: javax.jdo.JDOException: Couldnt 
obtain a new sequence (unique id) : Binary logging not possible. 
Message: Transaction level 'READ-COMMITTED' in InnoDB is not 
safe for binlog mode 'STATEMENT' 

Solution

mysql> set global binlog_format='MIXED';
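
set global only affects the running server; to keep the setting after a MySQL/MariaDB restart, add it to the server configuration file as well (a sketch, assuming the server reads /etc/my.cnf):

[root@master ~]# vi /etc/my.cnf
[mysqld]
binlog_format=MIXED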