
hadoop-3.0.0-alpha2 is incompatible with apache-hive-2.1.1-bin

Date: 2017-06-26 16:21:20  Author: solgle  Source: solgle.com
MySQL is installed on the master machine; the Hive server is also installed on the master.
 
Download site: https://mirrors.cnnic.cn/apache/hive
 
 
[hodp@nameNode u01]$ tar -vxf apache-hive-2.1.1-bin.tar.gz -C  ./hive/
[hodp@nameNode hive]$ ls
apache-hive-2.1.1-bin
[hodp@nameNode hive]$ 
 
 
 
--Configure environment variables
[root@nameNode ~]# vi /etc/profile
 
export HIVE_HOME=/u01/hive/apache-hive-2.1.1-bin
export PATH=$PATH:$HIVE_HOME/bin
 
 
[hodp@nameNode hive]$ source /etc/profile
[hodp@nameNode hive]$ 
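As a quick sanity check, something like the following confirms the two profile lines took effect in the current shell (a sketch, using the same paths as this article):

```shell
#!/bin/sh
# Re-create the two /etc/profile lines and verify that the Hive
# binary directory actually ended up on PATH.
export HIVE_HOME=/u01/hive/apache-hive-2.1.1-bin
export PATH=$PATH:$HIVE_HOME/bin

case ":$PATH:" in
  *":$HIVE_HOME/bin:"*) echo "HIVE_HOME/bin is on PATH" ;;
  *)                    echo "HIVE_HOME/bin is missing" ;;
esac
```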
 
--Install MySQL
[root@nameNode ~]# groupadd mysql
[root@nameNode ~]# useradd mysql -g  mysql
 
---Install the MySQL packages
[root@nameNode Packages]# yum install mysql-server-5.1.73-5.el6_6.x86_64.rpm 
[root@nameNode Packages]# yum install mysql-devel-5.1.73-5.el6_6.x86_64.rpm 
 
(intermediate steps omitted...)
-----------------------------------------------------------------------------
 
Create the Hive user
[hodp@nameNode bin]$ mysql -h 192.168.146.153 -u root -p
Enter password: 
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 9
Server version: 5.1.73 Source distribution
 
Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.
 
Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.
 
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
mysql> 
 
mysql> create user hive identified by 'hive';
Query OK, 0 rows affected (0.26 sec)
 
mysql> grant all privileges on *.* to 'hive'@'%' with grant option;
Query OK, 0 rows affected (0.00 sec)
 
mysql> flush privileges;
Query OK, 0 rows affected (0.07 sec)
 
mysql> 
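Note that the grant above gives the hive account full rights on every database, plus grant option. For a metastore account that is broader than needed; a tighter alternative (a sketch, same host and root login as above) limits it to the metastore database only:

```shell
# Sketch: grant the hive account privileges only on the future `hive`
# metastore database, instead of *.* WITH GRANT OPTION.
mysql -h 192.168.146.153 -u root -p <<'SQL'
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'%';
FLUSH PRIVILEGES;
SQL
```

This requires a running MySQL server, so it is shown here only as the statements to issue.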
 
 
Log in as the hive user
[hadoop@hadoop-master ~]$ mysql -h 192.168.146.153 -u hive -p
 
Create the Hive database
mysql> create database hive;
 
Configure the Hive configuration file
[hodp@nameNode conf]$ pwd
/u01/hive/apache-hive-2.1.1-bin/conf
[hodp@nameNode conf]$ 
---Create hive-site.xml in this directory
[hodp@nameNode conf]$ vi hive-site.xml 
<configuration>
    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://hadoop-master:3306/hive?createDatabaseIfNotExist=true</value>
        <description>JDBC connect string for a JDBC metastore</description>    
    </property>   
    <property> 
        <name>javax.jdo.option.ConnectionDriverName</name> 
        <value>com.mysql.jdbc.Driver</value> 
        <description>Driver class name for a JDBC metastore</description>     
    </property>               
 
    <property> 
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
        <description>username to use against metastore database</description>
    </property>
    <property>  
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>hive</value>
        <description>password to use against metastore database</description>  
    </property>          
</configuration>
 
--Download and install the MySQL JDBC driver
[hodp@nameNode ~]$ cp mysql-connector-java-commercial-5.1.25-bin.jar /u01/hive/apache-hive-2.1.1-bin/lib
[hodp@nameNode ~]$ 
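One step the article does not show: with Hive 2.x the metastore schema normally has to be created in MySQL before the services are started for the first time, or startup fails with schema errors. A sketch using the schematool utility bundled with Hive and the JDBC settings from hive-site.xml above:

```shell
# Create the metastore tables in the `hive` MySQL database, using the
# connection settings from hive-site.xml (schematool ships with Hive 2.x).
$HIVE_HOME/bin/schematool -dbType mysql -initSchema
```

This needs the configured MySQL server to be reachable, so it is shown as the command to run rather than a self-contained example.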
 
--Configure the Hive clients
[hodp@nameNode u01]$ scp -r ./hive/  hodp@dataNode1:/u01/
 
[hodp@nameNode u01]$ scp -r ./hive/  hodp@dataNode2:/u01/
 
Configure hive-site.xml on each datanode:
<configuration>
    <property>
        <name>hive.metastore.uris</name>
        <value>thrift://hadoop-master:9083</value>
    </property>
</configuration>
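Before pointing clients at the metastore, a quick reachability check of the thrift port from a datanode can save debugging time (a sketch; hostname and port taken from the config above, and it assumes `nc` is installed):

```shell
# Probe the metastore's thrift port from this datanode without
# sending any data; `nc -z` only reports whether the port answers.
nc -z hadoop-master 9083 && echo "metastore port reachable"
```

This only tells you the port is open; a full client check would be `hive -e "show databases;"` once the metastore is up.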
 
 
---Start and test
[hodp@nameNode conf]$ hive --service metastore &
[1] 5939
[hodp@nameNode conf]$ Cannot find hadoop installation: $HADOOP_HOME or $HADOOP_PREFIX must be set or hadoop must be in the path
 
---Adjust the environment variables: append $HADOOP_HOME/bin to PATH (HADOOP_HOME itself must already be exported, or the same error recurs)
PATH=$PATH:$HOME/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:$HADOOP_HOME/bin
 
 
export PATH
 
 
---Test again
[hodp@nameNode conf]$ hive --service metastore &
[1] 3029
[hodp@nameNode conf]$ 
[hodp@nameNode conf]$ which: no hbase in (/u01/jdk1.8.0_131/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/u01/hive/apache-hive-2.1.1-bin/bin:/home/hodp/bin:/home/hodp/bin:/u01/hadoop-3.0.0-alpha2//bin:/home/hodp/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/u01/hadoop-3.0.0-alpha2//bin)
Starting Hive Metastore Server
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/u01/hive/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/u01/hadoop-3.0.0-alpha2/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
[hodp@nameNode conf]$ Exception in thread "main" java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 3.0.0-alpha2
at org.apache.hadoop.hive.shims.ShimLoader.getMajorVersion(ShimLoader.java:169)
at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:136)
at org.apache.hadoop.hive.shims.ShimLoader.getHadoopThriftAuthBridge(ShimLoader.java:122)
at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:6664)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:239)
at org.apache.hadoop.util.RunJar.main(RunJar.java:153)
 
---Result: the versions are incompatible, and this Hive is already the latest release available at the moment. Some people suggest using CDH instead.
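The stack trace shows where it dies: ShimLoader.getMajorVersion takes the leading dotted component of the Hadoop version string and looks up a compatibility shim for it, and this Hive release has no shim mapped to major version 3. The failing check can be sketched in shell:

```shell
#!/bin/sh
# Sketch of the check that fails: keep everything before the first '.'
# in the reported Hadoop version and treat it as the major version.
version="3.0.0-alpha2"      # value this Hadoop build reports
major=${version%%.*}        # -> "3", which ShimLoader does not recognize
echo "major version: $major"
```

So the fix is either to run Hive 2.1.1 against a Hadoop 2.x cluster, or to wait for a Hive release that ships Hadoop 3 shims.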
 
 
Tags: hive incompatibility

solgle.com, all rights reserved. Sharing welcome!
