Sqoop 1.99 Cluster Installation and Migrating MySQL to Hive

I. Preparation

1. Install the Hadoop, Hive, and HBase clusters and set the following environment variables:

export HADOOP_HOME=/soft/hadoop/hadoop-2.9.2
export HBASE_HOME=/soft/hbase/hbase-2.1.6
export HIVE_HOME=/soft/hive/apache-hive-2.3.6-bin
export SQOOP_HOME=/soft/sqoop/sqoop-1.99.7-bin-hadoop200
export JAVA_HOME=/soft/jdk/jdk1.8.0_211
export HADOOP_COMMON_HOME=$HADOOP_HOME/share/hadoop/common
export HADOOP_HDFS_HOME=$HADOOP_HOME/share/hadoop/hdfs
export HADOOP_MAPRED_HOME=$HADOOP_HOME/share/hadoop/mapreduce
export HADOOP_YARN_HOME=$HADOOP_HOME/share/hadoop/yarn

export PATH=$PATH:$HOME/bin:$HADOOP_HOME/bin:$HBASE_HOME/bin:$HIVE_HOME/bin:$SQOOP_HOME/bin:$JAVA_HOME/bin

export SQOOP_SERVER_EXTRA_LIB=$SQOOP_HOME/extra
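
A quick check that the variables are in effect (a minimal sketch, assuming the exports above were appended to ~/.bash_profile of the user that will run the Sqoop server):

source ~/.bash_profile
echo $SQOOP_SERVER_EXTRA_LIB   # should print /soft/sqoop/sqoop-1.99.7-bin-hadoop200/extra
which sqoop2-server            # should resolve under $SQOOP_HOME/bin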

2. Add the following to Hadoop's core-site.xml (replace $SERVER_USER with the system user that runs the Sqoop 2 server) and distribute the file to all nodes:

<property>
  <name>hadoop.proxyuser.$SERVER_USER.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.$SERVER_USER.groups</name>
  <value>*</value>
</property>
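
If the cluster is already running, the proxy-user change can be picked up without a full restart (a sketch; restarting HDFS and YARN also works):

# reload the proxyuser (superuser) settings on the NameNode and ResourceManager
hdfs dfsadmin -refreshSuperUserGroupsConfiguration
yarn rmadmin -refreshSuperUserGroupsConfiguration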

3. Copy the MySQL JDBC driver jar into the $SQOOP_HOME/extra directory (the SQOOP_SERVER_EXTRA_LIB directory set above).
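
For example (the jar path and version below are only placeholders; use the connector jar that matches your MySQL server):

mkdir -p $SQOOP_SERVER_EXTRA_LIB
cp /path/to/mysql-connector-java-5.1.47.jar $SQOOP_SERVER_EXTRA_LIB/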

4. Configure sqoop_bootstrap.properties and sqoop.properties under $SQOOP_HOME/conf:

sqoop_bootstrap.properties

sqoop.config.provider=org.apache.sqoop.core.PropertiesConfigurationProvider

sqoop.properties

org.apache.sqoop.submission.engine.mapreduce.configuration.directory=/soft/hadoop/hadoop-2.9.2/etc/hadoop
org.apache.sqoop.security.authentication.type=SIMPLE
org.apache.sqoop.security.authentication.handler=org.apache.sqoop.security.authentication.SimpleAuthenticationHandler
org.apache.sqoop.security.authentication.anonymous=true
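
A quick sanity check on the Sqoop server host: the MapReduce configuration directory set above must exist and contain the cluster's *-site.xml files:

ls /soft/hadoop/hadoop-2.9.2/etc/hadoop/*-site.xml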

5. Verify the configuration

sqoop2-tool verify

6. Start the Sqoop 2 server

sqoop2-server start
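
Once the server is up, it can be reached from the Sqoop 2 shell on any client node. A minimal sketch (sqoop-server-host is a placeholder; 12000 is the default server port):

sqoop2-shell
# inside the shell (sqoop:000> prompt):
set server --host sqoop-server-host --port 12000 --webapp sqoop
show version --all     # should report both client and server versions
show connector         # lists the connectors available for creating links

From there the MySQL-to-Hive transfer is defined interactively: create link for the source and target sides (one per connector), then create job and start job.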
