**1. Prerequisites**

Installing Sqoop assumes that Java, Hadoop, Zookeeper, MySQL, Hive, and HBase are already set up. The other services do not have to be running, but MySQL must be started.

**2. Download and extract**

1. Upload the installation package sqoop-1.4.6-cdh5.14.2.tar.gz to the virtual machine.
2. Extract the Sqoop package to the target directory:

```shell
[root@hadoop101 software]# tar -zxf sqoop-1.4.6-cdh5.14.2.tar.gz -C /opt/install/
```

3. Create a symbolic link:

```shell
[root@hadoop101 software]# ln -s /opt/install/sqoop-1.4.6-cdh5.14.2/ /opt/install/sqoop
```

4. Configure the environment variables:

```shell
[root@hadoop101 install]# vim /etc/profile
# Add the following lines:
export SQOOP_HOME=/opt/install/sqoop
export PATH=$SQOOP_HOME/bin:$PATH
# Reload the profile so the changes take effect:
[root@hadoop101 install]# source /etc/profile
```

**3. Edit the configuration file**

As with most big-data frameworks, Sqoop's configuration lives in the conf directory under the Sqoop root.

1. Rename the template configuration file:

```shell
[root@hadoop101 conf]# mv sqoop-env-template.sh sqoop-env.sh
```

2. In {sqoop_home}/conf/sqoop-env.sh, point Sqoop at the Hadoop, Hive, Zookeeper, and HBase installations:

```shell
[root@hadoop101 conf]# vim sqoop-env.sh
# Add the following lines:
export HADOOP_COMMON_HOME=/opt/install/hadoop
export HADOOP_MAPRED_HOME=/opt/install/hadoop
export HIVE_HOME=/opt/install/hive
export ZOOKEEPER_HOME=/opt/install/zookeeper
export ZOOCFGDIR=/opt/install/zookeeper/conf
export HBASE_HOME=/opt/install/hbase
```

**4. Copy the JDBC driver**

Copy the MySQL JDBC driver into Sqoop's lib directory. The driver jar can be downloaded from the Maven repository: https://mvnrepository.com/artifact/mysql/mysql-connector-java

```shell
[root@hadoop101 software]# cp mysql-connector-java-5.1.27.jar /opt/install/sqoop/lib/
```

**5. Verify Sqoop**

```shell
[root@hadoop101 lib]# sqoop help
Warning: /opt/install/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/install/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
20/12/22 23:33:14 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.14.2
usage: sqoop COMMAND [ARGS]

Available commands:
  codegen            Generate code to interact with database records
  create-hive-table  Import a table definition into Hive
  eval               Evaluate a SQL statement and display the results
  export             Export an HDFS directory to a database table
  help               List available commands
  import             Import a table from a database to HDFS
  import-all-tables  Import tables from a database to HDFS
  import-mainframe   Import datasets from a mainframe server to HDFS
  job                Work with saved jobs
  list-databases     List available databases on a server
  list-tables        List available tables in a database
  merge              Merge results of incremental imports
  metastore          Run a standalone Sqoop metastore
  version            Display version information

See 'sqoop help COMMAND' for information on a specific command.
```

Output like the above means the installation succeeded.

**6. Test that Sqoop can connect to the database**

```shell
[root@hadoop101 lib]# sqoop list-databases --connect jdbc:mysql://hadoop101:3306/ --username root --password 123456
Warning: /opt/install/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/install/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
20/12/22 23:35:16 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.14.2
20/12/22 23:35:16 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
20/12/22 23:35:16 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/install/hadoop-2.6.0-cdh5.14.2/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/install/hbase-1.2.0-cdh5.14.2/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
information_schema
hive
mysql
performance_schema
test
```

If the database list appears, the connection succeeded. At this point, the Sqoop installation is complete!

Three more jars still need to be placed in Sqoop's lib directory so that data-migration jobs do not fail:

```shell
# (1) Download java-json.jar from the web and upload it to {sqoop_home}/lib:
[root@hadoop101 software]# cp java-json.jar /opt/install/sqoop/lib/
# (2) Copy {hive_home}/lib/hive-common-*.jar into {sqoop_home}/lib:
[root@hadoop101 lib]# cp hive-common-1.1.0-cdh5.14.2.jar /opt/install/sqoop/lib/
# (3) Copy {hive_home}/lib/hive-exec-*.jar into {sqoop_home}/lib:
[root@hadoop101 lib]# cp hive-exec-1.1.0-cdh5.14.2.jar /opt/install/sqoop/lib/
```
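With the driver and extra jars in place, a typical data-migration run imports a MySQL table into HDFS. The sketch below is a hypothetical example, not part of the original tutorial: the database `test` exists on this cluster, but the table `student` and the target directory are assumptions you would replace with your own.

```shell
# Hypothetical import of a MySQL table `student` into HDFS.
# -P prompts for the password instead of exposing it on the command line
# (avoiding the insecurity warning seen above);
# --num-mappers 1 runs a single map task, so no split-by column is needed.
sqoop import \
  --connect jdbc:mysql://hadoop101:3306/test \
  --username root \
  -P \
  --table student \
  --target-dir /user/root/student \
  --delete-target-dir \
  --num-mappers 1 \
  --fields-terminated-by '\t'
```

All of the flags shown (`--connect`, `--table`, `--target-dir`, `--delete-target-dir`, `--num-mappers`, `--fields-terminated-by`, `-P`) are standard Sqoop 1.4.x import arguments; `--delete-target-dir` simply removes the target directory first so the job can be rerun.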