**View the Hive-related parameters**

<hr/>

```shell
[root@hadoop101 /]# sqoop import --help
Hive arguments:
   --create-hive-table                         Automatically create the table in Hive; fails if the
                                               table already exists, so this is rarely used in production
   --hive-database <database-name>             Specify the Hive database
   --hive-delims-replacement <arg>             Replace Hive record \0x01 and row delimiters (\n\r)
                                               from imported string fields with user-defined string
   --hive-drop-import-delims                   Drop Hive record \0x01 and row delimiters (\n\r)
                                               from imported string fields
   --hive-home <dir>                           Override $HIVE_HOME
   --hive-import                               Import data into Hive (uses Hive's default
                                               delimiters if none are set)
   --hive-overwrite                            Overwrite existing data in the Hive table
   --hive-partition-key <partition-key>        Partition column
   --hive-partition-value <partition-value>    Partition value
   --hive-table <table-name>                   The Hive table to import the data into
   --map-column-hive <arg>                     Override mapping for specific column to hive types
```

<br/>

**Import data into a Hive table**

<hr/>

```shell
sqoop import \
--connect jdbc:mysql://hadoop101:3306/sqoop_db \
--table orders \
--username root \
--password 123456 \
--hive-import \
--create-hive-table \
--hive-database h_sqoop_db \
--hive-table orders \
-m 3
```

<br/>

**Import into a partition**

<hr/>

```shell
# Although --target-dir is specified, the directory is not kept: it is only
# a staging location, and the data is moved into the Hive warehouse.
# --hive-table h_sqoop_db.orders_part is equivalent to:
#   --hive-database h_sqoop_db
#   --hive-table orders_part
sqoop import \
--connect jdbc:mysql://hadoop101:3306/sqoop_db \
--query "select order_id, order_status from orders where order_date>='2014-07-24' and order_date<'2014-07-26' and \$CONDITIONS" \
--username root \
--password 123456 \
--target-dir /user/data/orders \
--split-by order_status \
--hive-import \
--hive-table h_sqoop_db.orders_part \
--hive-partition-key "order_date" \
--hive-partition-value "20140724" \
-m 3
```
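Once the partitioned import finishes, the result can be checked from the Hive CLI. A minimal sketch, assuming the import above succeeded and you are connected to the same Hive instance (the table and database names come from the commands above; the output shown is what would be expected, not captured from a run):

```sql
-- List the partitions of the target table; the import above should have
-- added the partition order_date=20140724
SHOW PARTITIONS h_sqoop_db.orders_part;

-- Query only that partition (Hive prunes to the single partition
-- directory instead of scanning the whole table)
SELECT order_id, order_status
FROM h_sqoop_db.orders_part
WHERE order_date = '20140724'
LIMIT 10;
```

Note that the partition column `order_date` is a Hive partition key here, not a regular column: Sqoop writes only `order_id` and `order_status` into the files, and the `20140724` value exists solely in the partition directory name.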