Importing MySQL Data into Hive: A Worked Example


Hive is a handy data warehousing tool. Below I walk through importing MySQL data into Hive with Sqoop, for anyone who needs it.

Here is the example of importing data from MySQL into Hive.

--hive-import tells Sqoop to import into Hive, --create-hive-table tells it to create the Hive table, and --hive-table specifies the Hive table name.
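The two delimiter flags, --fields-terminated-by and --lines-terminated-by, decide how rows land in HDFS. Their effect can be sanity-checked in isolation; a minimal sketch (the sample rows below are invented, not the real award data):

```shell
# Hypothetical tab-delimited, newline-terminated rows, mimicking the
# layout Sqoop writes with --fields-terminated-by '\t' and
# --lines-terminated-by '\n' (sample values are made up):
printf '1\tfoo\t100\n2\tbar\t200\n' > /tmp/award_sample.txt

# Split on tabs the way Hive's default delimited SerDe would,
# printing the field count for each row (3 for both lines here):
awk -F'\t' '{ print NF }' /tmp/award_sample.txt
```

If the field counts vary between rows, the source data likely contains embedded tabs or newlines and needs escaping or different delimiters.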

```
[zhouhh@Hadoop46 ~]$ sqoop import --connect jdbc:mysql://Hadoop48/toplists --verbose -m 1 \
    --username root --hive-overwrite --direct --table award --hive-import --create-hive-table \
    --hive-table mysql_award --fields-terminated-by '\t' --lines-terminated-by '\n' --append
12/07/20 16:02:23 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
12/07/20 16:02:23 INFO tool.CodeGenTool: Beginning code generation
12/07/20 16:02:23 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `award` AS t LIMIT 1
12/07/20 16:02:24 INFO orm.CompilationManager: HADOOP_HOME is /home/zhouhh/hadoop-1.0.0/libexec/..
```

Note: /tmp/sqoop-zhouhh/compile/2fe3efbc94924ad6391b948ef8f8254f/award.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

```
12/07/20 16:02:25 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-zhouhh/compile/2fe3efbc94924ad6391b948ef8f8254f/award.java to /home/zhouhh/./award.java
12/07/20 16:02:25 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-zhouhh/compile/2fe3efbc94924ad6391b948ef8f8254f/award.jar
12/07/20 16:02:25 INFO manager.DirectMySQLManager: Beginning mysqldump fast path import
12/07/20 16:02:25 INFO mapreduce.ImportJobBase: Beginning import of award
12/07/20 16:02:27 INFO mapred.JobClient: Running job: job_201207191159_0322
12/07/20 16:02:28 INFO mapred.JobClient: map 0% reduce 0%
12/07/20 16:02:41 INFO mapred.JobClient: map 100% reduce 0%
12/07/20 16:02:46 INFO mapred.JobClient: Job complete: job_201207191159_0322
12/07/20 16:02:46 INFO mapred.JobClient: Counters: 18
12/07/20 16:02:46 INFO mapred.JobClient: Job Counters
12/07/20 16:02:46 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=12849
12/07/20 16:02:46 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
12/07/20 16:02:46 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
12/07/20 16:02:46 INFO mapred.JobClient: Launched map tasks=1
12/07/20 16:02:46 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=0
12/07/20 16:02:46 INFO mapred.JobClient: File Output Format Counters
12/07/20 16:02:46 INFO mapred.JobClient: Bytes Written=208
12/07/20 16:02:46 INFO mapred.JobClient: FileSystemCounters
12/07/20 16:02:46 INFO mapred.JobClient: HDFS_BYTES_READ=87
12/07/20 16:02:46 INFO mapred.JobClient: FILE_BYTES_WRITTEN=30543
12/07/20 16:02:46 INFO mapred.JobClient: HDFS_BYTES_WRITTEN=208
12/07/20 16:02:46 INFO mapred.JobClient: File Input Format Counters
12/07/20 16:02:46 INFO mapred.JobClient: Bytes Read=0
12/07/20 16:02:46 INFO mapred.JobClient: Map-Reduce Framework
12/07/20 16:02:46 INFO mapred.JobClient: Map input records=1
12/07/20 16:02:46 INFO mapred.JobClient: Physical memory (bytes) snapshot=78295040
12/07/20 16:02:46 INFO mapred.JobClient: Spilled Records=0
12/07/20 16:02:46 INFO mapred.JobClient: CPU time spent (ms)=440
12/07/20 16:02:46 INFO mapred.JobClient: Total committed heap usage (bytes)=56623104
12/07/20 16:02:46 INFO mapred.JobClient: Virtual memory (bytes) snapshot=901132288
12/07/20 16:02:46 INFO mapred.JobClient: Map output records=44
12/07/20 16:02:46 INFO mapred.JobClient: SPLIT_RAW_BYTES=87
12/07/20 16:02:46 INFO mapreduce.ImportJobBase: Transferred 208 bytes in 20.349 seconds (10.2216 bytes/sec)
12/07/20 16:02:46 INFO mapreduce.ImportJobBase: Retrieved 44 records.
12/07/20 16:02:46 INFO util.AppendUtils: Creating missing output directory - award
12/07/20 16:02:46 INFO hive.HiveImport: Removing temporary files from import process: award/_logs
12/07/20 16:02:46 INFO hive.HiveImport: Loading uploaded data into Hive
12/07/20 16:02:46 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `award` AS t LIMIT 1
12/07/20 16:02:48 INFO hive.HiveImport: WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.
12/07/20 16:02:48 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/home/zhouhh/hive-0.8.1/lib/hive-common-0.8.1.jar!/hive-log4j.properties
12/07/20 16:02:48 INFO hive.HiveImport: Hive history file=/home/zhouhh/hive-0.8.1/logs/hive_job_log_zhouhh_201207201602_1448253330.txt
12/07/20 16:02:53 INFO hive.HiveImport: OK
12/07/20 16:02:53 INFO hive.HiveImport: Time taken: 4.322 seconds
12/07/20 16:02:53 INFO hive.HiveImport: Loading data to table default.mysql_award
12/07/20 16:02:53 INFO hive.HiveImport: Deleted hdfs://Hadoop46:9200/user/hive/warehouse/mysql_award
12/07/20 16:02:53 INFO hive.HiveImport: OK
12/07/20 16:02:53 INFO hive.HiveImport: Time taken: 0.28 seconds
12/07/20 16:02:53 INFO hive.HiveImport: Hive import complete.
```

Querying in Hive shows the data was imported successfully:

```
hive> select * from mysql_award;
OK
2012-04-27 06:55:00: 4027136295947433203828240271024027136291001 NULL 715878221 内容内容A ios
2012-04-27 06:55:00: 4067885597784332039301940177804067885591001113835155880 亲牛牛旦旦 android
Time taken: 0.368 seconds
```
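The Sqoop log reported "Retrieved 44 records", so a row count on the Hive side is an easy cross-check; assuming the same session, it should come back as 44:

```sql
hive> select count(*) from mysql_award;
```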

Since everything here is UTF-8 based, no garbled-character (mojibake) problems came up.
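If the source database were not UTF-8, text columns would need converting before the load. A couple of quick iconv checks, purely illustrative since the import above needed no conversion:

```shell
# Valid UTF-8 passes through an iconv round trip unchanged:
printf '内容A\n' | iconv -f UTF-8 -t UTF-8

# A byte pair that is not valid UTF-8 makes iconv fail, which is a
# cheap way to spot a wrongly-encoded dump before loading it:
printf '\xc4\xda' | iconv -f UTF-8 -t UTF-8 >/dev/null 2>&1 || echo 'not UTF-8'
```

The same idea scales to whole dump files, e.g. `iconv -f GBK -t UTF-8` on an export from a legacy-encoded database.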

2019/10/10 17:35:47