An example of importing MySQL data into Hive
Hive is a handy data warehouse tool. Below I show how to import MySQL data into Hive with Sqoop, for anyone who needs it.
The following is an example of importing data from MySQL into Hive.
--hive-import tells Sqoop to import into Hive, --create-hive-table tells it to create the Hive table, and --hive-table specifies the Hive table name.
- [zhouhh@Hadoop46 ~]$ sqoop import --connect jdbc:mysql://Hadoop48/toplists --verbose -m 1 --username root --hive-overwrite --direct --table award --hive-import --create-hive-table --hive-table mysql_award --fields-terminated-by '\t' --lines-terminated-by '\n' --append
- 12/07/20 16:02:23 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
- 12/07/20 16:02:23 INFO tool.CodeGenTool: Beginning code generation
- 12/07/20 16:02:23 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `award` AS t LIMIT 1
- 12/07/20 16:02:24 INFO orm.CompilationManager: HADOOP_HOME is /home/zhouhh/hadoop-1.0.0/libexec/..
Note: /tmp/sqoop-zhouhh/compile/2fe3efbc94924ad6391b948ef8f8254f/award.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
- 12/07/20 16:02:25 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-zhouhh/compile/2fe3efbc94924ad6391b948ef8f8254f/award.java to /home/zhouhh/./award.java
- 12/07/20 16:02:25 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-zhouhh/compile/2fe3efbc94924ad6391b948ef8f8254f/award.jar
- 12/07/20 16:02:25 INFO manager.DirectMySQLManager: Beginning mysqldump fast path import
- 12/07/20 16:02:25 INFO mapreduce.ImportJobBase: Beginning import of award
- 12/07/20 16:02:27 INFO mapred.JobClient: Running job: job_201207191159_0322
- 12/07/20 16:02:28 INFO mapred.JobClient: map 0% reduce 0%
- 12/07/20 16:02:41 INFO mapred.JobClient: map 100% reduce 0%
- 12/07/20 16:02:46 INFO mapred.JobClient: Job complete: job_201207191159_0322
- 12/07/20 16:02:46 INFO mapred.JobClient: Counters: 18
- 12/07/20 16:02:46 INFO mapred.JobClient: Job Counters
- 12/07/20 16:02:46 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=12849
- 12/07/20 16:02:46 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
- 12/07/20 16:02:46 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
- 12/07/20 16:02:46 INFO mapred.JobClient: Launched map tasks=1
- 12/07/20 16:02:46 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=0
- 12/07/20 16:02:46 INFO mapred.JobClient: File Output Format Counters
- 12/07/20 16:02:46 INFO mapred.JobClient: Bytes Written=208
- 12/07/20 16:02:46 INFO mapred.JobClient: FileSystemCounters
- 12/07/20 16:02:46 INFO mapred.JobClient: HDFS_BYTES_READ=87
- 12/07/20 16:02:46 INFO mapred.JobClient: FILE_BYTES_WRITTEN=30543
- 12/07/20 16:02:46 INFO mapred.JobClient: HDFS_BYTES_WRITTEN=208
- 12/07/20 16:02:46 INFO mapred.JobClient: File Input Format Counters
- 12/07/20 16:02:46 INFO mapred.JobClient: Bytes Read=0
- 12/07/20 16:02:46 INFO mapred.JobClient: Map-Reduce Framework
- 12/07/20 16:02:46 INFO mapred.JobClient: Map input records=1
- 12/07/20 16:02:46 INFO mapred.JobClient: Physical memory (bytes) snapshot=78295040
- 12/07/20 16:02:46 INFO mapred.JobClient: Spilled Records=0
- 12/07/20 16:02:46 INFO mapred.JobClient: CPU time spent (ms)=440
- 12/07/20 16:02:46 INFO mapred.JobClient: Total committed heap usage (bytes)=56623104
- 12/07/20 16:02:46 INFO mapred.JobClient: Virtual memory (bytes) snapshot=901132288
- 12/07/20 16:02:46 INFO mapred.JobClient: Map output records=44
- 12/07/20 16:02:46 INFO mapred.JobClient: SPLIT_RAW_BYTES=87
- 12/07/20 16:02:46 INFO mapreduce.ImportJobBase: Transferred 208 bytes in 20.349 seconds (10.2216 bytes/sec)
- 12/07/20 16:02:46 INFO mapreduce.ImportJobBase: Retrieved 44 records.
- 12/07/20 16:02:46 INFO util.AppendUtils: Creating missing output directory - award
- 12/07/20 16:02:46 INFO hive.HiveImport: Removing temporary files from import process: award/_logs
- 12/07/20 16:02:46 INFO hive.HiveImport: Loading uploaded data into Hive
- 12/07/20 16:02:46 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `award` AS t LIMIT 1
- 12/07/20 16:02:48 INFO hive.HiveImport: WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.
- 12/07/20 16:02:48 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/home/zhouhh/hive-0.8.1/lib/hive-common-0.8.1.jar!/hive-log4j.properties
- 12/07/20 16:02:48 INFO hive.HiveImport: Hive history file=/home/zhouhh/hive-0.8.1/logs/hive_job_log_zhouhh_201207201602_1448253330.txt
- 12/07/20 16:02:53 INFO hive.HiveImport: OK
- 12/07/20 16:02:53 INFO hive.HiveImport: Time taken: 4.322 seconds
- 12/07/20 16:02:53 INFO hive.HiveImport: Loading data to table default.mysql_award
- 12/07/20 16:02:53 INFO hive.HiveImport: Deleted hdfs://Hadoop46:9200/user/hive/warehouse/mysql_award
- 12/07/20 16:02:53 INFO hive.HiveImport: OK
- 12/07/20 16:02:53 INFO hive.HiveImport: Time taken: 0.28 seconds
- 12/07/20 16:02:53 INFO hive.HiveImport: Hive import complete.
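The --fields-terminated-by '\t' and --lines-terminated-by '\n' options in the command above determine the plain-text layout Sqoop writes to HDFS, which Hive's default text SerDe then splits back into columns. A minimal Python sketch of that layout, using made-up rows rather than the actual award table contents:

```python
# Hypothetical rows, standing in for records pulled from the MySQL table.
rows = [
    ("2012-04-27 06:55:00", "1001", "ios"),
    ("2012-04-27 06:55:00", "1001", "android"),
]

# Join fields with a tab and records with a newline - the same shape that
# --fields-terminated-by '\t' / --lines-terminated-by '\n' produce, and
# that the Hive table created by --create-hive-table expects.
hdfs_text = "\n".join("\t".join(fields) for fields in rows) + "\n"

print(hdfs_text)
```

If a column value can itself contain tabs or newlines, these terminators would corrupt the rows, which is one reason to pick terminators that cannot occur in the data.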
Query in Hive to confirm the data was imported successfully:
- hive> select * from mysql_award;
- OK
- 2012-04-27 06:55:00:4027136295947433203828240271024027136291001NULL715878221内容内容Aios
- 2012-04-27 06:55:00:4067885597784332039301940177804067885591001113835155880亲牛牛旦旦android
- Time taken: 0.368 seconds
Since the whole pipeline is UTF-8, no garbled-character problems were encountered.
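The point above can be sketched in a few lines: as long as MySQL, Sqoop, and Hive all treat the bytes as UTF-8, multi-byte values like the Chinese strings in the query output survive the round trip unchanged. The value below is taken from the imported rows purely for illustration:

```python
value = "亲牛牛旦旦"                 # a Chinese value from the imported rows

encoded = value.encode("utf-8")      # bytes as written to HDFS by the import
decoded = encoded.decode("utf-8")    # bytes as read back by Hive

# A consistent encoding on both sides means a lossless round trip;
# decoding those same bytes as e.g. latin-1 is what produces mojibake.
assert decoded == value
print(decoded)
```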