Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected

Jacos, posted 2014/04/18 09:58

Environment: Hadoop 2.2, Sqoop 1.44-2.0.

When running

sqoop import --connect jdbc:mysql://192.168.1.107:3306/digta --username root --password hadoop --table department --columns "id,name,describes"

the following error is reported:

Exception in thread "main" java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected

###############

word hadoop --table department --columns "id,name,describes"
14/04/17 22:03:23 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/04/17 22:03:23 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/04/17 22:03:23 INFO tool.CodeGenTool: Beginning code generation
14/04/17 22:03:24 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `department` AS t LIMIT 1
14/04/17 22:03:24 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `department` AS t LIMIT 1
14/04/17 22:03:24 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /data1/hadoop-2.2.0/mr
Note: /tmp/sqoop-hadoop/compile/f657f722493059c920779b863d8e8c94/department.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/04/17 22:03:26 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/f657f722493059c920779b863d8e8c94/department.jar
14/04/17 22:03:26 WARN manager.MySQLManager: It looks like you are importing from mysql.
14/04/17 22:03:26 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
14/04/17 22:03:26 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
14/04/17 22:03:26 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
14/04/17 22:03:26 INFO mapreduce.ImportJobBase: Beginning import of department
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data1/hadoop-2.2.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data1/hbase098/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
14/04/17 22:03:26 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
14/04/17 22:03:27 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
14/04/17 22:03:27 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.1.103:8032
14/04/17 22:03:29 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/hadoop/.staging/job_1397453348622_0005
Exception in thread "main" java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
    at org.apache.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:53)
    at com.cloudera.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:36)
    at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:121)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:491)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:508)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:392)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:186)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:159)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:239)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:600)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:413)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
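
Background on the error itself: in Hadoop 0.x/1.x, org.apache.hadoop.mapreduce.JobContext is a concrete class, while in Hadoop 2.x it became an interface, so a Sqoop build compiled against the old API fails at job submission with exactly this IncompatibleClassChangeError. A quick way to confirm which variant the cluster actually loads (a minimal check, assuming the hadoop and javap commands are on the PATH):

# On Hadoop 2.x the declaration line reads "public interface ...";
# "public class ..." means an old 0.x/1.x JobContext is on the classpath.
javap -classpath "$(hadoop classpath)" org.apache.hadoop.mapreduce.JobContext | grep -E 'public (class|interface)'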


千斤难买春秋醉

Was the code built against Hadoop 0.x while your Hadoop is 2.x?

Jacos
Reply to @闵开慧: Solved. It was a version problem; switching to the Sqoop 1.4.3 build for Hadoop 2 fixed it (see the sketch after these comments).
闵开慧
@Jacos Did you get this solved? I'm running into the same problem.
千斤难买春秋醉
@Jacos Did you write the code yourself?
Jacos
The code is 0.x? Hadoop here is 2.2.
闵开慧
You can refer to this: http://my.oschina.net/mkh/blog/264112
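
A sketch of the fix described in the replies: install a Sqoop 1.4.x binary that was built against Hadoop 2 and point it at the existing Hadoop 2.2 installation. The tarball below is the Hadoop-2 build of Sqoop 1.4.4 as an example; check http://archive.apache.org/dist/sqoop/ for the exact file for your version, and adjust the /data1 paths to the layout shown in the log above.

# Fetch a Sqoop build compiled against Hadoop 2 (file name is an example; verify on the Apache archive).
wget http://archive.apache.org/dist/sqoop/1.4.4/sqoop-1.4.4.bin__hadoop-2.0.4-alpha.tar.gz
tar -xzf sqoop-1.4.4.bin__hadoop-2.0.4-alpha.tar.gz -C /data1

# Point Sqoop at the Hadoop 2.2 installation (adjust to your own paths).
export SQOOP_HOME=/data1/sqoop-1.4.4.bin__hadoop-2.0.4-alpha
export HADOOP_COMMON_HOME=/data1/hadoop-2.2.0
export HADOOP_MAPRED_HOME=/data1/hadoop-2.2.0

# Re-run the import with the Hadoop-2 build; -P prompts for the password
# instead of putting it on the command line (the log above warns about that).
$SQOOP_HOME/bin/sqoop import --connect jdbc:mysql://192.168.1.107:3306/digta \
  --username root -P --table department --columns "id,name,describes"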