
hadoop - Sqoop and Java 7

I'm trying to use Sqoop to import a MySQL table into HDFS. I'm using JDK 1.7.0_45 and CDH4.4. I'm actually using Cloudera's pre-built VM, except I changed the JDK to 1.7 because I wanted to use the PyDev plugin for Eclipse. My Sqoop version is 1.4.3-cdh4.4.0.

When I run sqoop I get this exception:

Error: commodity : Unsupported major.minor version 51.0

I have seen this error in the past when I (1) compiled code for Java 7 and then (2) ran the application with Java 6.

But that is not what I am doing this time. I believe my Sqoop version was compiled for Java 6, and I'm running it with Java 7, which should be perfectly fine. I suspect Hadoop is launching the mapper processes with JDK 6, but I have no idea how to change that. I skimmed through the mapred configuration documentation and did not see any way to set the Java version used for map tasks.
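I haven't verified that this is the right way to check, but as a quick sanity check of which java binary the MapReduce daemons were actually started with (the MRv1 daemon class names below are my assumption for CDH4), something like this could be run on the VM:

# Show the java binaries the jobtracker/tasktracker processes were launched from.
ps -ef | grep -E 'JobTracker|TaskTracker' | grep -v grep | grep -o '[^ ]*bin/java' | sort -u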

Here is the relevant console output:

[cloudera@localhost ~]$ echo $JAVA_HOME
/usr/java/latest
[cloudera@localhost ~]$ java -version
java version "1.7.0_45"
Java(TM) SE Runtime Environment (build 1.7.0_45-b18)
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)
[cloudera@localhost ~]$ sqoop version
Sqoop 1.4.3-cdh4.4.0
git commit id 2cefe4939fd464ba11ef63e81f46bbaabf1f5bc6
Compiled by jenkins on Tue Sep  3 20:41:55 PDT 2013
[cloudera@localhost ~]$ hadoop version
Hadoop 2.0.0-cdh4.4.0
Subversion file:///data/1/jenkins/workspace/generic-package-rhel64-6-0/topdir/BUILD/hadoop-2.0.0-cdh4.4.0/src/hadoop-common-project/hadoop-common -r c0eba6cd38c984557e96a16ccd7356b7de835e79
Compiled by jenkins on Tue Sep  3 19:33:17 PDT 2013
From source with checksum ac7e170aa709b3ace13dc5f775487180
This command was run using /usr/lib/hadoop/hadoop-common-2.0.0-cdh4.4.0.jar
[cloudera@localhost ~]$ cat mysqooper.sh
#!/bin/bash
sqoop import -m 1 --connect jdbc:mysql://localhost/$1 \
--username root --table $2 --target-dir $3
[cloudera@localhost ~]$ ./mysqooper.sh cloud commodity /user/cloudera/commodity/csv/sqooped
14/01/16 16:45:10 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/01/16 16:45:10 INFO tool.CodeGenTool: Beginning code generation
14/01/16 16:45:11 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `commodity` AS t LIMIT 1
14/01/16 16:45:11 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `commodity` AS t LIMIT 1
14/01/16 16:45:11 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-0.20-mapreduce
14/01/16 16:45:11 INFO orm.CompilationManager: Found hadoop core jar at: /usr/lib/hadoop-0.20-mapreduce/hadoop-core.jar
Note: /tmp/sqoop-cloudera/compile/f75bf6f8829e8eff302db41b01f6796a/commodity.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/01/16 16:45:15 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/f75bf6f8829e8eff302db41b01f6796a/commodity.jar
14/01/16 16:45:15 WARN manager.MySQLManager: It looks like you are importing from mysql.
14/01/16 16:45:15 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
14/01/16 16:45:15 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
14/01/16 16:45:15 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
14/01/16 16:45:15 INFO mapreduce.ImportJobBase: Beginning import of commodity
14/01/16 16:45:17 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
14/01/16 16:45:20 INFO mapred.JobClient: Running job: job_201401161614_0001
14/01/16 16:45:21 INFO mapred.JobClient:  map 0% reduce 0%
14/01/16 16:45:38 INFO mapred.JobClient: Task Id : attempt_201401161614_0001_m_000000_0, Status : FAILED
Error: commodity : Unsupported major.minor version 51.0
14/01/16 16:45:46 INFO mapred.JobClient: Task Id : attempt_201401161614_0001_m_000000_1, Status : FAILED
Error: commodity : Unsupported major.minor version 51.0
14/01/16 16:45:54 INFO mapred.JobClient: Task Id : attempt_201401161614_0001_m_000000_2, Status : FAILED
Error: commodity : Unsupported major.minor version 51.0
14/01/16 16:46:07 INFO mapred.JobClient: Job complete: job_201401161614_0001
14/01/16 16:46:07 INFO mapred.JobClient: Counters: 6
14/01/16 16:46:07 INFO mapred.JobClient:   Job Counters 
14/01/16 16:46:07 INFO mapred.JobClient:     Failed map tasks=1
14/01/16 16:46:07 INFO mapred.JobClient:     Launched map tasks=4
14/01/16 16:46:07 INFO mapred.JobClient:     Total time spent by all maps in occupied slots (ms)=23048
14/01/16 16:46:07 INFO mapred.JobClient:     Total time spent by all reduces in occupied slots (ms)=0
14/01/16 16:46:07 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
14/01/16 16:46:07 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
14/01/16 16:46:07 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
14/01/16 16:46:07 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 51.0252 seconds (0 bytes/sec)
14/01/16 16:46:07 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
14/01/16 16:46:07 INFO mapreduce.ImportJobBase: Retrieved 0 records.
14/01/16 16:46:07 ERROR tool.ImportTool: Error during import: Import job failed!

I tried running with JDK 1.6 and it works, but I really don't want to switch back to that every time I need to use Sqoop.

Does anybody know what I need to change?


1 Reply


I believe the root cause of your problem is that your Hadoop distribution is still running on JDK 6, not JDK 7 as you believe.

The Sqoop process generates Java code that is compiled with the currently used JDK. So if you execute Sqoop on JDK 7, it will generate and compile the code with JDK 7. That generated code is then submitted to your Hadoop cluster as part of the MapReduce job. Therefore, if you are getting this unsupported major.minor exception while running Sqoop on JDK 7, it is very likely that your Hadoop cluster is running on JDK 6.
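One way to confirm that side of it (just a sketch; the jar path below is copied from your log, and the hash directory changes on every run, so substitute your own) is to check the class file version of the class Sqoop generated:

# Print the class file format version of the generated class.
# "major version: 51" means it was compiled for Java 7; 50 would be Java 6.
javap -classpath /tmp/sqoop-cloudera/compile/f75bf6f8829e8eff302db41b01f6796a/commodity.jar -verbose commodity | grep 'major version'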

I would strongly suggest calling jinfo on your Hadoop daemons to verify which JDK they are running on.
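For example, something along these lines (a rough sketch; on CDH the daemons run as their own service users, so sudo may be needed and the process names can differ):

# List running JVMs with their main classes to find the daemon PIDs
# (the TaskTracker is the one that forks the map task JVMs).
sudo jps -l

# For a given daemon PID (placeholder below), print its system properties
# and pull out the JDK it is actually running on.
sudo jinfo <pid> | grep -E 'java.home|java.runtime.version'

If the daemons do report a 1.6 java.home, pointing their JAVA_HOME at the 1.7 install and restarting them is the usual fix, though exactly where that is configured depends on the distribution.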

