Sqoop not working

Date: 2016-01-09 02:22:49

Tags: mysql hadoop sqoop

I am new to the Hadoop ecosystem. My intention is to transfer data from MySQL to HDFS using Sqoop. I configured the various tools correctly (I think), but I run into a problem when running a job in Sqoop.

My environment:

  • OS:Kubuntu 15.10 wily 64bit x86_64
  • Hadoop 2.7.1(稳定)
  • MySQL 5.6
  • Sqoop 1.99.6(Sqoop2)

Following these two links, Sqoop 5 Minutes and sqoop2-activity-finally, I created two links as follows:

sqoop:000> show connector    
+----+------------------------+---------+-------------------------------------------
| Id |          Name          | Version |                        Class              
+----+------------------------+---------+-------------------------------------------
| 1  | generic-jdbc-connector | 1.99.6  | org.apache.sqoop.connector.jdbc.GenericJdb
| 2  | kite-connector         | 1.99.6  | org.apache.sqoop.connector.kite.KiteConnec
| 3  | hdfs-connector         | 1.99.6  | org.apache.sqoop.connector.hdfs.HdfsConnec
| 4  | kafka-connector        | 1.99.6  | org.apache.sqoop.connector.kafka.KafkaConn
+----+------------------------+---------+-------------------------------------------
sqoop:000>  create link -c 1  
Creating link for connector with id 1
Please fill following values to create new link object
Name: mysqlink

Link configuration

JDBC Driver Class: com.mysql.jdbc.Driver
JDBC Connection String: jdbc:mysql://localhost:3306/sqooptest
Username: squser
Password: *****
JDBC Connection Properties: 
There are currently 0 values in the map:
entry# protocol=tcp
There are currently 1 values in the map:
protocol = tcp
entry# 
New link was successfully created with validation status OK and persistent id 2
sqoop:000>

sqoop:000>  create link -c 3
Creating link for connector with id 3
Please fill following values to create new link object
Name: hdfslink

Link configuration

HDFS URI: hdfs://localhost:9000/
Hadoop conf directory: /usr/local/hadoop/etc/hadoop 
New link was successfully created with validation status OK and persistent id 3
sqoop:000> 

sqoop:000> show link --all
2 link(s) to show: 
link with id 2 and name mysqlink (Enabled: true, Created by hduser at 07/01/16 11.52, Updated by hduser at 07/01/16 11.52)
Using Connector generic-jdbc-connector with id 1
  Link configuration
    JDBC Driver Class: com.mysql.jdbc.Driver
    JDBC Connection String: jdbc:mysql://localhost:3306/sqooptest
    Username: squser
    Password: 
    JDBC Connection Properties: 
      protocol = tcp
link with id 3 and name hdfslink (Enabled: true, Created by hduser at 07/01/16 11.57, Updated by hduser at 07/01/16 11.57)
Using Connector hdfs-connector with id 3
  Link configuration
    HDFS URI: hdfs://localhost:9000/
    Hadoop conf directory: /usr/local/hadoop/etc/hadoop
sqoop:000> show link                 
+----+----------+--------------+------------------------+---------+
| Id |   Name   | Connector Id |     Connector Name     | Enabled |
+----+----------+--------------+------------------------+---------+
| 2  | mysqlink | 1            | generic-jdbc-connector | true    |
| 3  | hdfslink | 3            | hdfs-connector         | true    |
+----+----------+--------------+------------------------+---------+
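The links above were created interactively, but the Sqoop2 client can also replay commands from a script file (per the Sqoop 1.99.x client docs the shell accepts a script path as its argument). A minimal sketch, assuming a hypothetical file name `checks.sqoop` and the default server port, useful for re-verifying connectivity before creating links:

```
set server --host localhost --port 12000 --webapp sqoop
show version --all
show connector
```

Run it with something like `sqoop2-shell /path/to/checks.sqoop`; if `show version --all` reports both client and server versions, the client is actually talking to the server.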

...... Later, I created a job:

sqoop:000> create job --from 2 --to 3
Creating job for links with from id 2 and to id 3
Please fill following values to create new job object
Name: My2Dfs

From database configuration

Schema name: sqooptest
Table name: person
Table SQL statement: 
Table column names: 
Partition column name: id
Null value allowed for the partition column: 
Boundary query: 

Incremental read

Check column: 
Last value: 

To HDFS configuration

Override null value: 
Null value: 
Output format: 
  0 : TEXT_FILE
  1 : SEQUENCE_FILE
Choose: 0
Compression format: 
  0 : NONE
  1 : DEFAULT
  2 : DEFLATE
  3 : GZIP
  4 : BZIP2
  5 : LZO
  6 : LZ4
  7 : SNAPPY
  8 : CUSTOM
Choose: 0
Custom compression format: 
Output directory: /usr/local/sqoop/prog
Append mode: 

Throttling resources

Extractors: 
Loaders: 
New job was successfully created with validation status OK  and persistent id 1

At this point, when I try to start the job, I get the following error (Tomcat 6):

sqoop:000> start job -j 1 -s
Exception has occurred during processing command 
Exception: org.apache.sqoop.common.SqoopException Message: CLIENT_0001:Server has returned exception - <html><head><title>Apache Tomcat/6.0.37 - Error report</title><body><h1>HTTP Status 500 - Servlet execution threw an exception</h1><HR size="1" noshade="noshade"><p><b>type</b> Exception report</p><p><b>message</b> <u>Servlet execution threw an exception</u></p><p><b>description</b> <u>The server encountered an internal error that prevented it from fulfilling this request.</u></p><p><b>exception</b> <pre>javax.servlet.ServletException: Servlet execution threw an exception
        org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:595)
        org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:291)
        org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:554)
</pre></p><p><b>root cause</b> <pre>java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/util/Apps
        java.lang.ClassLoader.defineClass1(Native Method)
        java.lang.ClassLoader.defineClass(ClassLoader.java:800)
        [...  
  org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:554)
</pre></p><p><b>note</b> <u>The full stack trace of the root cause is available in the Apache Tomcat/6.0.37 logs.</u></p><HR size="1" noshade="noshade"><h3>Apache Tomcat/6.0.37</h3></body></html>

sqoop:000> 

Why is Sqoop using Tomcat 6? I already have Tomcat 7. I tried adding the CATALINA_HOME environment variable in /home/hduser/.bashrc, but I still get other errors. What is my problem? How can I fix it?

EDIT

In /usr/local/sqoop/server/logs/catalina.2016-01-09, this is the error:

INFORMAZIONI: The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: /usr/local/hadoop/lib/native:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
gen 09, 2016 3:05:26 AM org.apache.coyote.http11.Http11Protocol init
GRAVE: Error initializing endpoint
java.net.BindException: Indirizzo già in uso <null>:12000
    at org.apache.tomcat.util.net.JIoEndpoint.init(JIoEndpoint.java:549)

I solved this by changing the value of the SQOOP_ADMIN_PORT environment variable in sqoop/bin/sqoop-sys.sh. Then I pointed common.loader at the paths of the other Hadoop libraries. However, when I start the sqoop2 server, I get this error in sqoop/server/logs/catalina.out:
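The port change can be sketched like this (the surrounding structure of sqoop-sys.sh and the port value 12001 are my assumptions, not values from the original post; any free port works):

```shell
# Sketch of the relevant stanza in /usr/local/sqoop/bin/sqoop-sys.sh.
# 12001 is an example free port, not a value from the original post;
# the default admin port 12000 was already taken on this machine.
if [ -z "${SQOOP_ADMIN_PORT}" ]; then
  SQOOP_ADMIN_PORT=12001
  export SQOOP_ADMIN_PORT
fi
```

The guard means an already-exported SQOOP_ADMIN_PORT in the environment still wins over the file's default.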

log4j:ERROR A "org.apache.log4j.ConsoleAppender" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by 
log4j:ERROR [org.apache.catalina.loader.StandardClassLoader@4605a23b] whereas object of type 
log4j:ERROR "org.apache.log4j.ConsoleAppender" was loaded by [WebappClassLoader
  context: /sqoop
  delegate: false
  repositories:
    /WEB-INF/classes/
----------> Parent Classloader:
org.apache.catalina.loader.StandardClassLoader@4605a23b
].
log4j:ERROR Could not instantiate appender named "stdout".

1 answer:

Answer 0 (score: 0)

The Tomcat exception:

Exception has occurred during processing command 
Exception: org.apache.sqoop.common.SqoopException Message: CLIENT_0001:Server has returned exception - <html><head><title>Apache Tomcat/6.0.37 - Error report</title><body><h1>HTTP Status 500 - Servlet execution threw an exception</h1><HR size="1" noshade="noshade"><p><b>type</b> Exception report</p><p><b>message</b> <u>Servlet execution threw an exception</u></p><p><b>description</b> <u>The server encountered an internal error that prevented it from fulfilling this request.</u></p><p><b>exception</b> <pre>javax.servlet.ServletException: Servlet execution threw an exception
        org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:595)
        org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:291)
        org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:554)
</pre></p><p><b>root cause</b> <pre>java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/util/Apps
        java.lang.ClassLoader.defineClass1(Native Method)
        java.lang.ClassLoader.defineClass(ClassLoader.java:800)

is because the YARN libraries are not loaded by common.loader. Just add the path /usr/local/hadoop/share/hadoop/yarn/*.jar to common.loader in /usr/local/sqoop/server/conf/catalina.properties.
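A sketch of the resulting line in catalina.properties (the entries before the yarn one are Tomcat's stock value plus the Hadoop paths typically added for Sqoop2; treat the exact list as an assumption, only the yarn path comes from this answer):

```properties
# /usr/local/sqoop/server/conf/catalina.properties
# common.loader is a single comma-separated line; wrapped here with '\'
common.loader=${catalina.base}/lib,${catalina.base}/lib/*.jar,\
${catalina.home}/lib,${catalina.home}/lib/*.jar,\
/usr/local/hadoop/share/hadoop/common/*.jar,\
/usr/local/hadoop/share/hadoop/common/lib/*.jar,\
/usr/local/hadoop/share/hadoop/hdfs/*.jar,\
/usr/local/hadoop/share/hadoop/mapreduce/*.jar,\
/usr/local/hadoop/share/hadoop/yarn/*.jar
```

After editing, restart the Sqoop2 server so the embedded Tomcat rereads the file; org/apache/hadoop/yarn/util/Apps lives in the yarn jars, so the NoClassDefFoundError should disappear.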

The Log4j error,

log4j:ERROR A "org.apache.log4j.ConsoleAppender" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by 
log4j:ERROR [org.apache.catalina.loader.StandardClassLoader@4605a23b] whereas object of type 
log4j:ERROR "org.apache.log4j.ConsoleAppender" was loaded by [WebappClassLoader
  context: /sqoop
  delegate: false
  repositories:
    /WEB-INF/classes/
----------> Parent Classloader:
org.apache.catalina.loader.StandardClassLoader@4605a23b
].
log4j:ERROR Could not instantiate appender named "stdout".

this one, instead, I think is a problem caused by log4j.jar being present more than once, even though it is loaded only once through common.loader. But I consider it a minor issue; in fact, after waiting a few seconds, you can see the following message in catalina.out:

gen 13, 2016 12:02:17 PM org.apache.catalina.startup.Catalina start
INFORMAZIONI: Server startup in 57151 ms