Error when running Sqoop on an EC2 instance

Date: 2018-05-22 02:56:13

Tags: hadoop amazon-ec2 sqoop

I installed Sqoop on my EC2 instance by following http://kontext.tech/docs/DataAndBusinessIntelligence/p/configure-sqoop-in-a-edge-node-of-hadoop-cluster, and my Hadoop cluster itself is running fine.

I first hit the error Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster, which I resolved with the fix described at that link. Unfortunately, running a Sqoop import now fails with a different error:

Container exited with a non-zero exit code 1. Error file: prelaunch.err. Last 4096 bytes of prelaunch.err : Last 4096 bytes of stderr : Error: Could not find or load main class org.apache.hadoop.mapred.YarnChild

Please suggest how I can get past this error.
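For context, the earlier MRAppMaster error and this YarnChild error are usually the same underlying problem: the YARN containers (the application master in the first case, the map/reduce task JVMs in the second) cannot find the MapReduce jars on their classpath. Below is a minimal sketch of the mapred-site.xml properties that typically cover both cases; the path is hypothetical and assumes Hadoop 3.1.0 is installed at /home/ec2-user/hadoop-3.1.0 on every node, so substitute the real install directory.

<!-- mapred-site.xml (sketch): point the AM and the task containers at the MapReduce jars -->
<!-- /home/ec2-user/hadoop-3.1.0 is an assumed path; use the actual Hadoop install directory -->
<property>
  <name>yarn.app.mapreduce.am.env</name>
  <value>HADOOP_MAPRED_HOME=/home/ec2-user/hadoop-3.1.0</value>
</property>
<property>
  <name>mapreduce.map.env</name>
  <value>HADOOP_MAPRED_HOME=/home/ec2-user/hadoop-3.1.0</value>
</property>
<property>
  <name>mapreduce.reduce.env</name>
  <value>HADOOP_MAPRED_HOME=/home/ec2-user/hadoop-3.1.0</value>
</property>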

This is what my sqoop-env.template.sh looks like:

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#     http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# included in all the hadoop scripts with source command
# should not be executable directly
# also should not be passed any arguments, since we need original $*
# Set Hadoop-specific environment variables here.
#Set path to where bin/hadoop is available
#export HADOOP_COMMON_HOME=$HOME/hadoop-3.1.0
#Set path to where hadoop-*-core.jar is available
#export HADOOP_MAPRED_HOME=$HOME/hadoop-3.1.0
#set the path to where bin/hbase is available
#export HBASE_HOME=
#Set the path to where bin/hive is available
#export HIVE_HOME=
#Set the path for where zookeper config dir is
#export ZOOCFGDIR=
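Note that every export in the template above is still commented out. Sqoop sources conf/sqoop-env.sh rather than the template, so if only the template exists, the Hadoop entries would need to be copied into a sqoop-env.sh and uncommented. A minimal sketch, assuming the same $HOME/hadoop-3.1.0 location already referenced in the template's comments:

# conf/sqoop-env.sh -- sketch only; paths assume Hadoop 3.1.0 lives under $HOME/hadoop-3.1.0
export HADOOP_COMMON_HOME=$HOME/hadoop-3.1.0
export HADOOP_MAPRED_HOME=$HOME/hadoop-3.1.0
# HBASE_HOME, HIVE_HOME and ZOOCFGDIR can stay unset if HBase, Hive and ZooKeeper are not in use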

0 Answers:

No answers yet.