Consider the following error:
2018-07-12 22:46:36,087 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.NoSuchMethodError: com.amazonaws.util.StringUtils.trim(Ljava/lang/String;)Ljava/lang/String;
at com.amazonaws.auth.profile.internal.AwsProfileNameLoader.getEnvProfileName(AwsProfileNameLoader.java:72)
at com.amazonaws.auth.profile.internal.AwsProfileNameLoader.loadProfileName(AwsProfileNameLoader.java:54)
at com.amazonaws.regions.AwsProfileRegionProvider.<init>(AwsProfileRegionProvider.java:40)
at com.amazonaws.regions.DefaultAwsRegionProviderChain.<init>(DefaultAwsRegionProviderChain.java:23)
at com.amazonaws.client.builder.AwsClientBuilder.<clinit>(AwsClientBuilder.java:57)
at com.myorg.udb.DecodeMapper.setup(myMapper.java:71)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:142)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:165)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1635)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:160)
and the following code:
package com.myorg.udb;

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.auth.profile.internal.AwsProfileNameLoader;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.util.StringUtils;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.*;
import java.util.*;

public class myMapper extends Mapper<Object, Text, Text, Text> {

    @Override
    protected void setup(Context context) {
        try {
            System.out.println(StringUtils.trim("hi"));
        } catch (Exception e) {
            System.out.println("catch" + e);
        }
    }

    public void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
    }
}
This line: System.out.println(StringUtils.trim("hi")); causes the java.lang.NoSuchMethodError: com.amazonaws.util.StringUtils.trim when I run in Qubole, but it runs fine on my local machine.
These are my POM imports:
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-s3</artifactId>
    <version>1.11.365</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-core</artifactId>
    <version>1.11.365</version>
</dependency>
with the Uber JAR plugin:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.1.1</version>
    <configuration>
        <!-- put your configurations here -->
    </configuration>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
        </execution>
    </executions>
</plugin>
Why can't Hadoop find com.amazonaws.util.StringUtils.trim, even though I import it in the file, declare it in the POM, and export all the dependencies into a fat JAR? What import do I need to make this work?

Here is my dependency tree:

[INFO] --- maven-dependency-plugin:2.8:tree (default-cli) @ udb-aggregate ---
[INFO] com.org.myproject:jar:0.2.12-SNAPSHOT
[INFO] +- com.amazonaws:aws-java-sdk-s3:jar:1.11.365:compile
[INFO] | +- com.amazonaws:aws-java-sdk-kms:jar:1.11.365:compile
[INFO] | +- com.amazonaws:aws-java-sdk-core:jar:1.11.365:compile
[INFO] | | +- software.amazon.ion:ion-java:jar:1.0.2:compile
[INFO] | | +- com.fasterxml.jackson.core:jackson-databind:jar:2.6.7.1:compile
[INFO] | | | +- com.fasterxml.jackson.core:jackson-annotations:jar:2.6.0:compile
[INFO] | | | \- com.fasterxml.jackson.core:jackson-core:jar:2.6.7:compile
[INFO] | | +- com.fasterxml.jackson.dataformat:jackson-dataformat-cbor:jar:2.6.7:compile
[INFO] | | \- joda-time:joda-time:jar:2.8.1:compile
[INFO] | \- com.amazonaws:jmespath-java:jar:1.11.365:compile
[INFO] +- org.apache.httpcomponents:httpclient:jar:4.5.2:runtime
[INFO] | +- org.apache.httpcomponents:httpcore:jar:4.4.4:compile
[INFO] | +- commons-logging:commons-logging:jar:1.2:compile
[INFO] | \- commons-codec:commons-codec:jar:1.9:compile
[INFO] +- com.googlecode.json-simple:json-simple:jar:1.1:compile
[INFO] +- org.apache.hadoop:hadoop-common:jar:2.8.4:compile
[INFO] | +- org.apache.hadoop:hadoop-annotations:jar:2.8.4:compile
[INFO] | | \- jdk.tools:jdk.tools:jar:1.8:system
[INFO] | +- com.google.guava:guava:jar:11.0.2:compile
[INFO] | +- commons-cli:commons-cli:jar:1.2:compile
[INFO] | +- org.apache.commons:commons-math3:jar:3.1.1:compile
[INFO] | +- xmlenc:xmlenc:jar:0.52:compile
[INFO] | +- commons-io:commons-io:jar:2.4:compile
[INFO] | +- commons-net:commons-net:jar:3.1:compile
[INFO] | +- commons-collections:commons-collections:jar:3.2.2:compile
[INFO] | +- javax.servlet:servlet-api:jar:2.5:compile
[INFO] | +- org.mortbay.jetty:jetty:jar:6.1.26:compile
[INFO] | +- org.mortbay.jetty:jetty-util:jar:6.1.26:compile
[INFO] | +- org.mortbay.jetty:jetty-sslengine:jar:6.1.26:compile
[INFO] | +- javax.servlet.jsp:jsp-api:jar:2.1:runtime
[INFO] | +- com.sun.jersey:jersey-core:jar:1.9:compile
[INFO] | +- com.sun.jersey:jersey-json:jar:1.9:compile
[INFO] | | +- org.codehaus.jettison:jettison:jar:1.1:compile
[INFO] | | +- com.sun.xml.bind:jaxb-impl:jar:2.2.3-1:compile
[INFO] | | | \- javax.xml.bind:jaxb-api:jar:2.2.2:compile
[INFO] | | | +- javax.xml.stream:stax-api:jar:1.0-2:compile
[INFO] | | | \- javax.activation:activation:jar:1.1:compile
[INFO] | | +- org.codehaus.jackson:jackson-jaxrs:jar:1.8.3:compile
[INFO] | | \- org.codehaus.jackson:jackson-xc:jar:1.8.3:compile
[INFO] | +- com.sun.jersey:jersey-server:jar:1.9:compile
[INFO] | | \- asm:asm:jar:3.1:compile
[INFO] | +- log4j:log4j:jar:1.2.17:compile
[INFO] | +- net.java.dev.jets3t:jets3t:jar:0.9.0:compile
[INFO] | | \- com.jamesmurty.utils:java-xmlbuilder:jar:0.4:compile
[INFO] | +- commons-lang:commons-lang:jar:2.6:compile
[INFO] | +- commons-configuration:commons-configuration:jar:1.6:compile
[INFO] | | +- commons-digester:commons-digester:jar:1.8:compile
[INFO] | | | \- commons-beanutils:commons-beanutils:jar:1.7.0:compile
[INFO] | | \- commons-beanutils:commons-beanutils-core:jar:1.8.0:compile
[INFO] | +- org.slf4j:slf4j-api:jar:1.7.10:compile
[INFO] | +- org.slf4j:slf4j-log4j12:jar:1.7.10:compile
[INFO] | +- org.codehaus.jackson:jackson-core-asl:jar:1.9.13:compile
[INFO] | +- org.codehaus.jackson:jackson-mapper-asl:jar:1.9.13:compile
[INFO] | +- org.apache.avro:avro:jar:1.7.4:compile
[INFO] | | +- com.thoughtworks.paranamer:paranamer:jar:2.3:compile
[INFO] | | \- org.xerial.snappy:snappy-java:jar:1.0.4.1:compile
[INFO] | +- com.google.protobuf:protobuf-java:jar:2.5.0:compile
[INFO] | +- com.google.code.gson:gson:jar:2.2.4:compile
[INFO] | +- org.apache.hadoop:hadoop-auth:jar:2.8.4:compile
[INFO] | | +- com.nimbusds:nimbus-jose-jwt:jar:4.41.1:compile
[INFO] | | | +- com.github.stephenc.jcip:jcip-annotations:jar:1.0-1:compile
[INFO] | | | \- net.minidev:json-smart:jar:2.3:compile (version selected from constraint [1.3.1,2.3])
[INFO] | | | \- net.minidev:accessors-smart:jar:1.2:compile
[INFO] | | | \- org.ow2.asm:asm:jar:5.0.4:compile
[INFO] | | +- org.apache.directory.server:apacheds-kerberos-codec:jar:2.0.0-M15:compile
[INFO] | | | +- org.apache.directory.server:apacheds-i18n:jar:2.0.0-M15:compile
[INFO] | | | +- org.apache.directory.api:api-asn1-api:jar:1.0.0-M20:compile
[INFO] | | | \- org.apache.directory.api:api-util:jar:1.0.0-M20:compile
[INFO] | | \- org.apache.curator:curator-framework:jar:2.7.1:compile
[INFO] | +- com.jcraft:jsch:jar:0.1.54:compile
[INFO] | +- org.apache.curator:curator-client:jar:2.7.1:compile
[INFO] | +- org.apache.curator:curator-recipes:jar:2.7.1:compile
[INFO] | +- com.google.code.findbugs:jsr305:jar:3.0.0:compile
[INFO] | +- org.apache.htrace:htrace-core4:jar:4.0.1-incubating:compile
[INFO] | +- org.apache.zookeeper:zookeeper:jar:3.4.6:compile
[INFO] | | \- io.netty:netty:jar:3.7.0.Final:compile
[INFO] | \- org.apache.commons:commons-compress:jar:1.4.1:compile
[INFO] | \- org.tukaani:xz:jar:1.0:compile
[INFO] +- org.apache.hadoop:hadoop-client:jar:2.8.4:compile
[INFO] | +- org.apache.hadoop:hadoop-hdfs-client:jar:2.8.4:compile
[INFO] | | \- com.squareup.okhttp:okhttp:jar:2.4.0:compile
[INFO] | | \- com.squareup.okio:okio:jar:1.4.0:compile
[INFO] | +- org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.8.4:compile
[INFO] | | +- org.apache.hadoop:hadoop-mapreduce-client-common:jar:2.8.4:compile
[INFO] | | | +- org.apache.hadoop:hadoop-yarn-client:jar:2.8.4:compile
[INFO] | | | \- org.apache.hadoop:hadoop-yarn-server-common:jar:2.8.4:compile
[INFO] | | \- org.apache.hadoop:hadoop-mapreduce-client-shuffle:jar:2.8.4:compile
[INFO] | | \- org.fusesource.leveldbjni:leveldbjni-all:jar:1.8:compile
[INFO] | +- org.apache.hadoop:hadoop-yarn-api:jar:2.8.4:compile
[INFO] | +- org.apache.hadoop:hadoop-mapreduce-client-core:jar:2.8.4:compile
[INFO] | | \- org.apache.hadoop:hadoop-yarn-common:jar:2.8.4:compile
[INFO] | | \- com.sun.jersey:jersey-client:jar:1.9:compile
[INFO] | \- org.apache.hadoop:hadoop-mapreduce-client-jobclient:jar:2.8.4:compile
[INFO] \- junit:junit:jar:4.12:test
[INFO] \- org.hamcrest:hamcrest-core:jar:1.3:test
Answer 0 (score: 2)
This is most likely a difference between the class on your local machine and the one on the remote. The classpath may already contain an older version of the JAR you provided, and that older copy gets loaded first. See this answer for how to find which JAR a class file was loaded from on the remote:
Class<?> klass = StringUtils.class;
URL location = klass.getResource('/' + klass.getName().replace('.', '/') + ".class");
Hopefully location contains the JAR's version number, so you can compare the remote version against your local one.
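Expanded into a self-contained sketch (the WhichJar class name and helper are mine, not from the thread; on the cluster you would pass StringUtils.class instead of the stand-in used here):

```java
import java.net.URL;

public class WhichJar {
    /** Builds the classpath resource name for a fully-qualified class name. */
    static String resourcePath(String className) {
        return '/' + className.replace('.', '/') + ".class";
    }

    public static void main(String[] args) {
        // Object.class is only a stand-in so this compiles anywhere;
        // in the mapper's setup() use com.amazonaws.util.StringUtils.class.
        Class<?> klass = Object.class;
        URL location = klass.getResource(resourcePath(klass.getName()));
        // The URL usually embeds the path of the containing JAR
        // (and therefore its version number).
        System.out.println(location);
    }
}
```

Logging that URL from setup() tells you which copy of the AWS SDK actually won on the task's classpath.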
Since you are already shading, you can work around this by relocating the third-party classes into your own com.myorg package as you repackage, so that com.amazonaws.util.StringUtils becomes com.myorg.com.amazonaws.util.StringUtils. See Using Package Relocation in the maven-shade-plugin. I wouldn't do this if the dependencies were numerous or large, but that's up to you.
Answer 1 (score: 0)
I hit an error like this when running spark-submit. I realized the code had been built against the Spark 3.0.0 / Hadoop 3.2 libraries while my local Spark version was 2.4.7. I upgraded to Spark 3 with Hadoop 3.2 and copied aws-java-sdk-1.11.828.jar (the SDK my code was built with) into the Spark jars directory, and it worked!