I am trying to read Parquet files from a remote HDFS file system using Java, using the parquet-hadoop library for this.
Here is what my code looks like:
public Map run( Map inputs )
{
    ...
    // Point the Hadoop configuration at the remote HDFS cluster
    final Configuration conf = new Configuration();
    conf.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
    conf.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
    conf.set("fs.defaultFS", "hdfs://" + connHostName + ":" + connPort);
    conf.set("ipc.client.connect.timeout", "10000");
    conf.set("ipc.client.connect.max.retries.on.timeouts", "3");
    System.setProperty("hadoop.home.dir", "/");

    // Read the Parquet footer and extract the file schema
    Path path = new Path(filePath);
    ParquetMetadata readFooter = ParquetFileReader.readFooter(conf, path, ParquetMetadataConverter.NO_FILTER);
    MessageType schema = readFooter.getFileMetaData().getSchema();
    ...
}
Here are the Maven dependencies I am using:
<dependency>
    <groupId>org.apache.parquet</groupId>
    <artifactId>parquet-hadoop</artifactId>
    <version>1.9.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>3.1.0</version>
</dependency>
I have also tried adding two more dependencies, namely hadoop-core and hadoop-hdfs.
The Parquet reader code works fine when I run it directly; the problem appears when I run it via reflection.
I build a fat jar from this project and hand the class name together with the jar to another program, which loads and runs it using reflection.
The reflection code is as follows:
// Load the fat jar with its own class loader, parented to the system class loader
String packageName = "com.mycompany.hdfs.parquet.Parquet";
String jarPath = "/Users/.../hdfs-parquet-reader/target/hdfs-parquet-reader-0.0.1-jar-with-dependencies.jar";
ClassLoader child = new URLClassLoader(new URL[] { new URL("file://" + jarPath) }, ClassLoader.getSystemClassLoader());
Class classToLoad = Class.forName(packageName, true, child);

// Deserialize the input parameters and invoke run(Map) reflectively
String inputParamsString = "{}";
Object obj = classToLoad.newInstance();
Type type = new TypeToken<Map<String, Object>>() {}.getType();
Map<String, Object> inputs = gson.fromJson(inputParamsString, type);
Method runMethod = obj.getClass().getDeclaredMethod("run", Map.class);
Object result = runMethod.invoke(obj, inputs);
When I run the code above, the line
ParquetMetadata readFooter = ParquetFileReader.readFooter(conf, path, ParquetMetadataConverter.NO_FILTER);
fails because the DistributedFileSystem class cannot be found.
I built a fat jar and verified that it contains the class org.apache.hadoop.hdfs.DistributedFileSystem.
I have also verified that running java -cp jarname.jar className works as expected.
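One way to narrow this down further (a minimal diagnostic sketch, with a placeholder jar path, not something from the original project) is to check which class loaders can actually resolve the HDFS implementation class:

public class ClassLoaderCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder path; point this at the actual fat jar
        String jarPath = "/path/to/hdfs-parquet-reader-0.0.1-jar-with-dependencies.jar";
        ClassLoader child = new java.net.URLClassLoader(
                new java.net.URL[] { new java.net.URL("file://" + jarPath) },
                ClassLoader.getSystemClassLoader());

        // Try to resolve the HDFS FileSystem implementation from each loader in play
        for (ClassLoader cl : new ClassLoader[] { child,
                ClassLoader.getSystemClassLoader(),
                Thread.currentThread().getContextClassLoader() }) {
            try {
                Class.forName("org.apache.hadoop.hdfs.DistributedFileSystem", false, cl);
                System.out.println(cl + " -> found");
            } catch (ClassNotFoundException e) {
                System.out.println(cl + " -> NOT found");
            }
        }
    }
}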
Below is my pom file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xmlns="http://maven.apache.org/POM/4.0.0"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>aaa</artifactId>
        <groupId>bbb</groupId>
        <version>0.0.1</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <packaging>jar</packaging>
    <artifactId>hdfs-parquet-reader</artifactId>

    <dependencies>
        <dependency>
            <groupId>org.apache.parquet</groupId>
            <artifactId>parquet-hadoop</artifactId>
            <version>1.9.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>3.1.0</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>8</source>
                    <target>8</target>
                </configuration>
            </plugin>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <configuration>
                    <archive>
                    </archive>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                    <!--<finalName>${project.artifactId}-${project.version}-${maven.build.timestamp}</finalName> -->
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id> <!-- this is used for inheritance merges -->
                        <phase>package</phase> <!-- bind to the packaging phase -->
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
I have also tried the maven-shade-plugin and built a shaded jar, but the problem remains the same.
I have heard that the hadoop-common library loads classes via Thread.currentThread().getContextClassLoader(), which seems to be related to the problem.
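If that is the case, one idea would be to point the thread context class loader at the jar's class loader around the reflective call. A minimal sketch of that idea, assuming the child, obj, runMethod, and inputs variables from the reflection code above (I have not confirmed that this is the fix):

// Temporarily make the fat jar's class loader the context class loader,
// so Hadoop's Configuration can resolve fs.hdfs.impl from the jar.
ClassLoader previous = Thread.currentThread().getContextClassLoader();
Thread.currentThread().setContextClassLoader(child);
try {
    Object result = runMethod.invoke(obj, inputs);
} finally {
    // Always restore the original context class loader
    Thread.currentThread().setContextClassLoader(previous);
}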
Please help me resolve this issue.
Thanks.
Answer 0 (score: 0)
I found the solution to this problem myself.
The problem was here:
ClassLoader child = new URLClassLoader(new URL[] { new URL("file://" + jarPath)}, ClassLoader.getSystemClassLoader());
Here I was passing the system class loader as the parent, which effectively dropped the dependency libraries bundled in the fat jar from the final classpath.
I changed it to:
ClassLoader child = new URLClassLoader(new URL[] { new URL("file://" + jarPath)}, Thread.currentThread().getContextClassLoader());
This works for me.
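For completeness, an alternative that may achieve the same effect is to hand a class loader to the Hadoop Configuration explicitly inside run(), since Configuration exposes setClassLoader. A rough sketch of that idea, assuming run() executes on a class loaded from the fat jar and reusing connHostName/connPort from the question (not verified against this exact setup):

// Inside run(): make the Configuration resolve fs.hdfs.impl and friends
// from the class loader that loaded this class (i.e. the fat jar's loader).
final Configuration conf = new Configuration();
conf.setClassLoader(getClass().getClassLoader());
conf.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
conf.set("fs.defaultFS", "hdfs://" + connHostName + ":" + connPort);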