Polymorphic Jackson deserialization to POJOs across multiple Maven projects

Posted: 2016-06-16 05:50:33

Tags: java maven jackson

I have a POJO hierarchy like this:

@JsonSubTypes({
  @Type(value = FileShareConnection.class, name = "FileShareConnection"),
  @Type(value = HadoopConnection.class, name = "HadoopConnection")
})
public abstract class Connection { /* ... */ }

public class FileShareConnection extends Connection { /* ... */ }
public class HadoopConnection extends Connection { /* ... */ }

I want to deserialize some JSON with Jackson. The problem I'm facing is that these classes live in different Maven projects (FileShareConnection is in the FileShare Maven project, Connection is in the API Maven project).

Because of this I have a circular dependency between the Maven projects: the abstract class needs to know about the subtypes, and the subtypes need to know about the abstract class.

Any idea how to solve this?

1 answer:

Answer 0 (score: 0)

To avoid the compile-time dependency, you can use ObjectMapper#registerSubtypes to register the subtype information at runtime.

Example:

import static org.hamcrest.CoreMatchers.instanceOf;
import static org.hamcrest.MatcherAssert.assertThat;

import java.io.StringReader;
import java.io.StringWriter;

import org.junit.Test;

import com.fasterxml.jackson.annotation.JsonTypeInfo;
import com.fasterxml.jackson.annotation.JsonTypeName;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JacksonTest2 {

    // Assuming this is in the base Maven module
    @JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.PROPERTY, property = "_type")
    public static abstract class Connection {
    }

    // Assuming this is in a different Maven module
    @JsonTypeName("FileShareConnection")
    public static class FileShareConnection extends Connection {
    }

    // Assuming this is in a different Maven module
    @JsonTypeName("HadoopConnection")
    public static class HadoopConnection extends Connection {
    }


    // Assuming both modules are available here,
    // or you need to load the subtype classes via reflection (or some library).
    @Test
    public void testUseCustomPolymorphicTypeNameInSerializationOption2() throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        mapper.registerSubtypes(FileShareConnection.class, HadoopConnection.class);

        Connection connection = new HadoopConnection();

        StringWriter sw = new StringWriter();

        mapper.writeValue(sw, connection);

        Connection value = mapper.readValue(new StringReader(sw.toString()), Connection.class);

        assertThat(value, instanceOf(HadoopConnection.class));
    }
}
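
The comment in the test above hints that the subtype classes may not even be visible at compile time. As a sketch of that variation (not part of the answer's own code), each subtype module could advertise its classes through java.util.ServiceLoader, and the base module could register whatever it finds on the classpath. The ConnectionSubtypeProvider interface and ConnectionMapperFactory class below are hypothetical names, not taken from the question's projects:

import java.util.ServiceLoader;

import com.fasterxml.jackson.databind.ObjectMapper;

// Hypothetical SPI living in the API module; each subtype module implements it
// and lists its implementation in a META-INF/services file.
interface ConnectionSubtypeProvider {
    Class<?>[] subtypes();
}

public class ConnectionMapperFactory {

    public static ObjectMapper createMapper() {
        ObjectMapper mapper = new ObjectMapper();
        // Discover every provider on the classpath and register its subtypes,
        // so the API module never references FileShareConnection or
        // HadoopConnection at compile time.
        for (ConnectionSubtypeProvider provider : ServiceLoader.load(ConnectionSubtypeProvider.class)) {
            mapper.registerSubtypes(provider.subtypes());
        }
        return mapper;
    }
}

With this arrangement, adding a new Connection implementation only means putting its module on the classpath; the API module needs no change. If a subtype class cannot carry @JsonTypeName, Jackson also offers a registerSubtypes overload that takes com.fasterxml.jackson.databind.jsontype.NamedType, so the logical name can be supplied at registration time instead.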