I'm trying to write a UDF that accepts a STRUCT as a parameter and returns a struct after making some modifications to the input (updating an existing field and adding a new one), as in the following code:
@UdfDescription(name = "myCustomUdf")
public class MyCustomUdf {
@Udf(description = "do stuff")
public Struct MyCustomUdf(@UdfParameter(schema = "struct <NAME VARCHAR, EMAIL VARCHAR>", value = "user") final Struct struct) {
String processedEmail = struct.getString("EMAIL").toUpperCase();
struct.put("EMAIL", processedEmail);
return struct;
}
}
Is it even possible for a UDF to return a STRUCT? I can't find anything in the documentation that forbids it, so perhaps I'm missing something, but when I deploy my custom jar I see the following exception in the KSQL log:
io.confluent.ksql.util.KsqlException: Could not load UDF method with signature: public org.apache.kafka.connect.data.Struct com.myudf.MyCustomUdf.MyCustomUdf(org.apache.kafka.connect.data.Struct)
at io.confluent.ksql.function.UdfLoader.getReturnType(UdfLoader.java:373)
at io.confluent.ksql.function.UdfLoader.addFunction(UdfLoader.java:280)
at io.confluent.ksql.function.UdfLoader.lambda$handleUdfAnnotation$8(UdfLoader.java:217)
at io.github.lukehutch.fastclasspathscanner.scanner.ScanSpec$9.lookForMatches(ScanSpec.java:1390)
at io.github.lukehutch.fastclasspathscanner.scanner.ScanSpec.callMatchProcessors(ScanSpec.java:696)
at io.github.lukehutch.fastclasspathscanner.FastClasspathScanner.scan(FastClasspathScanner.java:1606)
at io.github.lukehutch.fastclasspathscanner.FastClasspathScanner.scan(FastClasspathScanner.java:1678)
at io.github.lukehutch.fastclasspathscanner.FastClasspathScanner.scan(FastClasspathScanner.java:1704)
at io.confluent.ksql.function.UdfLoader.loadUdfs(UdfLoader.java:145)
at io.confluent.ksql.function.UdfLoader.lambda$load$2(UdfLoader.java:115)
at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
at java.util.Iterator.forEachRemaining(Iterator.java:116)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
at io.confluent.ksql.function.UdfLoader.load(UdfLoader.java:115)
at io.confluent.ksql.rest.server.KsqlRestApplication.buildApplication(KsqlRestApplication.java:473)
at io.confluent.ksql.rest.server.KsqlRestApplication.buildApplication(KsqlRestApplication.java:441)
at io.confluent.ksql.rest.server.KsqlServerMain.createExecutable(KsqlServerMain.java:94)
at io.confluent.ksql.rest.server.KsqlServerMain.main(KsqlServerMain.java:59)
Caused by: io.confluent.ksql.util.KsqlException: Type inference is not supported for: class org.apache.kafka.connect.data.Struct
at io.confluent.ksql.util.SchemaUtil.handleParametrizedType(SchemaUtil.java:378)
at io.confluent.ksql.util.SchemaUtil.lambda$getSchemaFromType$5(SchemaUtil.java:158)
at io.confluent.ksql.util.SchemaUtil.getSchemaFromType(SchemaUtil.java:158)
at io.confluent.ksql.util.SchemaUtil.getSchemaFromType(SchemaUtil.java:153)
at io.confluent.ksql.function.UdfLoader.getReturnType(UdfLoader.java:366)
... 26 more
Answer:
KSQL needs a schema not only for the input parameters but also for the output. In your case, make sure you specify the following in the @Udf annotation:
@Udf(description = "do stuff", schema="STRUCT<name VARCHAR, email VARCHAR>")
public Struct MyCustomUdf(@UdfParameter(schema = "struct <NAME VARCHAR, EMAIL VARCHAR>", value = "user") final Struct struct) {
...
}
It looks like this is indeed missing from our documentation! I've filed an issue to make sure it gets addressed (https://github.com/confluentinc/ksql/issues/3699).
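As a side note, the question also mentions adding a new field to the struct. In that case the input Struct cannot simply be mutated and returned, because its schema does not include the new field; a new Struct with a matching output schema has to be built. Below is a minimal, untested sketch of one way to do this using the Kafka Connect SchemaBuilder API; the EMAIL_DOMAIN field and the exact field names are just illustrative assumptions, not part of the original question:

import io.confluent.ksql.function.udf.Udf;
import io.confluent.ksql.function.udf.UdfDescription;
import io.confluent.ksql.function.udf.UdfParameter;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;

@UdfDescription(name = "myCustomUdf", description = "do stuff")
public class MyCustomUdf {

    // Connect schema for the returned struct; fields are optional, which is
    // what nullable KSQL columns typically map to.
    private static final Schema OUTPUT_SCHEMA = SchemaBuilder.struct()
        .field("NAME", Schema.OPTIONAL_STRING_SCHEMA)
        .field("EMAIL", Schema.OPTIONAL_STRING_SCHEMA)
        .field("EMAIL_DOMAIN", Schema.OPTIONAL_STRING_SCHEMA)  // hypothetical new field
        .optional()
        .build();

    // The schema attribute declares the shape of the returned STRUCT;
    // without it, KSQL fails with the type-inference exception shown above.
    @Udf(description = "do stuff",
         schema = "STRUCT<NAME VARCHAR, EMAIL VARCHAR, EMAIL_DOMAIN VARCHAR>")
    public Struct MyCustomUdf(
            @UdfParameter(schema = "STRUCT<NAME VARCHAR, EMAIL VARCHAR>", value = "user") final Struct struct) {
        final String processedEmail = struct.getString("EMAIL").toUpperCase();
        // Build a new Struct against the output schema instead of mutating the input.
        return new Struct(OUTPUT_SCHEMA)
            .put("NAME", struct.getString("NAME"))
            .put("EMAIL", processedEmail)
            .put("EMAIL_DOMAIN", processedEmail.substring(processedEmail.indexOf('@') + 1));
    }
}

The schema string declared in the @Udf annotation and the Connect schema built in Java need to describe the same shape, otherwise the returned value will not match what KSQL expects.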