Kafka Streams Avro join on key

Date: 2018-08-09 15:03:38

Tags: java join apache-kafka avro

I am fairly new to Kafka and am having some trouble with Kafka Streams and Avro. I have two streams, called "Observations" and "FeatureOfInterest". The Observation values contain the key of a FeatureOfInterest. I want to merge them into one output topic, joining on Observation.value.get("Location") = FeatureOfInterest.key() [the Kafka message key], although the join could also be done via Observation.value.get("Location") = FeatureOfInterest.value.get("key").

The output stream should contain all of the Observation data plus FeatureOfInterest.get("fields").

For the FeatureOfInterest part, the output stream should keep the same Avro schema, i.e. the "fields" JSON.

Both Observation and FeatureOfInterest are defined by Avro schemas:

{
  "type": "record",
  "name": "Observation",
  "namespace": "main.java.pw.oliver.jmkb.avroclasses",
  "doc": "An Observation represents a single Sensor reading of an ObservedProperty. A Sensor sends Observations to a specified Datastream.",
  "fields": [
    {
      "name": "iotId",
      "type": "string",
      "doc": "iotId of this Observation"
    },
    {
      "name": "phenomenonTime",
      "type": "string",
      "doc": "Time (ISO 8601) of the phenomenon"
    },
    {
      "name": "resultTime",
      "type": "string",
      "doc": "Time (ISO 8601) of the result"
    },
    {
      "name": "result",
      "type": "string",
      "doc": "Any result value represented as a string"
    },
    {
      "name": "resultQuality",
      "type": [
        "null",
        "string"
      ],
      "doc": "Optional string describing the quality of the result"
    },
    {
      "name": "validTime",
      "type": [
        "null",
        "string"
      ],
      "doc": "Optional time (ISO 8601) of validity"
    },
    {
      "name": "Datastream",
      "type": "string",
      "doc": "Datastream associated with the Observation"
    },
    {
      "name": "FeatureOfInterest",
      "type": [
        "null",
        "string"
      ],
      "doc": "Optional FeatureOfInterest associated with the Observation"
    }
  ]
}

FeatureOfInterest

{
  "type": "record",
  "name": "FeatureOfInterest",
  "namespace": "main.java.pw.oliver.jmkb.avroclasses",
  "doc": "In the case of remote sensing, the FeatureOfInterest can be the geographical area or volume that is being sensed.",
  "fields": [
    {
      "name": "iotId",
      "type": "string",
      "doc": "iotId of this FeatureOfInterest"
    },
    {
      "name": "name",
      "type": "string",
      "doc": "Name of the FeatureOfInterest"
    },
    {
      "name": "description",
      "type": "string",
      "doc": "Description of the FeatureOfInterest"
    },
    {
      "name": "encodingType",
      "type": "string",
      "doc": "Representation/encoding type of the FeatureOfInterest"
    },
    {
      "name": "feature",
      "type": {
        "type": "record",
        "name": "LocationType",
        "doc": "The type of the location, for example Point",
        "fields": [
          {
            "name": "type",
            "type": "string",
            "doc": "Name of the LocationType"
          },
          {
            "name": "coordinates",
            "type": "string",
            "doc": "Coordinates for the LocationType"
          }
        ]
      },
      "doc": "LocationType object containing the feature of the Thing"
    },
    {
      "name": "Observations",
      "type": [
        "null",
        "string"
      ],
      "doc": "Observations associated with this FeatureOfInterest"
    }
  ]
}

I hope you can answer my questions.

My question is how I can achieve this.

Another question is how I can use Avro as the serde on a KStream.
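From the Confluent examples it looks like the Avro serde does not have to be configured only through the default serde properties; it can also be instantiated explicitly and passed per topic via Consumed.with(...). A minimal sketch of what I mean (using my schema registry URL; I am not sure this is the intended way) would be:

    // Sketch: configure the Confluent GenericAvroSerde explicitly and pass it per topic,
    // instead of (or in addition to) DEFAULT_VALUE_SERDE_CLASS_CONFIG.
    final Serde<GenericRecord> avroValueSerde = new GenericAvroSerde();
    avroValueSerde.configure(
            Collections.singletonMap(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG,
                    "http://192.168.56.3:8081"),
            false); // false = the serde is used for record values, not keys

    StreamsBuilder builder = new StreamsBuilder();
    KStream<String, GenericRecord> observations =
            builder.stream("Observations", Consumed.with(Serdes.String(), avroValueSerde));
    KTable<String, GenericRecord> featuresOfInterest =
            builder.table("FeaturesOfInterest", Consumed.with(Serdes.String(), avroValueSerde));

For reference, this is what my current program looks like: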

    public class MergeB {

        private static final String ObservationTopic = "Observations";
        private static final String FeatureOfInterestTopic = "FeaturesOfInterest";
        private static final String outputTopic = "ObservationsMerges";

        public static void main(String[] args) throws InterruptedException {

            Properties props = new Properties();
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "192.168.56.3:9092");
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "Merge");
            props.put(StreamsConfig.CLIENT_ID_CONFIG, "example-client");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
            props.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://192.168.56.3:8081");
            props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

            final Serde<String> stringSerde = Serdes.String();   // currently unused
            final Serde<Long> longSerde = Serdes.Long();          // currently unused

            StreamsBuilder builder = new StreamsBuilder();
            final KTable<String, GenericRecord> foIT = builder.table(FeatureOfInterestTopic);
            final KTable<String, GenericRecord> obsT = builder.table(ObservationTopic);

            // My attempt: for each Observation, read its FeatureOfInterest reference and
            // try to compare it against the iotId of every FeatureOfInterest record.
            obsT.mapValues(value -> {
                String obsIot = value.get("FeatureOfInterest").toString();
                System.out.println(obsIot);

                // This inner mapValues is never reached when the program runs.
                foIT.mapValues(value1 -> {
                    String foiIot = value1.get("iotId").toString();
                    boolean test = obsIot.equals(foiIot);
                    System.out.println(obsIot + " " + foiIot + " " + test);
                    return value1;
                });

                return value;
            });

            KafkaStreams kafkaStreams1 = new KafkaStreams(builder.build(), props);
            kafkaStreams1.start();

            Runtime.getRuntime().addShutdownHook(new Thread(kafkaStreams1::close));
        }
    }

Should that work? My program never gets into that foIT.mapValues.
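From reading the Kafka Streams documentation, I suspect the join has to be expressed as a re-keyed KStream joined against the KTable rather than as nested mapValues calls. Below is a sketch of what I imagine it could look like (assuming the Observation value's "FeatureOfInterest" field really holds the Kafka key of the corresponding FeatureOfInterest record, and keeping GenericRecord values); I am not sure this is the right approach:

    // Sketch only: read the Observations as a stream, re-key them by their
    // FeatureOfInterest reference and join them against the FeaturesOfInterest table.
    StreamsBuilder builder = new StreamsBuilder();
    KStream<String, GenericRecord> observations = builder.stream(ObservationTopic);
    KTable<String, GenericRecord> featuresOfInterest = builder.table(FeatureOfInterestTopic);

    observations
        // FeatureOfInterest is optional in the Observation schema, so drop records without it
        .filter((key, obs) -> obs.get("FeatureOfInterest") != null)
        // use the Observation's FeatureOfInterest reference as the new record key
        .selectKey((key, obs) -> obs.get("FeatureOfInterest").toString())
        // join on the (new) record key; Kafka Streams repartitions the stream automatically
        .join(featuresOfInterest, (obs, foi) -> {
            // For now just print the match and forward the Observation unchanged,
            // since I do not have a merged output schema yet.
            System.out.println(obs.get("iotId") + " matched " + foi.get("iotId")
                    + " with feature " + foi.get("feature"));
            return obs;
        })
        .to(outputTopic);

I also do not know yet how to build the merged Avro record (all Observation fields plus the FeatureOfInterest "feature") for the output topic.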

0 Answers:

There are no answers yet.