JMeter Kafka consumer throws error [ClassCastException: [Ljava.lang.String; cannot be cast to java.util.List]

Date: 2018-05-03 12:19:48

Tags: jmeter apache-kafka kafka-consumer-api

I am trying to read Kafka messages with a Kafka consumer in JMeter using a JSR223 Sampler, but I cannot understand the error:


[Response message: javax.script.ScriptException: javax.script.ScriptException: java.lang.ClassCastException: [Ljava.lang.String; cannot be cast to java.util.List]

Please help me resolve the problem so that I can subscribe to the topic and consume messages with the Kafka consumer.

import java.util.Properties;
import java.util.Arrays;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.ConsumerRecord;


Properties props = new Properties();
String groupID = "REQUEST_RESPONSE_JOB_GROUP";
String clientID =  "REQUEST_RESPONSE_JOB_CLIENT";
String BSID = "kafka:9092";
String topic = "PROC_REST_EVENTS";
props.put("bootstrap.servers", BSID);
props.put("group.id", groupID);
props.put("client.id", clientID);
props.put("enable.auto.commit", "true");
props.put("auto.commit.interval.ms", "1000");
props.put("session.timeout.ms", "30000");
props.put("key.deserializer","org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer","org.apache.kafka.common.serialization.StringDeserializer");
props.put("partition.assignment.strategy","org.apache.kafka.clients.consumer.RangeAssignor");

KafkaConsumer<String, String> consumer = new KafkaConsumer<String, String>(props);
//Kafka Consumer subscribes list of topics here.
consumer.subscribe(Arrays.asList(topic));
//print the topic name
System.out.println("Subscribed to topic " + topic);

while (true) {
    ConsumerRecords<String, String> records = consumer.poll(100);
    for (ConsumerRecord<String, String> record : records) {
        // print the offset, key and value for the consumer records.
        System.out.printf("offset = %d, key = %s, value = %s\n",
                record.offset(), record.key(), record.value());
    }
    return records;
}

1 Answer:

Answer 0 (score: 0)

Your consumer expects a List while you are most likely getting a String from the Kafka topic; you need to amend the consumer configuration to match the types that actually come from the topic.
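For instance, the deserializers declared in the consumer properties have to mirror the serializers the producer used when writing to the topic. Below is a minimal Java sketch of that idea, assuming the topic from the question (PROC_REST_EVENTS) was written with String keys and String values; treat the serializer choice as an assumption to verify against your producer:

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

Properties props = new Properties();
props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:9092");
props.put(ConsumerConfig.GROUP_ID_CONFIG, "REQUEST_RESPONSE_JOB_GROUP");
// assumption: the producer used StringSerializer for both key and value,
// so the consumer must configure the matching StringDeserializer on both
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
// subscribe() takes a Collection of topic names, not a String array
consumer.subscribe(Collections.singletonList("PROC_REST_EVENTS"));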

Try the following Groovy code, which sends 3 messages to the test topic (you need to create it if it does not exist) and reads them back afterwards.

import org.apache.kafka.clients.consumer.ConsumerConfig
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.clients.producer.KafkaProducer
import org.apache.kafka.clients.producer.ProducerConfig
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.LongDeserializer
import org.apache.kafka.common.serialization.LongSerializer
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.kafka.common.serialization.StringSerializer

def BOOTSTRAP_SERVERS = 'localhost:9092'
def TOPIC = 'test'
Properties kafkaProps = new Properties()
kafkaProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, BOOTSTRAP_SERVERS)
kafkaProps.put(ProducerConfig.CLIENT_ID_CONFIG, 'KafkaExampleProducer')
kafkaProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, LongSerializer.class.getName())
kafkaProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName())
kafkaProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, BOOTSTRAP_SERVERS)
kafkaProps.put(ConsumerConfig.GROUP_ID_CONFIG, 'KafkaExampleConsumer')
kafkaProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, LongDeserializer.class.getName())
kafkaProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName())
def producer = new KafkaProducer<>(kafkaProps)
def consumer = new KafkaConsumer<>(kafkaProps)
// send 3 test messages to the topic
1.upto(3) {
    def record = new ProducerRecord<>(TOPIC, it as long, 'Hello from JMeter ' + it)
    producer.send(record)
    log.info('Sent record(key=' + record.key() + ', value=' + record.value() + ')')
}

consumer.subscribe(Collections.singletonList(TOPIC))
// stop polling after 100 consecutive empty polls
final int giveUp = 100
int noRecordsCount = 0
while (true) {
    def consumerRecords = consumer.poll(1000)
    if (consumerRecords.count() == 0) {
        noRecordsCount++
        if (noRecordsCount > giveUp) break
        else continue
    }
    consumerRecords.each { record ->
        log.info('Received Record:(' + record.key() + ', ' + record.value() + ')')
    }
    consumer.commitAsync()
}
consumer.close()

You should see output like this:

[Screenshot: JMeter Kafka Produce And Consume Message]

Once done, you should be able to use the above code as the basis for your own Kafka message tests. For more information on Kafka load testing with JMeter, see the Apache Kafka - How to Load Test with JMeter article.
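As a side note, the consumer.poll(1000) call in the Groovy example above uses the long-millisecond overload, which kafka-clients 2.0+ deprecates in favour of poll(Duration). A minimal Java sketch of one poll iteration with that overload, assuming a KafkaConsumer<Long, String> named consumer configured with the same deserializers as above:

import java.time.Duration;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;

// assumes a KafkaConsumer<Long, String> named 'consumer', configured as in the example above
// single poll iteration using the Duration-based overload (kafka-clients 2.0+)
ConsumerRecords<Long, String> records = consumer.poll(Duration.ofSeconds(1));
for (ConsumerRecord<Long, String> record : records) {
    System.out.printf("offset = %d, key = %d, value = %s%n",
            record.offset(), record.key(), record.value());
}
// commit offsets for the records just processed
consumer.commitAsync();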