java.lang.NoClassDefFoundError during Cassandra trigger creation

Asked: 2017-08-25 13:29:57

Tags: cassandra kafka-producer-api

Hi, I'm writing a small Cassandra trigger that sends information to Kafka after an insert into a certain table. Here is my trigger code:


import java.nio.charset.Charset;
import java.text.SimpleDateFormat;
import java.util.Collection;
import java.util.Collections;
import java.util.Date;
import java.util.Iterator;
import java.util.Properties;

import org.apache.cassandra.config.CFMetaData;
import org.apache.cassandra.db.Clustering;
import org.apache.cassandra.db.Mutation;
import org.apache.cassandra.db.partitions.Partition;
import org.apache.cassandra.db.rows.Cell;
import org.apache.cassandra.db.rows.Unfiltered;
import org.apache.cassandra.db.rows.UnfilteredRowIterator;
import org.apache.cassandra.triggers.ITrigger;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.json.simple.JSONObject; // JSON library assumed to be json-simple

public class InsertDataTrigger implements ITrigger {

    public Collection<Mutation> augment(Partition update) {

        //checking if trigger works and some debug info;
        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy/MM/dd HH:mm:ss");
        System.out.println("Hello " + dateFormat.format(new Date()));
        System.out.println("This Insert Data Trigger");
        System.out.println("default charset " + Charset.defaultCharset());      //IMPORTANT check if it's important

        //here we're gonna build the message to kafka based on inserted data
        try {
            UnfilteredRowIterator it = update.unfilteredIterator();
            CFMetaData cfMetaData = update.metadata();

            System.out.println("PartitionKey " + new String(update.partitionKey().getKey().array()));
            System.out.println("update.metadata().clusteringColumns().toString() " + update.metadata().clusteringColumns().toString());

            while (it.hasNext()) {
                JSONObject message = new JSONObject();

                Unfiltered un = it.next();
                Clustering clt = (Clustering) un.clustering();

                message.put("partitionkey", new String(update.partitionKey().getKey().array()));

                System.out.println("clt.toString(cfMetaData) " + clt.toString(cfMetaData));
                System.out.println("clt.getRawValues() " + new String(clt.getRawValues()[0].array()));
                System.out.println("partition.columns().toString() " + update.columns().toString());

                message.put("datetime", new String(clt.getRawValues()[0].array()));

                Iterator<Cell> cells = update.getRow(clt).cells().iterator();

                while (cells.hasNext()) {
                    Cell cell = cells.next();
                    System.out.println("cell.column().name.toString() " + cell.column().name.toString());
                    System.out.println("cell.toString()" + cell.toString());
                    Double x = cell.value().getDouble();
                    System.out.println("cell.value().getDouble() " + x);
                    //if(cell.column().name.toString() == "value")
                    System.out.println(x);
                    message.put(cell.column().name.toString(), x);
                    //else
                    //   message.put(cell.column().name.toString(),cell.value().toString());
                }
                System.out.println("un.toString()" + un.toString(cfMetaData));

                if (!message.isEmpty()) {
                    System.out.println(message.toString());

                    //Sending data to kafka
                    Properties props = new Properties();
                    props.put("bootstrap.servers", "localhost:9092");
                    props.put("acks", "all");
                    props.put("retries", 0);
                    props.put("batch.size", 16384);
                    props.put("linger.ms", 1);
                    props.put("buffer.memory", 33554432);
                    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
                    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

                    // note: creating a new KafkaProducer per row is expensive; reuse one instance in production
                    Producer<String, String> producer = new KafkaProducer<>(props);
                    producer.send(new ProducerRecord<>("test", message.toString()));//move topic name to some properties
                    producer.close();
                }


            }
        } catch (Exception e) {
            e.printStackTrace();
        }

        return Collections.emptyList();
    }
}

The project builds fine and produces a jar file, but when I try to create the trigger in Cassandra, it fails with the exception mentioned in the title.
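For reference, this is the kind of deployment sequence involved; it is only a sketch, and the jar name, triggers directory path, and keyspace/table names are assumptions for illustration:

```shell
# copy the trigger jar into Cassandra's triggers directory (path varies by install)
cp insert-data-trigger.jar /etc/cassandra/triggers/

# pick up the new jar without restarting the node
nodetool reloadtriggers

# then, in cqlsh, attach the trigger to the table:
# CREATE TRIGGER insert_data_trigger ON my_keyspace.my_table
#   USING 'InsertDataTrigger';
```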

1 Answer:

Answer 0 (score: 2)

The kafka-clients jar is most likely not in Cassandra's lib directory, unless your project bundles that dependency (i.e., you build a fat/uber jar).
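One quick way to confirm this is a small standalone check, run with the same classpath Cassandra uses; the class name to probe comes from the NoClassDefFoundError stack trace. This is a minimal sketch (the class name `ClasspathCheck` is just an illustrative choice):

```java
// Minimal sketch: checks whether a class can be loaded from the current classpath.
// Run with the same classpath Cassandra uses to see if kafka-clients is visible.
public class ClasspathCheck {

    // Returns true if the named class can be loaded by the current classloader.
    static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String cls = args.length > 0 ? args[0]
                : "org.apache.kafka.clients.producer.KafkaProducer";
        System.out.println(cls + (isOnClasspath(cls) ? " is visible" : " is NOT visible"));
    }
}
```

If the class is not visible, either copy kafka-clients (and its transitive dependencies) into Cassandra's lib directory or bundle them into the trigger jar.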

There can also be dependency conflicts between the kafka-clients jar and Cassandra's own dependencies; in particular, org.xerial.snappy snappy-java ships in different versions. It may work out fine, but it is something to watch for. If it does become a problem, you can build your own Kafka client jar with its dependencies shaded so that they don't conflict.
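With Maven, the fat jar and the shading can both be done with the maven-shade-plugin; a minimal pom fragment as a sketch (the plugin version and the `shaded.` prefix are illustrative, and Cassandra's own artifacts should be declared with `provided` scope so they are not bundled):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <!-- relocate the bundled snappy-java so it cannot clash
               with the copy Cassandra ships in its lib directory -->
          <relocation>
            <pattern>org.xerial.snappy</pattern>
            <shadedPattern>shaded.org.xerial.snappy</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```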