I'm trying to determine whether it's possible to use Kafka's transaction feature to write to two topics within a single transaction.
I know the typical scenario for Kafka transactions is the consume-process-produce pattern, and that seems well documented.
What I have tried:
- Created two instances of KafkaTransactionManager, each configured with its own ProducerFactory
- Created a ChainedTransactionManager from the two KafkaTransactionManager instances
- Created a KafkaTemplate for each topic
I then used the @Transactional(transactionManager = "chainedTx") annotation on a method that does the following:
template1.send("topic1", "example payload");
template2.send("topic2", "example payload");
This doesn't work. The KafkaTemplate is transactional, but when the send() method is called there is no transaction in progress, so I get an IllegalStateException.
I was going to try the KafkaTemplate.executeInTransaction() method, but the Javadoc states it is only for local transactions, so it doesn't appear to fit my needs.
My next step is to try Kafka's Producer API directly to see whether this pattern works, but I would appreciate it if someone could tell me whether I'm wasting my time and Kafka simply doesn't support transactional writes to multiple topics.
I did find this statement in Confluent's blog post on Kafka transaction support:
Transactions enable atomic writes to multiple Kafka topics and partitions...
but I haven't found any example that demonstrates it.
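For reference, a minimal sketch of what that raw Producer API experiment could look like (the broker address is a placeholder, and "topic1"/"topic2" stand in for real topic names): a single transactional producer can write to two topics atomically, since the transaction belongs to the producer, not to a topic.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.KafkaException;
import org.apache.kafka.common.serialization.StringSerializer;

public class TwoTopicTransaction {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // Setting a transactional.id makes the producer transactional (and idempotent).
        props.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "two-topic-tx-1");

        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        producer.initTransactions();
        try {
            producer.beginTransaction();
            // Both sends belong to the same transaction even though the
            // records go to two different topics.
            producer.send(new ProducerRecord<>("topic1", "example payload"));
            producer.send(new ProducerRecord<>("topic2", "example payload"));
            producer.commitTransaction();
        }
        catch (KafkaException e) {
            producer.abortTransaction();
            throw e;
        }
        finally {
            producer.close();
        }
    }

}
```

With a consumer running at isolation.level=read_committed, either both records become visible or neither does. This sketch requires a running broker, so it is illustrative rather than verified.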
Configuration for the first producer
@Configuration
public class ControlProducerConfig {
@Bean("controlTransactionManager")
KafkaTransactionManager<String, String> transactionManager() {
return new KafkaTransactionManager<>(factory());
}
@Bean("controlTemplate")
public KafkaTemplate<String, String> template() {
return new KafkaTemplate<>(factory());
}
private ProducerFactory<String, String> factory() {
DefaultKafkaProducerFactory<String, String> factory = new DefaultKafkaProducerFactory<>(config());
factory.setTransactionIdPrefix("abcd");
return factory;
}
private Map<String, Object> config() {
Map<String, Object> props = new HashMap<>();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "xxx.xxx.xxx.xxx");
props.put("schema.registry.url", "http://xxx.xxx.xxx.xxx/");
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
// you can't set idempotence without setting max in flight requests to <= 5
props.put(ProducerConfig.MAX_IN_FLIGHT_REQUESTS_PER_CONNECTION, 1);
props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
props.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "1234");
return props;
}
}
Configuration for the second producer
@Configuration
public class PayloadProducerConfig {
@Bean("payloadTransactionManager")
KafkaTransactionManager<String, String> transactionManager() {
return new KafkaTransactionManager<>(factory());
}
@Bean("payloadTemplate")
public KafkaTemplate<String, String> template() {
return new KafkaTemplate<>(factory());
}
private ProducerFactory<String, String> factory() {
DefaultKafkaProducerFactory<String, String> factory = new DefaultKafkaProducerFactory<>(config());
factory.setTransactionIdPrefix("abcd");
return factory;
}
private Map<String, Object> config() {
Map<String, Object> props = new HashMap<>();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "xxx.xxx.xxx.xxx");
props.put("schema.registry.url", "http://xxx.xxx.xxx.xxx/");
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
// you can't set idempotence without setting max in flight requests to <= 5
props.put(ProducerConfig.MAX_IN_FLIGHT_REQUESTS_PER_CONNECTION, 1);
props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
props.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "1234");
return props;
}
}
Main class
@EnableTransactionManagement
@SpringBootApplication
public class App {
public static void main(String[] args) {
SpringApplication.run(App.class, args);
}
@Bean("chainedTx")
public ChainedTransactionManager chained(
@Qualifier("controlTransactionManager") KafkaTransactionManager controlTransactionManager,
@Qualifier("payloadTransactionManager") KafkaTransactionManager payloadTransactionManager) {
return new ChainedTransactionManager(controlTransactionManager, payloadTransactionManager);
}
@Bean OnStart onStart(PostTwoMessages postTwoMessages) {
return new OnStart(postTwoMessages);
}
@Bean
public PostTwoMessages postTwoMessages(
@Qualifier("controlTemplate") KafkaTemplate<String, String> controlTemplate,
@Qualifier("payloadTemplate") KafkaTemplate<String, String> payloadTemplate) {
return new PostTwoMessages(controlTemplate, payloadTemplate);
}
}
On application startup
public class OnStart implements ApplicationListener<ApplicationReadyEvent> {
private PostTwoMessages postTwoMessages;
public OnStart(PostTwoMessages postTwoMessages) {
this.postTwoMessages = postTwoMessages;
}
@Override
public void onApplicationEvent(ApplicationReadyEvent event) {
postTwoMessages.run();
}
}
Posting the two messages
public class PostTwoMessages {
private final KafkaTemplate<String, String> controlTemplate;
private final KafkaTemplate<String, String> payloadTemplate;
public PostTwoMessages(
@Qualifier("controlTemplate") KafkaTemplate<String, String> controlTemplate,
@Qualifier("payloadTemplate") KafkaTemplate<String, String> payloadTemplate) {
this.controlTemplate = controlTemplate;
this.payloadTemplate = payloadTemplate;
}
@Transactional(transactionManager = "chainedTx")
public void run() {
UUID uuid = UUID.randomUUID();
controlTemplate.send("private.s0869y.trx.model3a", "control: " + uuid);
payloadTemplate.send("private.s0869y.trx.model3b", "payload: " + uuid);
}
}
Answer 0 (score: 1)
It should work; do you have @EnableTransactionManagement?
However, the transaction cannot span two different producers; you have to do both sends using the same template. Otherwise they are two different transactions.
EDIT
Here is an example with a Spring Boot application:
EDIT2
Updated the example to also show using a local transaction via executeInTransaction.
@SpringBootApplication
public class So54865968Application {
public static void main(String[] args) {
SpringApplication.run(So54865968Application.class, args);
}
@Bean
public ApplicationRunner runner(Foo foo) {
return args -> {
foo.runInTx();
System.out.println("Committed 1");
foo.runInLocalTx();
System.out.println("Committed 2");
};
}
@Bean
public Foo foo(KafkaTemplate<String, Object> template) {
return new Foo(template);
}
@Bean
public Bar bar() {
return new Bar();
}
@Bean
public NewTopic topic1() {
return new NewTopic("so54865968-1", 1, (short) 1);
}
@Bean
public NewTopic topic2() {
return new NewTopic("so54865968-2", 1, (short) 1);
}
public static class Foo {
private final KafkaTemplate<String, Object> template;
public Foo(KafkaTemplate<String, Object> template) {
this.template = template;
}
@Transactional(transactionManager = "kafkaTransactionManager")
public void runInTx() throws InterruptedException {
this.template.send("so54865968-1", 42);
this.template.send("so54865968-2", "texttest");
System.out.println("Sent 2; waiting a few seconds to commit");
Thread.sleep(5_000);
}
public void runInLocalTx() throws InterruptedException {
this.template.executeInTransaction(t -> {
t.send("so54865968-1", 43);
t.send("so54865968-2", "texttest2");
System.out.println("Sent 2; waiting a few seconds to commit");
try {
Thread.sleep(5_000);
}
catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
return true;
});
}
}
public static class Bar {
@KafkaListener(id = "foo", topics = { "so54865968-1", "so54865968-2" })
public void handler(byte[] bytes) {
if (bytes.length == 4) {
ByteBuffer bb = ByteBuffer.wrap(bytes);
System.out.println("Received int " + bb.getInt());
}
else {
System.out.println("Received string " + new String(bytes));
}
}
}
}
and
spring.kafka.producer.transaction-id-prefix=tx-id
spring.kafka.producer.properties.value.serializer=com.example.CompositeSerializer
spring.kafka.consumer.enable-auto-commit=false
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.properties.isolation.level=read_committed
spring.kafka.consumer.properties.value.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
and
public class CompositeSerializer implements Serializer<Object> {
private final StringSerializer stringSerializer = new StringSerializer();
private final IntegerSerializer intSerializer = new IntegerSerializer();
@Override
public void configure(Map<String, ?> configs, boolean isKey) {
}
@Override
public byte[] serialize(String topic, Object data) {
return data instanceof Integer ? intSerializer.serialize(topic, (Integer) data)
: stringSerializer.serialize(topic, (String) data);
}
@Override
public void close() {
}
}
Both came through after the 5-second pauses.