I have a flow where I need to implement JdbcMetadataStore in the file filter. I was using SimpleMetadataStore(), but that caused problems because it is in-memory and I need a shared metadata store, so I installed a Postgres DB, which I can see is supported by the JDBC module. Following the documentation, I declared a bean that returns a JdbcMetadataStore, but I am not sure how to use it in the filter; I searched for examples but could not find one. Note that I am using FileSystemPersistentAcceptOnceFileListFilter, and the data source for Postgres is set in my application properties. I have pasted my code below; can anyone guide me on how to proceed?
private DataSource dataSource;

public IntegrationFlow localToFtpFlow(Branch myBranch) {
    return IntegrationFlows.from(Files.inboundAdapter(new File(myBranch.getBranchCode()))
                    .filter(new ChainFileListFilter<File>()
                            .addFilter(new RegexPatternFileListFilter("final" + myBranch.getBranchCode() + ".csv"))
                            .addFilter(new FileSystemPersistentAcceptOnceFileListFilter(new SimpleMetadataStore(), "foo"))),
            e -> e.poller(Pollers.fixedDelay(10_000)))
            .transform(p -> {
                LOG1.info("Sending file " + p + " to FTP branch " + myBranch.getBranchCode());
                return p;
            })
            .log()
            .handle(Ftp.outboundAdapter(createNewFtpSessionFactory(myBranch), FileExistsMode.REPLACE)
                    .useTemporaryFileName(true)
                    .autoCreateDirectory(false)
                    .remoteDirectory(myBranch.getFolderPath()))
            .get();
}

public DefaultFtpSessionFactory createNewFtpSessionFactory(Branch branch) {
    final DefaultFtpSessionFactory factory = new DefaultFtpSessionFactory();
    factory.setHost(branch.getHost());
    factory.setUsername(branch.getUsern());
    factory.setPort(branch.getFtpPort());
    factory.setPassword(branch.getPassword());
    return factory;
}

@Bean
public MetadataStore metadataStore(final DataSource dataSource) {
    return new JdbcMetadataStore(dataSource);
}
I was getting the error below until I created the table manually; shouldn't the table be created automatically in Postgres, since that database is supported?
org.springframework.messaging.MessagingException: nested exception is org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [INSERT INTO INT_METADATA_STORE(METADATA_KEY, METADATA_VALUE, REGION) SELECT ?, ?, ? FROM INT_METADATA_STORE WHERE METADATA_KEY=? AND REGION=? HAVING COUNT(*)=0]; nested exception is org.postgresql.util.PSQLException: ERROR: relation "int_metadata_store" does not exist
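For reference, the INT_METADATA_STORE DDL ships inside the spring-integration-jdbc jar (schema-postgresql.sql for Postgres) and is not applied automatically by JdbcMetadataStore itself. A minimal sketch of one way to have the table created at startup, assuming a Spring Boot 2.x application.properties:

# Assumption: Spring Boot 2.x, which can run the Spring Integration JDBC schema
# scripts at startup (the default for this property is "embedded")
spring.integration.jdbc.initialize-schema=always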
Logging a second issue below: when I add another flow for a second server, it also triggers the first flow's handling and both files are sent to the FTP server:
Saved Branch : BEY
Hibernate: select branch0_._id as _id1_0_0_, branch0_.branch_code as branch_c2_0_0_, branch0_.folder_path as folder_p3_0_0_, branch0_.ftp_port as ftp_port4_0_0_, branch0_.host as host5_0_0_, branch0_.password as password6_0_0_, branch0_.usern as usern7_0_0_ from branch branch0_ where branch0_._id=?
BEY
2019-01-07 15:11:25.816 INFO 12940 --- [nio-8081-exec-5] o.s.integration.channel.DirectChannel : Channel 'application.intermediateChannel' has 2 subscriber(s).
2019-01-07 15:11:25.817 INFO 12940 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : started 1.org.springframework.integration.config.ConsumerEndpointFactoryBean#1
2019-01-07 15:11:25.817 INFO 12940 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : Adding {transformer} as a subscriber to the '1.channel#0' channel
2019-01-07 15:11:25.817 INFO 12940 --- [nio-8081-exec-5] o.s.integration.channel.DirectChannel : Channel 'application.1.channel#0' has 1 subscriber(s).
2019-01-07 15:11:25.817 INFO 12940 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : started 1.org.springframework.integration.config.ConsumerEndpointFactoryBean#0
2019-01-07 15:11:25.829 INFO 12940 --- [nio-8081-exec-5] o.s.i.e.SourcePollingChannelAdapter : started stockInboundPoller
BEY
2019-01-07 15:11:25.984 INFO 12940 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : Adding {message-handler} as a subscriber to the '1o.channel#2' channel
2019-01-07 15:11:25.984 INFO 12940 --- [nio-8081-exec-5] o.s.integration.channel.DirectChannel : Channel 'application.1o.channel#2' has 1 subscriber(s).
2019-01-07 15:11:25.984 INFO 12940 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : started 1o.org.springframework.integration.config.ConsumerEndpointFactoryBean#1
2019-01-07 15:11:25.984 INFO 12940 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : Adding {transformer} as a subscriber to the '1o.channel#0' channel
2019-01-07 15:11:25.984 INFO 12940 --- [nio-8081-exec-5] o.s.integration.channel.DirectChannel : Channel 'application.1o.channel#0' has 1 subscriber(s).
2019-01-07 15:11:25.984 INFO 12940 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : started 1o.org.springframework.integration.config.ConsumerEndpointFactoryBean#0
2019-01-07 15:11:25.984 INFO 12940 --- [nio-8081-exec-5] o.s.i.e.SourcePollingChannelAdapter : started 1o.org.springframework.integration.config.SourcePollingChannelAdapterFactoryBean#0
2019-01-07 15:11:42.655 INFO 12940 --- [ask-scheduler-4] o.s.integration.ftp.session.FtpSession : File has been successfully transferred from: /ftp/erbranch/EDMS/FEFO/FEFOexportBEY.csv
Hibernate: select branch0_._id as _id1_0_, branch0_.branch_code as branch_c2_0_, branch0_.folder_path as folder_p3_0_, branch0_.ftp_port as ftp_port4_0_, branch0_.host as host5_0_, branch0_.password as password6_0_, branch0_.usern as usern7_0_ from branch branch0_
Hibernate: insert into branch (branch_code, folder_path, ftp_port, host, password, usern) values (?, ?, ?, ?, ?, ?)
Hibernate: select currval('branch__id_seq')
Saved Branch : JNB
Hibernate: select branch0_._id as _id1_0_0_, branch0_.branch_code as branch_c2_0_0_, branch0_.folder_path as folder_p3_0_0_, branch0_.ftp_port as ftp_port4_0_0_, branch0_.host as host5_0_0_, branch0_.password as password6_0_0_, branch0_.usern as usern7_0_0_ from branch branch0_ where branch0_._id=?
JNB
2019-01-07 15:13:36.099 INFO 12940 --- [nio-8081-exec-7] o.s.integration.channel.DirectChannel : Channel 'application.intermediateChannel' has 3 subscriber(s).
2019-01-07 15:13:36.099 INFO 12940 --- [nio-8081-exec-7] o.s.i.endpoint.EventDrivenConsumer : started 2.org.springframework.integration.config.ConsumerEndpointFactoryBean#1
2019-01-07 15:13:36.099 INFO 12940 --- [nio-8081-exec-7] o.s.i.endpoint.EventDrivenConsumer : Adding {transformer} as a subscriber to the '2.channel#0' channel
2019-01-07 15:13:36.099 INFO 12940 --- [nio-8081-exec-7] o.s.integration.channel.DirectChannel : Channel 'application.2.channel#0' has 1 subscriber(s).
2019-01-07 15:13:36.099 INFO 12940 --- [nio-8081-exec-7] o.s.i.endpoint.EventDrivenConsumer : started 2.org.springframework.integration.config.ConsumerEndpointFactoryBean#0
2019-01-07 15:13:36.099 INFO 12940 --- [nio-8081-exec-7] o.s.i.e.SourcePollingChannelAdapter : started stockInboundPoller
JNB
2019-01-07 15:13:36.130 INFO 12940 --- [nio-8081-exec-7] o.s.i.endpoint.EventDrivenConsumer : Adding {message-handler} as a subscriber to the '2o.channel#2' channel
2019-01-07 15:13:36.135 INFO 12940 --- [nio-8081-exec-7] o.s.integration.channel.DirectChannel : Channel 'application.2o.channel#2' has 1 subscriber(s).
2019-01-07 15:13:36.135 INFO 12940 --- [nio-8081-exec-7] o.s.i.endpoint.EventDrivenConsumer : started 2o.org.springframework.integration.config.ConsumerEndpointFactoryBean#1
2019-01-07 15:13:36.135 INFO 12940 --- [nio-8081-exec-7] o.s.i.endpoint.EventDrivenConsumer : Adding {transformer} as a subscriber to the '2o.channel#0' channel
2019-01-07 15:13:36.135 INFO 12940 --- [nio-8081-exec-7] o.s.integration.channel.DirectChannel : Channel 'application.2o.channel#0' has 1 subscriber(s).
2019-01-07 15:13:36.135 INFO 12940 --- [nio-8081-exec-7] o.s.i.endpoint.EventDrivenConsumer : started 2o.org.springframework.integration.config.ConsumerEndpointFactoryBean#0
2019-01-07 15:13:36.135 INFO 12940 --- [nio-8081-exec-7] o.s.i.e.SourcePollingChannelAdapter : started 2o.org.springframework.integration.config.SourcePollingChannelAdapterFactoryBean#0
2019-01-07 15:13:40.981 INFO 12940 --- [ask-scheduler-1] o.s.integration.ftp.session.FtpSession : File has been successfully transferred from: /ftp/erbranch/EDMS/FEFO/FEFOexportJNB.csv
2019-01-07 15:13:46.085 INFO 12940 --- [ask-scheduler-7] o.s.i.file.FileReadingMessageSource : Created message: [GenericMessage [payload=BEY\finalBEY.csv, headers={file_originalFile=BEY\finalBEY.csv, id=42a97889-7bfb-8f77-75d8-4e7988a368f9, file_name=finalBEY.csv, file_relativePath=finalBEY.csv, timestamp=1546866826085}]]
2019-01-07 15:13:46.086 INFO 12940 --- [ask-scheduler-7] o.s.integration.handler.LoggingHandler : GenericMessage [payload=BEY\finalBEY.csv, headers={file_originalFile=BEY\finalBEY.csv, id=108a92b0-db42-620e-1c46-90652a071220, file_name=finalBEY.csv, file_relativePath=finalBEY.csv, timestamp=1546866826086}]
2019-01-07 15:13:46.160 INFO 12940 --- [ask-scheduler-8] o.s.i.file.FileReadingMessageSource : Created message: [GenericMessage [payload=JNB\finalJNB.csv, headers={file_originalFile=JNB\finalJNB.csv, id=d3b2c6a0-2e9c-42a8-c224-0ed9cbbfaabb, file_name=finalJNB.csv, file_relativePath=finalJNB.csv, timestamp=1546866826160}]]
2019-01-07 15:13:46.161 INFO 12940 --- [ask-scheduler-8] o.s.integration.handler.LoggingHandler : GenericMessage [payload=JNB\finalJNB.csv, headers={file_originalFile=JNB\finalJNB.csv, id=e34070c2-e6ff-e5e1-8c64-4af697ab1032, file_name=finalJNB.csv, file_relativePath=finalJNB.csv, timestamp=1546866826161}]
2019-01-07 15:13:47.129 INFO 12940 --- [ask-scheduler-7] o.s.integration.ftp.session.FtpSession : File has been successfully transferred to: /ftp/erbranch/EDMS/FEFO/finalBEY.csv.writing
2019-01-07 15:13:47.534 INFO 12940 --- [ask-scheduler-7] o.s.integration.ftp.session.FtpSession : File has been successfully renamed from: /ftp/erbranch/EDMS/FEFO/finalBEY.csv.writing to /ftp/erbranch/EDMS/FEFO/finalBEY.csv
2019-01-07 15:13:49.772 INFO 12940 --- [ask-scheduler-8] o.s.integration.ftp.session.FtpSession : File has been successfully transferred to: /ftp/erbranch/EDMS/FEFO/finalJNB.csv.writing
2019-01-07 15:13:50.757 INFO 12940 --- [ask-scheduler-8] o.s.integration.ftp.session.FtpSession : File has been successfully renamed from: /ftp/erbranch/EDMS/FEFO/finalJNB.csv.writing to /ftp/erbranch/EDMS/FEFO/finalJNB.csv
You can find my application here: https://github.com/EliasKhattar/Spring-Integration-Project/tree/master/spring4ftpappftp
Answer (score: 0)
Just use metadataStore(dataSource) in the filter constructor instead of new SimpleMetadataStore().
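A minimal sketch of what that change looks like in the filter chain from the question, assuming the metadataStore(DataSource) bean is declared to return ConcurrentMetadataStore (as in the full example below) and the dataSource field shown above is available in the same configuration class:

.filter(new ChainFileListFilter<File>()
        .addFilter(new RegexPatternFileListFilter("final" + myBranch.getBranchCode() + ".csv"))
        // reference the shared JdbcMetadataStore bean instead of an in-memory SimpleMetadataStore
        .addFilter(new FileSystemPersistentAcceptOnceFileListFilter(metadataStore(dataSource), "foo")))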
EDIT
I just copied your flow into a new application (with a few changes, but none to the filter) and it all works fine for me...
@SpringBootApplication
public class So54039852Application {

    public static void main(String[] args) {
        SpringApplication.run(So54039852Application.class, args);
    }

    @Bean
    public IntegrationFlow localToFtpFlow(DataSource dataSource) {
        return IntegrationFlows.from(Files.inboundAdapter(new File("/tmp/foo"))
                        .filter(new ChainFileListFilter<File>()
                                .addFilter(new RegexPatternFileListFilter(".*\\.csv"))
                                .addFilter(new FileSystemPersistentAcceptOnceFileListFilter(metadataStore(dataSource), "foo"))),
                e -> e.poller(Pollers.fixedDelay(10_000)))
                .log()
                .get();
    }

    @Bean
    public ConcurrentMetadataStore metadataStore(final DataSource dataSource) {
        return new JdbcMetadataStore(dataSource);
    }

}
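For completeness, this example assumes the spring-integration-jdbc module, which provides JdbcMetadataStore and ships the INT_METADATA_STORE schema scripts, is on the classpath, e.g. via Maven:

<dependency>
    <groupId>org.springframework.integration</groupId>
    <artifactId>spring-integration-jdbc</artifactId>
</dependency>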
and
$ touch /tmp/foo/foo.csv
...
$ touch /tmp/foo/bar.csv
and
2019-01-09 12:46:26.332 INFO 43329 --- [ask-scheduler-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=/tmp/foo/foo.csv, headers={file_originalFile=/tmp/foo/foo.csv, id=e0613529-a657-fbd3-5e67-8bb53a58b5ca, file_name=foo.csv, file_relativePath=foo.csv, timestamp=1547055986330}]
2019-01-09 12:47:26.487 INFO 43329 --- [ask-scheduler-5] o.s.integration.handler.LoggingHandler : GenericMessage [payload=/tmp/foo/bar.csv, headers={file_originalFile=/tmp/foo/bar.csv, id=4feb74b6-d711-f028-70c7-83cdfcd0aeec, file_name=bar.csv, file_relativePath=bar.csv, timestamp=1547056046487}]
and
mysql> select * from INT_METADATA_STORE;
+---------------------+----------------+---------+
| METADATA_KEY | METADATA_VALUE | REGION |
+---------------------+----------------+---------+
| foo/tmp/foo/bar.csv | 1547056039000 | DEFAULT |
| foo/tmp/foo/foo.csv | 1547055980000 | DEFAULT |
+---------------------+----------------+---------+
2 rows in set (0.00 sec)
I don't see the files again even after a restart, but if I change the date on one of the files...
$ touch /tmp/foo/bar.csv
and
2019-01-09 12:51:58.534 INFO 44430 --- [ask-scheduler-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=/tmp/foo/bar.csv, headers={file_originalFile=/tmp/foo/bar.csv, id=f92d6b36-c948-37cc-ca56-9ef28de336f2, file_name=bar.csv, file_relativePath=bar.csv, timestamp=1547056318532}]