I need to trigger a flow that loads customer data from a SQL database into a MongoDB database. All current trades need to be moved to MongoDB. Once the migration of current trades to MongoDB is complete, a second flow should begin monitoring new trades and copying them to MongoDB. The goal is to trigger the flow that monitors new customer trades only after all current customer trades have been moved to MongoDB. The catch is that moving the current customer trades is triggered by a property in a properties file.
<flow name="migrateCurrentTrades">
<quartz:inbound-endpoint responseTimeout="10000" doc:name="Task"
cronExpression="0 */2 * * * ?" jobName="mailJob"
repeatInterval="1000000"
repeatCount="0">
<quartz:event-generator-job/>
</quartz:inbound-endpoint>
<flow-ref name="readAndSave"/>
<!-- This is actually a reference to a Groovy script that starts the
     monitorCustomerActivity flow below, which is deployed in the
     'stopped' initialState
-->
<script-ref name="startMonitorCustomerActivityFlow"/>
</flow>
<!-- long running job -->
<sub-flow name="readAndSave">
<db:select config-ref="mySQLConfig">
<db:parameterized-query><![CDATA[
SELECT * FROM CUSTOMER c WHERE status='ACTIVE'
]]>
</db:parameterized-query>
</db:select>
<custom-transformer class="com.gdc.CustomerTransformer" />
<!-- Save to MongoDB via queue -->
<vm:outbound-endpoint path="mongodb-queue" exchange-pattern="one-way"/>
</sub-flow>
<flow name="monitorCustomerActivity" initialState="stopped">
<quartz:inbound-endpoint responseTimeout="10000" doc:name="Task"
cronExpression="0 */45 * * * ?" jobName="mailJob" repeatInterval="0"
repeatCount="0">
<quartz:event-generator-job/>
</quartz:inbound-endpoint>
<db:select config-ref="mySQLConfig">
<db:parameterized-query><![CDATA[
SELECT * FROM CUSTOMER_TRADES c WHERE trade_status='NEW'
]]>
</db:parameterized-query>
</db:select>
<custom-transformer class="com.gdc.TradesTransformer" />
<!-- Save to MongoDB via queue -->
<vm:outbound-endpoint path="mongodb-queue" exchange-pattern="one-way"/>
</flow>
Unfortunately, the readAndSave flow keeps running repeatedly because it takes a long time to complete. I have set repeatInterval to a very high value and repeatCount to 0. I want the readAndSave flow to be triggered only once and to run to completion. Although it does start the second flow, monitorCustomerActivity, the two flows start interleaving with each other, causing errors. How can I ensure that the readAndSave flow is invoked only once and finishes saving the current customer trades before the second flow, monitorCustomerActivity, is invoked? I have been struggling with this for days.
Answer (score: 0)
Try the following: increase the timeout, keep repeatCount="0", and add a startDelay (stop and restart your Anypoint Studio / server; it will wait 20 seconds before starting the schedule). Note that the snippet below omits cronExpression: a cron trigger fires on every cron match, so repeatCount alone will not limit it to a single run.
Since it runs asynchronously, you can use a flow instead of a sub-flow.
<quartz:inbound-endpoint responseTimeout="60000" doc:name="Task"
    jobName="mailJob" repeatInterval="1000000" repeatCount="0"
    startDelay="20000">
    <quartz:event-generator-job/>
</quartz:inbound-endpoint>
etc.
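One point worth making explicit about the flow-versus-sub-flow advice: a sub-flow always executes synchronously on the caller's thread, whereas a flow invoked through flow-ref may run on its own asynchronous processing strategy. If readAndSave becomes a flow, forcing it synchronous keeps the caller blocked until the migration finishes, so anything placed after the flow-ref runs only afterwards. A minimal sketch, assuming Mule 3.x processing strategies:

```xml
<!-- Synchronous flow: the flow-ref in migrateCurrentTrades blocks
     until every processor in this flow has completed. -->
<flow name="readAndSave" processingStrategy="synchronous">
    <!-- same select / transform / VM publish processors as in the answer -->
</flow>
```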
Change the `readAndSave` sub-flow to a `flow`:
<flow name="readAndSave">
<db:select config-ref="mySQLConfig" doc:name="Database">
<db:parameterized-query><![CDATA[
SELECT * FROM CUSTOMER c WHERE status='ACTIVE'
]]>
</db:parameterized-query>
</db:select>
<logger message="readAndSave: check how many times the flow gets triggered by looking at this log entry" />
etc.
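The answer stops short of showing how the stopped monitorCustomerActivity flow is actually started once the migration completes. A minimal sketch of the Groovy scripting component referenced as startMonitorCustomerActivityFlow in the question, assuming Mule 3.x (where `MuleRegistry.lookupFlowConstruct` returns the flow, which implements Startable); it would be placed as the last processor of readAndSave:

```xml
<scripting:component doc:name="startMonitorCustomerActivityFlow">
    <scripting:script engine="groovy"><![CDATA[
        // Look up the flow deployed with initialState="stopped" and
        // start it. Because a flow executes its processors in sequence,
        // this runs only after the migration processors have completed.
        muleContext.registry.lookupFlowConstruct('monitorCustomerActivity').start()
        return payload
    ]]></scripting:script>
</scripting:component>
```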