I'm building an application that parses the contents of a CSV into a number of "readings". It then posts these readings to a REST API server, which adds them to a database.
So far so good (the above works fine). However, I've realised the server occasionally has connection problems (latency, or the API being down for whatever reason).
To deal with this, I plan to use Sidekiq to perform the transfers asynchronously. I have a few questions, though:
Am I best off creating an ActiveJob that performs the POST, and simply enqueuing one of these jobs as I parse the CSV? I assume that, if there are connection problems, the jobs will eventually be retried in the order they were added?
Do I need to tell Sidekiq to "retry" the job, or does that happen automatically?
Finally, since Sidekiq uses Redis, should persistence be enabled so that queued items aren't lost if the app server crashes while there are items in the queue?
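For reference, this is the Redis persistence setting I'd be enabling (a sketch using the standard `redis.conf` directives; the specific fsync policy shown is my assumption, not a requirement):

```
# redis.conf — enable append-only-file persistence so queued jobs
# survive a crash or restart of the Redis server
appendonly yes
appendfsync everysec   # fsync once per second (durability vs. performance trade-off)
```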
Is there anything else to consider, or a better way to approach this?
Answer (score: 0)
There are two scenarios:
1) You put the complete processing of the CSV file into one job. I don't think this will work well, because if the job fails it will loop over all the rows again, unless you mark processed rows by adding another column (say, `read`) to the CSV.
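The "mark the rows" idea from scenario 1 can be sketched as a filter that skips already-processed rows, so re-running the single job doesn't re-post everything. The method name, the `id` column, and the shape of `processed_ids` are my own illustrative assumptions:

```ruby
require "csv"

# Return only the rows whose "id" has not yet been processed.
# processed_ids would come from wherever you record successful posts
# (a DB table, a Redis set, or the extra CSV column mentioned above).
def unprocessed_rows(csv_text, processed_ids)
  CSV.parse(csv_text, headers: true)
     .map(&:to_h)
     .reject { |row| processed_ids.include?(row["id"]) }
end
```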
2) You add each row as a separate job in Sidekiq. The only downside I can see with this is that you may end up creating too many jobs if you have large CSV files, but it saves a lot of processing on the CSV side.
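A hypothetical middle ground between the two scenarios is to enqueue one job per *batch* of rows rather than one per row, which caps the job count for large files. The helper below only shows the batching step; the job class name in the comment is an assumption:

```ruby
require "csv"

# Split the parsed CSV into fixed-size batches of row hashes,
# each of which would become one Sidekiq job's argument.
def row_batches(csv_text, batch_size: 100)
  CSV.parse(csv_text, headers: true)
     .map(&:to_h)
     .each_slice(batch_size)
     .to_a
end

# Each batch could then be handed to something like
# ReadingBatchJob.perform_async(batch). Note that Sidekiq retries
# failed jobs automatically with exponential backoff, so transient
# API outages need no extra retry configuration.
```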