Our web site allows administrators to select file sources and destinations for syncing. Each source will be an SFTP server and each destination an AWS S3 bucket. The web site creates queued jobs in SQL Server that are, at some point, picked up by the service.
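
For reference, here's the kind of pickup query I was picturing for the queue table. The table and column names (`dbo.SyncJobs`, `Status`, `JobId`) are placeholders, and `UPDLOCK`/`READPAST` is just a common SQL Server queue pattern, not anything we've committed to:

```
// Sketch of the job pickup I have in mind. UPDLOCK + READPAST lets several
// workers dequeue concurrently without blocking each other on the same row.
using System.Data.SqlClient;

public class JobQueue
{
    private readonly string _connectionString;

    public JobQueue(string connectionString)
    {
        _connectionString = connectionString;
    }

    // Atomically claims one queued job and returns its id, or null if none.
    public int? TryDequeueJobId()
    {
        const string sql =
            @"UPDATE TOP (1) dbo.SyncJobs WITH (UPDLOCK, READPAST)
              SET Status = 'Processing', StartedAt = SYSUTCDATETIME()
              OUTPUT inserted.JobId
              WHERE Status = 'Queued';";

        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            object result = cmd.ExecuteScalar(); // null when the queue is empty
            return result == null ? (int?)null : (int)result;
        }
    }
}
```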
My quandary is that there could be 10+ scheduled jobs, each with 250,000 or more files. Not only do I need to copy the files, I also need to parse a manifest file that contains metadata for each file and store that metadata in SQL. The files are relatively small, 25 MB or less in most cases. I want to avoid the service constantly running and never being able to "catch up".
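
Within a single job, I'm assuming I'd need to throttle the transfers rather than fire them all at once. A minimal sketch of what I mean, where `ManifestEntry` and `TransferFileAsync` are made-up stand-ins for the parsed manifest row and the SFTP-download/S3-upload step:

```
// Bounded-concurrency sketch for one job: cap simultaneous transfers so
// 250,000 small files don't all hit the network at once.
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

public class ManifestEntry
{
    public string RemotePath { get; set; }
}

public class SyncJobRunner
{
    private readonly SemaphoreSlim _throttle = new SemaphoreSlim(8); // tune per host

    public async Task RunJobAsync(IEnumerable<ManifestEntry> manifest)
    {
        var transfers = manifest.Select(async entry =>
        {
            await _throttle.WaitAsync();
            try
            {
                await TransferFileAsync(entry.RemotePath);
            }
            finally
            {
                _throttle.Release();
            }
        }).ToList();

        await Task.WhenAll(transfers);
        // The metadata rows would then go to SQL in batches (e.g. SqlBulkCopy)
        // rather than one INSERT per file.
    }

    private Task TransferFileAsync(string remotePath)
    {
        // Placeholder: download from SFTP, upload to S3.
        return Task.CompletedTask;
    }
}
```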
I'm thinking of creating a new thread for each job. No two jobs will be writing to the same location, so I don't think disk thrashing will be an issue. I'd appreciate any helpful advice on the best approach.
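
Concretely, I'm picturing something along these lines, though with a capped pool of worker tasks rather than a literal unbounded thread per job. `MaxConcurrentJobs` is a number I'd have to tune, and `ProcessJobAsync` is a placeholder for the copy + manifest work:

```
// Sketch of the service loop: a small fixed pool of workers, each running one
// job at a time, pulling from the JobQueue sketch above.
using System;
using System.Threading;
using System.Threading.Tasks;

public class SyncService
{
    private const int MaxConcurrentJobs = 4; // assumption: tune to the hardware

    public Task RunAsync(JobQueue queue, CancellationToken ct)
    {
        var workers = new Task[MaxConcurrentJobs];
        for (int i = 0; i < workers.Length; i++)
        {
            workers[i] = WorkerLoopAsync(queue, ct);
        }
        return Task.WhenAll(workers);
    }

    private async Task WorkerLoopAsync(JobQueue queue, CancellationToken ct)
    {
        while (!ct.IsCancellationRequested)
        {
            int? jobId = queue.TryDequeueJobId();
            if (jobId == null)
            {
                await Task.Delay(TimeSpan.FromSeconds(30), ct); // idle poll, no busy spin
                continue;
            }

            await ProcessJobAsync(jobId.Value, ct); // copy files + store manifest metadata
        }
    }

    private Task ProcessJobAsync(int jobId, CancellationToken ct)
    {
        return Task.CompletedTask; // placeholder
    }
}
```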
Also, I'm using Web API because we are exposing other related functionality to customers and internal applications.