I currently have a fairly large batch job that analyzes transactions and purchase data and applies dynamic pricing accordingly. It runs twice daily and takes around 2 hours per run. I am unsure which AWS service I should use to deploy this logic.
I have explored multiple options and narrowed it down to the following:
- Lambda functions: easy to set up, but limited to a 15-minute runtime --> hence excluded.
- Step Functions: I can split my logic into multiple dependent steps, in my case each calling a Lambda function that does part of the work, and build a workflow out of them.
- ECS with Fargate: I can containerize my whole logic in a single image on ECR, define an ECS task, and schedule it to run as desired, while Fargate provisions the needed compute and manages the underlying infrastructure for me.
- AWS Batch: manage my containerized logic as Batch jobs running on the desired schedule.

I am unsure whether to go with Step Functions, ECS, or Batch. Which would work better in my use case in terms of efficiency, pricing, ease of setup, and complexity? And more generally, when should I reach for each of them, or even combine some of them? I would appreciate your help.
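For context, the Step Functions option I have in mind would look roughly like this, a minimal Amazon States Language sketch where the state names and Lambda ARNs are placeholders, not a working setup:

```json
{
  "Comment": "Dynamic pricing batch split into dependent Lambda steps (placeholder ARNs)",
  "StartAt": "LoadTransactions",
  "States": {
    "LoadTransactions": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:us-east-1:123456789012:function:load-transactions",
      "Next": "AnalyzePurchases"
    },
    "AnalyzePurchases": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:us-east-1:123456789012:function:analyze-purchases",
      "Next": "ApplyPricing"
    },
    "ApplyPricing": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:us-east-1:123456789012:function:apply-pricing",
      "End": true
    }
  }
}
```

My concern with this route is that each individual Lambda step still has the 15-minute cap, so I would have to make sure every step finishes within that.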
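And the ECS/Fargate option would be something like this task definition (again a sketch with placeholder names, image URI, and role ARN), scheduled via an EventBridge rule with a cron expression such as `cron(0 6,18 * * ? *)` for twice-daily runs:

```json
{
  "family": "pricing-batch",
  "requiresCompatibilities": ["FARGATE"],
  "networkMode": "awsvpc",
  "cpu": "1024",
  "memory": "4096",
  "executionRoleArn": "arn:aws:iam::123456789012:role/ecsTaskExecutionRole",
  "containerDefinitions": [
    {
      "name": "pricing-batch",
      "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/pricing-batch:latest",
      "essential": true,
      "command": ["python", "run_pricing.py"]
    }
  ]
}
```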
P.S.: I also tried automating the script through EventBridge rules by uploading it to S3 and triggering an EventBridge rule on a schedule to run it, but I ran into dependency issues with certain libraries I am using, which is why I started considering containerization.
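The containerization I have in mind is basically just baking the problematic libraries into the image, roughly like this (file names and the entry script are placeholders for my actual job):

```dockerfile
FROM python:3.11-slim
WORKDIR /app
# Pin the exact library versions that were causing dependency issues
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY pricing_job.py .
CMD ["python", "pricing_job.py"]
```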