@WR1 I completely understand. We load 90K records a day and have real-time calculations running on the back of that, which require a lot of CPU/memory. My gut is that your data footprint will be smaller, but to get an accurate idea of your cost you will want to build a prototype. Testing the ingestion and any calculation processes will give you some idea of the daily cost. There is no question that building w/ code would be cheaper on the infra side.
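
For what it's worth, the prototype doesn't have to be fancy. A throwaway script like the sketch below (Python just for illustration; `RECORDS_PER_DAY` and `process_record` are stand-ins for your real volume and calculation logic, not anything from a specific platform) will give you a first read on CPU time and memory per daily load before you commit to a stack:

```python
# Minimal sketch of an ingestion/calculation benchmark.
# process_record() is a placeholder for whatever per-record work you actually do.
import time
import tracemalloc

RECORDS_PER_DAY = 90_000  # swap in your own expected daily volume


def process_record(i: int) -> dict:
    # Stand-in for your real calculation step.
    return {"id": i, "value": i * 1.07, "flag": i % 3 == 0}


def main() -> None:
    tracemalloc.start()
    start = time.perf_counter()

    results = [process_record(i) for i in range(RECORDS_PER_DAY)]

    elapsed = time.perf_counter() - start
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()

    print(f"records processed : {len(results):,}")
    print(f"wall-clock time   : {elapsed:.2f} s")
    print(f"peak memory       : {peak / 1_048_576:.1f} MiB")


if __name__ == "__main__":
    main()
```

Run it with something close to your real payload shape and calculation, then map the time/memory numbers onto whatever instance sizes or pricing tiers you're comparing.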