I am helping a company with a trade-show demo by building them a website that displays a set of stats that's constantly being updated. They are concerned about running the site on their own servers since they already have so much going on with their current user load across various products. The thought is:
1. My Laravel site makes an API call to their server every 30 seconds.
2. Their server returns JSON with a timestamp of when it was last updated.
3. If it's new data, I save it in my DB and display the data from my DB.
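Roughly, I'm picturing a scheduled command like this sketch (the endpoint URL, model, and column names are placeholders, not the real API):

```php
<?php

// app/Console/Commands/FetchStats.php -- rough sketch; endpoint URL,
// model, and column names are placeholders.

namespace App\Console\Commands;

use App\Models\Stat;
use Illuminate\Console\Command;
use Illuminate\Support\Facades\Http;

class FetchStats extends Command
{
    protected $signature = 'stats:fetch';
    protected $description = 'Poll the client API and store new stats';

    public function handle(): void
    {
        $response = Http::timeout(10)->get('https://client.example.com/api/stats');

        if ($response->failed()) {
            return; // keep showing the last good data already in the DB
        }

        $payload = $response->json();

        // Only write when the payload is newer than what we already stored.
        $latest = Stat::max('source_updated_at');

        if ($latest === null || $payload['updated_at'] > $latest) {
            Stat::create([
                'data' => json_encode($payload['data']),
                'source_updated_at' => $payload['updated_at'],
            ]);
        }
    }
}
```

I'd schedule it with `$schedule->command('stats:fetch')->everyThirtySeconds();` (sub-minute schedules need Laravel 10+; on older versions a 30-second interval needs a cron or loop workaround).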
The site itself is very basic: two images and a table of the data.
I am using Laravel and Vue to handle everything.
Is it best to save the data in a DB or just as a JSON file in the app? We are looking at about 2k concurrent users for a few days. My original thought was a DigitalOcean droplet with 16 GB RAM, 6 vCPUs, 320 GB SSD, and 6 TB transfer.
Does that seem like enough? And does storing in the DB, versus a file, lower the risk of the server failing?
Sounds dramatically over-engineered for the number of users.
The number of users is somewhat academic, since what matters is how many requests you will serve per minute or per second (depending on how you want to manage it). For example, 2,000 users each polling every 30 seconds works out to roughly 67 requests per second.
If all those users are seeing the same view, then use Pusher or laravel-websockets to broadcast the updated data. That way you serve one single request every 30 seconds and broadcast the result to all users, for which any base-level VPS will be sufficient.
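A minimal broadcast event might look like this sketch (the channel name and payload shape are illustrative):

```php
<?php

// app/Events/StatsUpdated.php -- minimal sketch; channel name and
// payload shape are illustrative.

namespace App\Events;

use Illuminate\Broadcasting\Channel;
use Illuminate\Broadcasting\InteractsWithSockets;
use Illuminate\Contracts\Broadcasting\ShouldBroadcast;
use Illuminate\Foundation\Events\Dispatchable;

class StatsUpdated implements ShouldBroadcast
{
    use Dispatchable, InteractsWithSockets;

    // Public properties are automatically included in the broadcast payload.
    public function __construct(public array $stats)
    {
    }

    public function broadcastOn(): array
    {
        // One public channel shared by every visitor.
        return [new Channel('stats')];
    }
}
```

Your poller dispatches `StatsUpdated::dispatch($payload['data']);` whenever it sees fresh data, and the Vue side listens with Laravel Echo (`Echo.channel('stats').listen('StatsUpdated', ...)`), so the browsers never poll your server at all.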