@kokoshneta @snapey
Upserts and prepared statements are not possible in the scenario I'm working on.
Let me be clearer about this; perhaps you can point me in a direction I'm not currently looking at, simply because I don't know all the framework's features yet.
I have a model of elements. When a customer opens a specific element from their account, Guzzle performs a GET request to an external API endpoint that returns specifics about that item (max price, min price, amount sold, etc.), and the model is updated with the new data. The catch is that not every customer has every item, and items are only updated when a customer actually opens them, so if no one opens item 434343, I cannot show updated information about it.

That's where the scheduled job comes in. CustomFacade::getElements() retrieves a JSON list of all elements currently in the external API. Those elements are compared to the database, and any database element that is no longer in the JSON is removed (for example, elements a user hasn't opened in a while that are no longer available in the API but are still stored locally).

The elements that are present in the JSON stay in the database, and for each of them I need to make a request to another endpoint of the same API to get specifics that the first request doesn't include (the first request only gives me the list of elements; the second gives me max price, min price, amount sold in the last 24 hours, and max items in the same order, for the specific element I ask about).

After retrieving that data, I compare it against the database: updated_at older than 72 hours, or current amount sold in the last 24 hours > 0 and greater than the last recorded amount, or current max price > last max price, or current min price < last min price, and so on. All those checks happen before updating/inserting the model, because if no condition is true, no changes are needed and the model is not updated.

This way, CustomFacade lets me update/delete elements even when users don't open them, and keeps my database reasonably in sync with the external API. For example, if user 38 opened item 1499 yesterday and my job runs every Monday, item 1499 won't be updated because it was updated less than 72 hours ago.
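In code, the prune step of that flow looks roughly like this. This is a minimal sketch: only CustomFacade::getElements() is real; the `App\Facades` namespace, the `id` field in the JSON, the `external_id` column, and the chunk size are assumptions on my part.

```php
<?php

// Sketch of the prune step. Only CustomFacade::getElements() exists in my
// app; the namespaces and column/field names here are illustrative.

use App\Models\Element;
use App\Facades\CustomFacade;

// First request: the full list of elements currently known to the API.
$apiIds = collect(CustomFacade::getElements())->pluck('id')->all();

// Remove local rows that are no longer present in the API list.
Element::whereNotIn('external_id', $apiIds)->delete();

// Everything that survived is a candidate for the per-element detail
// request and the conditional update.
Element::whereIn('external_id', $apiIds)->chunkById(500, function ($elements) {
    foreach ($elements as $element) {
        // second API request + comparison go here, see the next sketch
    }
});
```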
So, basically, the flow is: retrieve the database records and the API list, compare them, remove the unused elements, request more API data for what remains, compare it to the remaining local data, update only where necessary, and then save.
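The "update only where necessary" part is roughly the check below. Another sketch: CustomFacade::getElementDetails() is a hypothetical stand-in for the second endpoint, and the field names (sold_24h, max_price, min_price) are illustrative.

```php
<?php

// Sketch of the conditional update. getElementDetails() is a hypothetical
// stand-in for the second API endpoint; column names are illustrative.

use App\Models\Element;
use App\Facades\CustomFacade; // namespace assumed

function refreshElement(Element $element): void
{
    $fresh = CustomFacade::getElementDetails($element->external_id);

    $needsUpdate =
        $element->updated_at->lt(now()->subHours(72))
        || ($fresh['sold_24h'] > 0 && $fresh['sold_24h'] > $element->sold_24h)
        || $fresh['max_price'] > $element->max_price
        || $fresh['min_price'] < $element->min_price;

    if (! $needsUpdate) {
        // No condition is true (e.g. item 1499, refreshed yesterday with
        // unchanged numbers), so no write is needed.
        return;
    }

    $element->fill([
        'max_price' => $fresh['max_price'],
        'min_price' => $fresh['min_price'],
        'sold_24h'  => $fresh['sold_24h'],
    ])->save();
}
```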
I am thinking about splitting the elements into daily batches of around 50k items, but I don't know which is better: send the full array to a single job and iterate inside it, or iterate outside the job and dispatch one job per element (50k jobs).
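For reference, a middle ground between those two would look something like the sketch below, assuming Laravel 8+ for Bus::batch(); RefreshElementsChunk is a hypothetical job (using the Batchable trait) that runs the refresh logic for each id it receives.

```php
<?php

// Sketch of a middle ground: instead of one job with 50k elements or 50k
// single-element jobs, dispatch one job per chunk of ids. Assumes Laravel 8+
// (Bus::batch) and a hypothetical Batchable RefreshElementsChunk job.

use App\Jobs\RefreshElementsChunk;
use App\Models\Element;
use Illuminate\Support\Facades\Bus;

$jobs = Element::pluck('external_id')
    ->chunk(500)                                  // ~100 jobs for 50k items
    ->map(fn ($ids) => new RefreshElementsChunk($ids->values()->all()))
    ->all();

Bus::batch($jobs)->name('refresh-elements')->dispatch();
```

My rough reasoning: one giant job risks timeouts and loses all progress on a single failure, while 50k single-element jobs mean 50k queue entries; chunked jobs would keep retries reasonably granular without flooding the queue. But I'd like to hear whether that's the right trade-off.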
Thanks for your comments. This question probably sounds basic, but I'm quite new to the framework, and it has so many tools that it's hard for a newbie to find the specific one to get things done in the best way possible.