This is a very broad topic and I'm kinda looking for advice.
I have an API with a Transfer model and the following associations:
Transfer
  attributes :start_at, :end_at
  belongs_to :start_location
  belongs_to :end_location
  belongs_to :person
On the frontend I want to display a paginated list with all transfers.
Start at | End at | From    | To      | Person
16:12    | 16:30  | Street1 | Street2 | John Doe
18:45    | 19:12  | StreetA | StreetB | Jane Doe
I go to page 1, fetch the records with their associations, and display them; then page 2, page 3, and so on.
It's simple and it works perfectly fine.
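For reference, the naive flow can be sketched in a few lines of TypeScript. The endpoint path, response shape, and `fetchJson` helper are all assumptions here; the HTTP client is injected so the logic is easy to fake in tests:

```typescript
// Naive flow: one request per page; the response already embeds the
// associated person and locations, so a page can be rendered directly.
// fetchJson is a hypothetical, injected HTTP client.
type TransferRow = {
  id: number;
  start_at: string;
  end_at: string;
  from: string;
  to: string;
  person: string;
};

async function loadPage(
  page: number,
  fetchJson: (url: string) => Promise<{ transfers: TransferRow[] }>
): Promise<TransferRow[]> {
  // Every page change triggers a full fetch; nothing is cached client-side.
  const body = await fetchJson(`/api/transfers?page=${page}`);
  return body.transfers;
}
```

The upside is that there is no client-side state to invalidate: whatever the server returns is what you render.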
But I can also do it differently.
When I go to page 1, instead of fetching the full data, I fetch only ids. So my JSON looks something like this:
{
  "transfers": [
    { "id": 1, "person_id": 44, "start_location_id": 152, "end_location_id": 156 },
    { "id": 2, "person_id": 44, "start_location_id": 152, "end_location_id": 156 },
    (...)
  ]
}
In the next step I fetch the full data for those ids and save it in some global store.
Now when I paginate I still fetch the ids for each page, but instead of re-fetching the data over and over, I first hit my store and check whether I can grab the records from there. Obviously I sometimes need to lazy-load missing records, and I'd also need websockets to keep the stored records up to date. That's actually a lot more work.
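The store part of that idea can be sketched as a small normalized cache keyed by id. This is a minimal illustration, not a real library; the `Transfer` shape matches the JSON above, and the method names are my own invention:

```typescript
// Minimal normalized client-side store: records are cached by id,
// pages are just lists of ids resolved against the cache.
type Transfer = {
  id: number;
  person_id: number;
  start_location_id: number;
  end_location_id: number;
};

class TransferStore {
  private byId = new Map<number, Transfer>();

  // Which of these ids are not cached yet and need a fetch (lazy loading)?
  missingIds(ids: number[]): number[] {
    return ids.filter((id) => !this.byId.has(id));
  }

  // Merge freshly fetched records; also the entry point for websocket updates.
  upsert(records: Transfer[]): void {
    for (const r of records) this.byId.set(r.id, r);
  }

  // Resolve a page of ids against the cache; undefined entries are still loading.
  getPage(ids: number[]): (Transfer | undefined)[] {
    return ids.map((id) => this.byId.get(id));
  }
}
```

Paginating then becomes: fetch the ids for the page, ask the store for `missingIds`, fetch only those records, `upsert` them, and render `getPage`. Revisiting a page whose records are all cached costs only the cheap ids request.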
I don't have any requirements from my client; it's just something I've been thinking about recently.
This solution could potentially save some bandwidth, but my question is... is it worth it?
There are websites that serve thousands of requests and render HTML every time, so should I really care?
Are you guys doing something similar?
THANK YOU!
TL;DR: What frontend data-caching strategies are you using, and are they worth the effort?