
Adailton de Jesus Cerqueira Junior:

Is there a limit to how many rows Laravel can read?

I have a question about working with a large xlsx file. Is there a limit to how many rows Laravel can read?

I have a file with more than 800k rows, and even when reading in chunks I get an out-of-memory or timeout error.

Braunson:

Generally it's limited by your server's hardware specs. You either need to optimize the code/query, or get more memory and adjust the PHP configuration.
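For reference, these are the PHP settings typically involved. The values below are illustrative only, not from the thread; they should be tuned to the server's actual RAM:

```php
<?php
// Hypothetical one-off import script settings (values are examples).
ini_set('memory_limit', '1024M'); // default is often 128M, far too low for 800k rows
set_time_limit(0);                // disable the execution time limit (safe for CLI jobs)
```

Web server and PHP-FPM timeouts (e.g. `max_execution_time`, nginx `fastcgi_read_timeout`) would still apply to requests, which is one reason to move such work off the request cycle entirely.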

JussiMannisto:

Are you processing the Excel file in a background job/command, or are you trying to do it during a request? Which Excel package are you using?
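If the processing is happening during a request, one common fix is moving it to a queued job so the web request only stores the file and dispatches. A minimal sketch, assuming standard Laravel queues; the class name, timeout, and path handling are illustrative:

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

// Hypothetical job class: parse the spreadsheet outside the request cycle,
// where request timeouts no longer apply.
class ImportSpreadsheet implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public int $timeout = 3600; // allow up to an hour for the worker

    public function __construct(private string $path)
    {
    }

    public function handle(): void
    {
        // ... open $this->path and process it row by row here ...
    }
}

// From a controller, after storing the upload:
// ImportSpreadsheet::dispatch($storedPath);
```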

JussiMannisto:

@Adailton de Jesus Cerqueira Junior PhpSpreadsheet holds the whole document in memory, which takes a lot of RAM. If you can't solve the memory issues, you might want to check this package. It's no longer actively maintained, but it should still work.

JussiMannisto:

@Adailton de Jesus Cerqueira Junior I don't think the code will expire soon, but it's up to you to evaluate if it's safe. I used that library at one point and it was really memory efficient compared to others I could find.

But it didn't do everything I wanted, so I wrote my own Excel parser that I've been using for the last couple of years. Excel files (and .ods) are just a bunch of XML files zipped together. If you're only interested in the data, it can be read very efficiently.
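To illustrate the approach described above (not JussiMannisto's actual parser): an .xlsx is a ZIP archive, and the first worksheet conventionally lives at `xl/worksheets/sheet1.xml`, so it can be streamed with `XMLReader` one row at a time. The file name is an assumption, and a real parser would also resolve cell text from the shared-strings table (`xl/sharedStrings.xml`):

```php
<?php
// Sketch: stream rows out of an .xlsx without loading the whole sheet into memory.
// Assumes the first sheet is at xl/worksheets/sheet1.xml inside large.xlsx.

$reader = new XMLReader();
$reader->open('zip://large.xlsx#xl/worksheets/sheet1.xml');

while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->localName === 'row') {
        // readOuterXML() materializes only this single <row>, not the sheet,
        // so memory stays flat no matter how many rows the file has.
        $row = simplexml_load_string($reader->readOuterXML());
        // ... process the row's <c> (cell) children, then let it be freed ...
    }
}

$reader->close();
```

The key point is that memory usage is bounded by the largest single row, not by the file size, which is why this scales to hundreds of thousands of rows.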

hupp:

@Adailton de Jesus Cerqueira Junior Here is a very popular package that works very well with Excel; you should try it. If you're still getting memory errors with chunking, decrease the chunk size or increase the server's memory limit, and also raise the server's timeout configuration. Let me know if you run into any issues. Excel import/export package: https://docs.laravel-excel.com/3.1/imports/chunk-reading.html
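Chunk reading with that package (maatwebsite/excel 3.1, per the linked docs) looks roughly like this. The import class, model, and sizes below are illustrative, not from the thread:

```php
<?php

namespace App\Imports;

use App\Models\Record; // hypothetical Eloquent model
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class RecordsImport implements ToModel, WithChunkReading, WithBatchInserts
{
    public function model(array $row)
    {
        // Map one spreadsheet row to one model instance.
        return new Record(['value' => $row[0]]);
    }

    public function chunkSize(): int
    {
        return 500; // smaller chunks lower peak memory at the cost of more passes
    }

    public function batchSize(): int
    {
        return 500; // insert rows in batches instead of one query per row
    }
}

// Usage: Excel::import(new RecordsImport, 'large.xlsx');
```

Pairing `WithChunkReading` with `WithBatchInserts` matters for 800k rows: chunking bounds memory, while batching keeps the database round-trips from becoming the new bottleneck.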

nexxai:

The only physical limit is RAM, so if you're maxing out the RAM in the server doing the processing, do what @hupp suggested and chunk the reads into more manageable pieces.

