Chunk or paginate, but you would be better off retrieving only the results you actually need rather than that many at one time, if that's possible.
Fetching 400,000 rows returns a 500 error.
I just started learning web dev with Laravel, and this is my first project.
I have a scores table with 400,000+ rows. When I try to fetch it with Score::all(), I get a 500 error after about 600ms.
How am I supposed to fetch that many MySQL rows in Laravel?
What do you hope to do with that much data in memory?
Here is what I want to achieve:
I have 3 models: ScoreType, Portfolio, and Score.
There are 10 ScoreTypes: A through J.
Each year there are about 100 portfolios, and each portfolio has about 1,000 scores.
Each score has a score type, which can be A-J.
I need to multiply weight by score for each score, then average the results where the score type is the same and score->portfolio->year is the same.
Ex:
scoretype = [
    ['id' => 1, 'desc' => 'some desc', 'type' => 'A'],
    ['id' => 2, 'desc' => 'some desc', 'type' => 'B'],
    .......
    ['id' => 10, 'desc' => 'some desc', 'type' => 'J'],
]
portfolio = [
    ['id' => 1, 'desc' => 'some desc', 'year' => 2017],
    ['id' => 2, 'desc' => 'some desc', 'year' => 2017],
    ['id' => 3, 'desc' => 'some desc', 'year' => 2017],
    .......
    ['id' => 100, 'desc' => 'some desc', 'year' => 2018],
    .......
    ['id' => 300, 'desc' => 'some desc', 'year' => 2019],
]
score = [
    ['id' => 1, 'score' => 90, 'weight' => 0.5, 'portfolio_id' => 1, 'scoretype_id' => 1],
    ['id' => 2, 'score' => 30, 'weight' => 0.1, 'portfolio_id' => 1, 'scoretype_id' => 1],
    ['id' => 3, 'score' => 20, 'weight' => 0.2, 'portfolio_id' => 1, 'scoretype_id' => 1],
    ['id' => 4, 'score' => 80, 'weight' => 0.3, 'portfolio_id' => 1, 'scoretype_id' => 2],
    ['id' => 5, 'score' => 70, 'weight' => 0.6, 'portfolio_id' => 1, 'scoretype_id' => 3],
    ['id' => 6, 'score' => 80, 'weight' => 0.4, 'portfolio_id' => 1, 'scoretype_id' => 10],
    ......
    ['id' => 299000, 'score' => 80, 'weight' => 0.4, 'portfolio_id' => 300, 'scoretype_id' => 8],
    ['id' => 300000, 'score' => 70, 'weight' => 0.6, 'portfolio_id' => 300, 'scoretype_id' => 5],
]
I need to get data like this:
ScoreTypePerformance = [
A => [2017 =>30, 2018=> 40, 2019 => 10],
B => [2017 =>40, 2018=> 60, 2019 => 20],
C => [2017 =>30, 2018=> 50, 2019 => 40],
........
J => [2017 =>20, 2018=> 40, 2019 => 60],
]
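The per-type, per-year weighted average described above can be sketched in plain PHP. This is a minimal sketch with the type and year already resolved onto each row (the real data would reach them through portfolio_id and scoretype_id), using only the sample values shown above:

```php
<?php
// Sample scores, flattened so each row carries its type and portfolio year.
$scores = [
    ['type' => 'A', 'year' => 2017, 'score' => 90, 'weight' => 0.5],
    ['type' => 'A', 'year' => 2017, 'score' => 30, 'weight' => 0.1],
    ['type' => 'A', 'year' => 2017, 'score' => 20, 'weight' => 0.2],
    ['type' => 'B', 'year' => 2017, 'score' => 80, 'weight' => 0.3],
];

// Accumulate weighted sums and total weights per (type, year) bucket.
$sums = [];
$weights = [];
foreach ($scores as $row) {
    $sums[$row['type']][$row['year']] =
        ($sums[$row['type']][$row['year']] ?? 0) + $row['score'] * $row['weight'];
    $weights[$row['type']][$row['year']] =
        ($weights[$row['type']][$row['year']] ?? 0) + $row['weight'];
}

// Divide to get the weighted average for each bucket.
$performance = [];
foreach ($sums as $type => $years) {
    foreach ($years as $year => $sum) {
        $performance[$type][$year] = round($sum / $weights[$type][$year], 2);
    }
}
// $performance['A'][2017] is (90*0.5 + 30*0.1 + 20*0.2) / 0.8 = 65.0
```

The same two-pass shape (accumulate, then divide) works no matter how the rows arrive, which is what makes it compatible with chunked fetching.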
I have working code for a small number of scores (5,000+), but when scores has 200,000+ rows I get the error. Here is how my code works:
- I fetch the portfolios and group them by year to get the year list:
yearlist = [2017, 2018, ...]
- then I loop over the score types and the year list, and fetch the scores for each score type and year:
$data = [];
foreach ($scoretype as $type) {
    foreach ($yearlist as $year) {
        // where() takes separate arguments here, not a flat array like ['year', '=', $year]
        $scores = Score::where('scoretype_id', '=', $type->id)
            ->whereHas('portfolio', function ($query) use ($year) {
                $query->where('year', '=', $year);
            })->get();
        $sum = 0;
        $weight = 0;
        foreach ($scores as $score) {
            $sum += $score->score * $score->weight;
            $weight += $score->weight;
        }
        $data[$type->id][$year] = $weight ? round($sum / $weight, 2) : 0;
    }
}
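Since each bucket is just a SUM divided by a SUM, the whole aggregation can be pushed into MySQL with one grouped query instead of one query (and thousands of hydrated models) per type-year pair. A sketch with Laravel's query builder, assuming the tables are named `scores`, `portfolios`, and `scoretypes` with the columns shown in the example data (adjust the names to your actual schema):

```php
// One grouped query; MySQL does the multiply/sum/divide and returns
// at most (10 types × number of years) rows, not 400k models.
$rows = DB::table('scores')
    ->join('portfolios', 'portfolios.id', '=', 'scores.portfolio_id')
    ->join('scoretypes', 'scoretypes.id', '=', 'scores.scoretype_id')
    ->select(
        'scoretypes.type',
        'portfolios.year',
        DB::raw('ROUND(SUM(scores.score * scores.weight) / SUM(scores.weight), 2) AS avg_score')
    )
    ->groupBy('scoretypes.type', 'portfolios.year')
    ->get();

// Reshape into the ScoreTypePerformance structure: [type => [year => avg]].
$data = [];
foreach ($rows as $row) {
    $data[$row->type][$row->year] = (float) $row->avg_score;
}
```

This keeps memory flat regardless of how many score rows exist, because PHP only ever sees the aggregated result set.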
Check your error log (storage/logs/laravel.log). It's probable that you ran out of memory. If you have a lot of entries in your log file, delete it, run the request again, and check the log so it only contains the errors for that one request.
If this is only for a large report, I would seriously consider dumping the data to a local machine. I used to use Visual FoxPro to further manipulate the data and build reports, but Visual FoxPro is no longer available; MS Access would be a good alternative. Just set up some stored procedures to dump the data overnight.
Use chunk to process the rows in batches.
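If you do need to touch every row in PHP, chunking keeps memory bounded by loading a fixed number of models at a time. A sketch, assuming the Score model defines `portfolio` and `scoretype` relations (the `portfolio` relation is implied by the whereHas above; `scoretype` is an assumption here):

```php
// Process 1,000 scores per batch instead of 400k at once; chunkById
// pages by primary key, which stays stable on large tables.
$sums = [];
$weights = [];
Score::with(['portfolio', 'scoretype'])
    ->chunkById(1000, function ($scores) use (&$sums, &$weights) {
        foreach ($scores as $score) {
            $type = $score->scoretype->type;
            $year = $score->portfolio->year;
            $sums[$type][$year]    = ($sums[$type][$year] ?? 0) + $score->score * $score->weight;
            $weights[$type][$year] = ($weights[$type][$year] ?? 0) + $score->weight;
        }
    });

// Final division pass, same shape as the original loop's round($sum / $weight, 2).
$data = [];
foreach ($sums as $type => $years) {
    foreach ($years as $year => $sum) {
        $data[$type][$year] = round($sum / $weights[$type][$year], 2);
    }
}
```

Note the accumulators are built across batches and divided once at the end; averaging each chunk separately and then averaging the averages would give a different (wrong) result.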