
Mfrancik

CSV upload

I allow the user to map the fields in the CSV file to my database fields, then I loop through the file and create a record for each row with Eloquent. It takes quite a bit of time, but I want to use Eloquent rather than importing directly via SQL.

thoughts? suggestions?

while (!feof($file)) {
    $lead_value = fgetcsv($file);
    $lead = new Lead();
    $nullCount = 0;
    $failBool = false;

    foreach ($fields as $key => $field_value) {
        // lead validation
        if ($request["$field_value"] != "null") {
            if (is_array($lead_value) && array_key_exists((int) $request["$field_value"], $lead_value)) {
                $csv_value = $lead_value[(int) $request["$field_value"]];

                if ($field_value == "billing_state" && strlen($csv_value) >= 2) {
                    $lead["$field_value"] = $this->setStateAbbr($csv_value);
                } elseif ($field_value == "phone") {
                    // strlen(), not an (int) cast: casting "5551234567" to int
                    // yields the number itself, not its digit count
                    if (strlen($csv_value) >= 10 && is_numeric($csv_value)) {
                        $lead["$field_value"] = $csv_value;
                    } else {
                        $lead["$field_value"] = null;
                        $failBool = true;
                    }
                } elseif ($field_value == "billing_state") {
                    // billing_state shorter than 2 characters is unusable
                    $lead["$field_value"] = null;
                    $failBool = true;
                } else {
                    $lead["$field_value"] = $csv_value;
                }
            } else {
                $nullCount++;
            }
        } else {
            $nullCount++;
        }
    }

    if ($failBool) {
        $failure++;
        continue;
    }

    $lead->upload_id = $upload->id;
    $lead->user_id = $user_id;
    $lead->waiting_to_check = false;

    if (isset($lead->phone) && is_numeric($lead->phone) && strlen($lead->phone) == 10) {
        try {
            $lead->save();
            $success++; // count success only after save() actually succeeds
        } catch (\Exception $e) {
            $failure++;
        }
    } else {
        // the model was never saved, so there is nothing to delete()
        $failure++;
    }
}
jlrdw

If it works, why not? At least you can do custom checks rather than a direct import. That's my thinking.

Mfrancik

The purpose of the app is to upload phone numbers; I then make API requests to update relevant info on those numbers. Typically the users are going to upload files of 25k+ rows, and I'd like to speed the process up if possible. Right now it's creating ~300 leads every 7 seconds (I'm updating the user every 7 seconds, which is how I know this), which is incredibly slow.

jlrdw

The purpose of the app is to upload phone numbers

I have never heard of this; what's going on with phone numbers that is legit?

Just curious. Every "Dateline" or "20/20" story I see involving phone numbers is some sort of scam.

I don't like riff-raff.

But if it's legit and you're dealing with large amounts of data, look at things like:

  • load balancing
  • increased resources
  • a custom app for users that readies the upload

For example, many online backup services have apps to interact with their online storage, etc.

Sorry, it just sounds fishy.

Mfrancik

It's a compliance tool that tells you which phone numbers have requested NOT to be contacted, keeping marketing firms compliant.

I will look at the three options you have listed. Thank you!

Mfrancik

So the code overall is very quick, but the save() on the model is what takes so long. It's taking about 0.09 seconds per record, which on 50k records is VERY slow. Is there a way I can speed this up?
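One likely culprit is that each save() runs as its own statement and its own implicit transaction. As a rough illustration only (plain PDO with an in-memory SQLite database and invented table/column names, not the app's real schema), batching many inserts inside a single transaction looks like this:

```php
<?php
// Sketch: one transaction around many inserts instead of one per row.
// SQLite in-memory stands in for the real database.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE leads (id INTEGER PRIMARY KEY, phone TEXT)');

$rows = [['5551230001'], ['5551230002'], ['5551230003']];

$pdo->beginTransaction();
$stmt = $pdo->prepare('INSERT INTO leads (phone) VALUES (?)');
foreach ($rows as $row) {
    $stmt->execute($row); // reuses one prepared statement per row
}
$pdo->commit();           // a single commit for the whole batch

echo $pdo->query('SELECT COUNT(*) FROM leads')->fetchColumn(); // 3
```

In Laravel terms this roughly corresponds to wrapping the import loop in `DB::transaction(function () { ... })`, so the per-row commit overhead is paid once per batch instead of once per save().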

Snapey

If you have any fields that are unique columns, make sure they are indexed. For instance, you might be checking that you are not duplicating the phone number; indexing that column will significantly speed it up.
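The effect of such an index is visible directly in a query plan. A sketch, again using in-memory SQLite and an invented index name purely for illustration (in a Laravel migration this would be something like `$table->unique('phone')`):

```php
<?php
// Sketch: an index turns a full-table SCAN into an index SEARCH.
// SQLite in-memory for illustration; the real schema will differ.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE leads (id INTEGER PRIMARY KEY, phone TEXT)');
$pdo->exec('CREATE UNIQUE INDEX leads_phone_unique ON leads (phone)');

$plan = $pdo->query(
    "EXPLAIN QUERY PLAN SELECT id FROM leads WHERE phone = '5551234567'"
)->fetch(PDO::FETCH_ASSOC);

// the plan mentions leads_phone_unique rather than a table scan
echo $plan['detail'];
```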

Mfrancik

I don't do any duplicate checking at the moment, but if I do create a unique field I will be sure to index it. Currently no fields in this table are indexed aside from the foreign keys by default.

Cronix
Best Answer

One way to speed it up is to do bulk inserts. Right now you're inserting each record individually; I'd insert 50 at a time (or more). You'd have to restructure all of that code to do it, but it would be a lot faster.

Lead::insert([ // query-builder insert(); create() only accepts a single row
    [array of lead1 data],
    [array of lead2 data],
    // etc
]);
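Two caveats on the bulk approach: Eloquent's `create()` only accepts one row of attributes, so multi-row inserts go through the query builder's `insert()`, which skips timestamps, mutators, and model events; and chunking keeps each statement a sane size. As a plain-PDO sketch of the multi-row form (in-memory SQLite, made-up data, illustrative only):

```php
<?php
// Sketch: multi-row INSERTs in chunks of 50 rows per statement.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE leads (id INTEGER PRIMARY KEY, phone TEXT)');

// 120 fake rows; in the real app these would come from fgetcsv()
$rows = array_map(fn ($i) => [sprintf('555%07d', $i)], range(1, 120));

foreach (array_chunk($rows, 50) as $chunk) {
    // one "(?)" placeholder group per row in this chunk
    $placeholders = implode(',', array_fill(0, count($chunk), '(?)'));
    $stmt = $pdo->prepare("INSERT INTO leads (phone) VALUES $placeholders");
    $stmt->execute(array_merge(...$chunk)); // flatten rows into one bound list
}

echo $pdo->query('SELECT COUNT(*) FROM leads')->fetchColumn(); // 120
```

In Laravel this roughly maps to `foreach (array_chunk($rows, 50) as $chunk) { Lead::insert($chunk); }`, trading one round trip per row for one per 50 rows.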
