
Mfrancik's avatar

EXTREMELY slow server

I'm using DigitalOcean and Laravel Forge. My server has 6 vCPUs and 16 GB of RAM, and I have two sites on Forge tied to two separate droplets. One of my databases has approximately 8 million records in one of its tables (I've deleted a few million thinking it would help).

Some processes are heavy on the server. For example, I allow users to upload a CSV of contact information; looping over the CSV file and creating 'leads' with Eloquent currently creates about 100 records every 3 seconds. I have one page that returns the CSV files the user has uploaded and prints them in a table. If a file is processing, Blade queries the total uploaded so far, but even when pulling from just the uploads table (returning 100 or so results), the page can take up to 30 seconds to load. I run 6 workers in case all the users of the platform perform an action at once, though I rarely have more than 3 workers running tasks concurrently.
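As an aside on the import speed: calling Eloquent's `create()` once per CSV row issues one INSERT (plus model events) per record, which matches the ~100 rows per 3 seconds rate. A common fix is to buffer rows and bulk-insert them in chunks. A minimal sketch in plain PHP, assuming a CSV with a header row and a `Lead` model (the names and columns are assumptions, not the poster's actual schema):

```php
<?php
// Read a CSV lazily and yield rows in fixed-size chunks, so each chunk
// can become a single multi-row INSERT instead of hundreds of single ones.
function csvToChunks(string $path, int $chunkSize): \Generator
{
    $handle = fopen($path, 'r');
    $header = fgetcsv($handle);                  // first line = column names
    $buffer = [];
    while (($row = fgetcsv($handle)) !== false) {
        if ($row === [null]) {
            continue;                            // skip blank lines
        }
        $buffer[] = array_combine($header, $row);
        if (count($buffer) >= $chunkSize) {
            yield $buffer;
            $buffer = [];
        }
    }
    if ($buffer !== []) {
        yield $buffer;                           // flush the remainder
    }
    fclose($handle);
}

// In the queued job, each chunk would become one bulk insert, e.g.:
// foreach (csvToChunks($path, 500) as $chunk) {
//     Lead::insert($chunk);  // skips per-model events; far fewer round-trips
// }
```

Note that `insert()` bypasses Eloquent timestamps and model events, so the chunk arrays would need `created_at`/`updated_at` filled in manually if the table uses them.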

Not only are pages that run queries very slow, but even my home pages, which pull nothing from the database, do not load quickly.

When I look at DigitalOcean I notice there are times when my CPU is maxed out at 100% (not sure if this is normal).

I'm currently working on separating the two sites onto different Forge servers so I can at least pinpoint which one is causing the issues, but moving the database has become quite a task (it should be complete tomorrow).

Although this is a bit vague, if somebody can point me in the right direction so i can begin troubleshooting, that would be excellent.

0 likes
11 replies
bobbybouwmann's avatar

It sounds to me like your workers (processing the CSV) are taking a lot of CPU, and the whole server slows down because CPU usage is high. When you process this amount of data it's usually wise to split the work across multiple servers. The ideal setup might even be a separate web server, database server, and worker server. That means three servers, but if your workers are struggling you can either scale that server up or add another server that also reads from the queue.

Moving the database to a separate server will also bring down CPU usage in the end. Right now you read and write a lot to the database, and that takes time and memory; with the database on its own server you will probably gain some extra speed as well.

Does this help you in any way? Do you have more questions? Let me know!

1 like
Mfrancik's avatar

Thanks Bobby, I will split the program onto different servers for my workers if needed. But even when no workers are running and no files are being uploaded, it's taking forever. I have one page, for example:

$plan = Auth::user()->subscription()->stripe_plan;

$uploads = Upload::where('user_id', Auth::user()->id)
    ->orderBy('created_at', 'DESC')
    ->where('created_at', '>=', Carbon::now()->subDays(14))
    ->paginate(25);

$fileUploading = false;
foreach ($uploads as $upload) {
    if ($upload->uploading == true) {
        $fileUploading = true;
    }
}

return view('uploads.index')
    ->with('plan', $plan)
    ->with('fileUploading', $fileUploading)
    ->with('uploads', $uploads);

This currently returns 30 uploads, each of which gets queried in Blade ({{ $upload->leads()->count() }}), counting around 6k records per upload. This page is taking over 30 seconds to load with no workers or uploads running right now.
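Worth noting: a per-row `$upload->leads()->count()` in Blade issues one COUNT query per upload, a classic N+1 pattern. Eloquent's `withCount()` fetches all the counts as part of the main query instead. A sketch, assuming `Upload` has a `hasMany` relation named `leads` (relation name taken from the Blade snippet, not verified against the real code):

```php
$uploads = Upload::where('user_id', Auth::id())
    ->where('created_at', '>=', Carbon::now()->subDays(14))
    ->orderBy('created_at', 'DESC')
    ->withCount('leads')   // one aggregate subquery instead of one COUNT per row
    ->paginate(25);

// In Blade, use the generated attribute instead of a fresh query:
// {{ $upload->leads_count }}
```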

Yorki's avatar

Do you have proper indexing? Try dumping these queries as SQL strings and running them on the database to see how long they take to process; in my experience they are sometimes really slow. You use paginate(), which also runs a count query; consider simplePaginate() instead if you don't need the exact page count. Last but not least, you should make use of caching.
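Dumping the query to SQL, as suggested, can be done straight from the query builder. A sketch based on the uploads query from the earlier post:

```php
$query = Upload::where('user_id', Auth::id())
    ->where('created_at', '>=', Carbon::now()->subDays(14));

dd($query->toSql(), $query->getBindings());
// Then run the printed SQL with EXPLAIN in a database console to see
// whether an index is actually used, e.g.:
// EXPLAIN SELECT * FROM uploads WHERE user_id = ? AND created_at >= ?;
```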

1 like
Mfrancik's avatar

In this case the only indexed field would be user_id, which, since it is a foreign key, gets indexed, correct? I have one user whose uploads query returns 1 record, and even that takes 20 seconds to load.

I will use simplePaginate() instead and see if it helps.

In my .blade.php file I have a lot of Blade conditionals; could that drastically decrease performance?

click's avatar

Simple pagination helps A LOT when you have millions of records, in my own experience. The count that is executed every time you fetch a page kills performance with the LengthAwarePaginator. Millions of records in a table should not be a problem if your indexes are set correctly and you don't have any exotic queries.

I see you are sorting on 'created_at'; do you have an index on that? In general you should have an index on each field you search and/or sort on.
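Such an index can be added in a migration. A sketch, assuming the table is named `uploads` (the actual table name isn't shown in the thread); a composite index fits this query best because it covers both the equality filter and the range/sort column:

```php
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

Schema::table('uploads', function (Blueprint $table) {
    // Composite index matching the query shape: equality on user_id,
    // then range filter + ORDER BY on created_at.
    $table->index(['user_id', 'created_at']);
});
```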

How does your website perform on your local machine?

Just something small, but your foreach can be a nanosecond 'quicker':

$fileUploading = false;
foreach ($uploads as $upload) {
    if ($upload->uploading == true) {
        $fileUploading = true;
        break; // no need to continue the loop once one uploading file is found
    }
}
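The same check can also be written with collection methods; assuming Laravel's paginator (which exposes the underlying collection via `getCollection()`), `contains()` short-circuits on the first match just like the break above:

```php
$fileUploading = $uploads->getCollection()
    ->contains(fn ($upload) => $upload->uploading == true);
```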
1 like
Mfrancik's avatar

I'll add an index on created_at. Locally the site performs well; the slowdown was a sudden thing. I looked at the processes running on the server and nothing seems out of the ordinary. On DigitalOcean I can see that CPU usage isn't even maxed out, yet response time is still extremely slow.

I'll add the break statement.

Locally the site works just fine.

Mfrancik's avatar

Currently I have two sites on the server. I tried moving the problem site off the current server to a completely separate one. I pulled in the Clockwork package, and now whenever I try to pull to this same server I get the following:

Fetching origin
error: insufficient permission for adding an object to repository database .git/objects
fatal: failed to write object
fatal: unpack-objects failed
error: Could not fetch origin

Is this referring to a permissions issue with Forge accessing GitHub?

Mfrancik's avatar

For others:

Fixed by using a password to access GitHub. ^^

Mfrancik's avatar

Locally it is also quite slow. In Debugbar I see the queries that were run (0.75 seconds), but the page still took 20 seconds to load, using 275 MB of memory.

Where do I see what is happening on the application side?

Booting (953.86ms) Application (18.63s)
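When total query time (0.75s) is tiny next to application time (18.63s), the time is going to PHP-side work such as model hydration or view rendering rather than the database. barryvdh/laravel-debugbar can bracket suspect sections so they show up as named segments on its timeline; a sketch (the measure names are arbitrary):

```php
\Debugbar::startMeasure('uploads-query', 'Fetch uploads');
$uploads = Upload::where('user_id', Auth::id())->simplePaginate(25);
\Debugbar::stopMeasure('uploads-query');

\Debugbar::startMeasure('render', 'Render view');
$html = view('uploads.index', ['uploads' => $uploads])->render();
\Debugbar::stopMeasure('render');
```

Whichever segment dominates points at where the 18 seconds (and the 275 MB) is being spent.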
