
How to prevent concurrent handling of requests sent by same user

Posted 2 months ago by f0x_mulder

I have the following controller method, which should create a new order if and only if the previous order was closed (an open order has status = 0, a closed order has status = 1).

public function createOrder(Request $req){
	// some validation stuff
	$last_active = Orders::where('user_id', $this->user->id)->where('status', 0)->orderBy('id', 'desc')->first();
	if ($last_active){
		return ['status' => 'error'];
	}
	$order = Orders::create([
		'status' => 0,
		// some details
	]);
	return ['status' => 'success'];
}

This method is bound to a particular route:

Route::post('/create', 'OrderController@createOrder');

The client calls this route via AJAX. The logic behind it is super simple: I want each user to have only one active order at a time, so the user must take some actions to close the previous order before creating a new one. This works perfectly for a normal user, but not for a user who wants to harm my application. Here is the problem: a user can send tons of such requests per second (I do it from the Google Chrome dev console with the following script):

for (var i = 0; i < 20; i++)
	setTimeout(function(){
		$.ajax({
			url: '/create',
			type: 'post',
			success: function(d){
				console.log(d);
			}
		});
	}, 1);

This results in multiple records with status = 0 being inserted into the database, when only one should be inserted and the rest rejected. What I think happens is:

  1. Many requests hit the webserver (nginx in my case).
  2. The webserver spawns many PHP processes (via php-fpm in my case).
  3. Multiple PHP processes run the method at the same time and all pass the if ($last_active){...} check before any record has been inserted by another process, resulting in multiple records being inserted (roughly the interleaving sketched below).
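
Roughly the interleaving I imagine for two concurrent requests A and B (simplified, X is the user id):

// A: SELECT ... WHERE user_id = X AND status = 0  -> no open order found
// B: SELECT ... WHERE user_id = X AND status = 0  -> no open order found (A has not inserted yet)
// A: INSERT INTO orders ... status = 0            -> first "open" order
// B: INSERT INTO orders ... status = 0            -> second "open" order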

What I tried to fix this:

  1. On the nginx side, I limit the request rate (10 r/s). This does not help much, because it still allows 10 requests to be sent very quickly, with a tiny delay between them, before the rest get rejected. I cannot set the rate limit lower than 10 r/s because that would hurt normal users. (A Laravel-side sketch of the same idea is just below.)
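
(Aside, and the limit values here are only illustrative: Laravel's built-in throttle middleware would be the framework-side way to do the same thing, but as far as I can tell it has exactly the same limitation, since it only limits the request rate and does not make the check-and-insert atomic.)

// Illustrative only: at most 10 requests per minute to this route per client.
Route::post('/create', 'OrderController@createOrder')->middleware('throttle:10,1');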
  2. On the Laravel side, I tried wrapping the logic in a transaction:
public function createOrder(Request $req){
	// some validation stuff
	DB::beginTransaction();
	try{
		$last_active = Orders::where('user_id', $this->user->id)->where('status', 0)->orderBy('id', 'desc')->first();
		if ($last_active){
			DB::rollBack(); // I don't think I even need this
			return ['status' => 'error'];
		}
		$order = Orders::create([
			'status' => 0,
			// some details
		]);
		DB::commit();
	}
	catch (\Exception $e){
		DB::rollBack();
		return ['status' => 'error'];
	}
	return ['status' => 'success'];
}
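
One variation I have been wondering about, but have not tested properly, is taking a pessimistic lock on the user's row inside the transaction, so that two requests for the same user cannot run the check at the same time. A rough sketch (the 'users' table name is just my guess at the Laravel default, and I am not sure this is the idiomatic way):

public function createOrder(Request $req){
	// some validation stuff
	try{
		return DB::transaction(function () {
			// Lock the current user's row; a concurrent request for the same
			// user waits here until this transaction commits or rolls back.
			DB::table('users')->where('id', $this->user->id)->lockForUpdate()->first();

			$last_active = Orders::where('user_id', $this->user->id)->where('status', 0)->orderBy('id', 'desc')->first();
			if ($last_active){
				return ['status' => 'error'];
			}

			$order = Orders::create([
				'status' => 0,
				// some details
			]);

			return ['status' => 'success'];
		});
	}
	catch (\Exception $e){
		return ['status' => 'error'];
	}
}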

Using the plain transaction (without any explicit locking) significantly reduces the number of rows inserted, and often it even works as intended, allowing only one row to be inserted, but not always.

  3. I also tried something odd:

public function createOrder(Request $req){
	if (Cache::has('action.' . $this->user->id)) return ['status' => 'error'];
	Cache::put('action.' . $this->user->id, '', 0.5);
	// some validation stuff
	$last_active = Orders::where('user_id', $this->user->id)->where('status', 0)->orderBy('id', 'desc')->first();
	if ($last_active){
		Cache::forget('action.' . $this->user->id);
		return ['status' => 'error'];
	}
	$order = Orders::create([
		'status' => 0,
		// some details
	]);
	Cache::forget('action.' . $this->user->id);
	return ['status' => 'success'];
}

This kind of works in many cases, especially combined with the transaction, but it sometimes still allows up to 2 rows to be inserted (in 1-2 cases out of 30). And it also looks weird to me. I thought about queues, but as the Laravel docs state, they are intended for time-consuming tasks. I also thought about table locking, but that too seems weird and would hurt performance for normal users.

I believe there is a clean and simple solution to this problem, but I cannot find anything reasonable on Google. Maybe I am missing something super obvious? Can you help, please? Also, there are so many similar cases in my application that I would really like to find a general solution for situations where concurrent execution causes such bugs, not only with the database but also with sessions, the cache, Redis, etc.
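
In case it helps to show the kind of general mechanism I am imagining: wrapping the whole check-and-insert in one of Laravel's atomic cache locks (Cache::lock()) instead of my manual Cache::has()/Cache::put() pair, which clearly is not atomic. This is only a rough, untested sketch and assumes a recent Laravel version with atomic locks and a cache driver that supports them (redis or memcached, for example):

public function createOrder(Request $req){
	// some validation stuff

	// One lock per user; it auto-expires after 5 seconds in case something dies.
	$lock = Cache::lock('create-order.' . $this->user->id, 5);

	try{
		// Wait up to 2 seconds for the lock, so only one request per user
		// can run the check-and-insert at a time.
		return $lock->block(2, function () {
			$last_active = Orders::where('user_id', $this->user->id)->where('status', 0)->orderBy('id', 'desc')->first();
			if ($last_active){
				return ['status' => 'error'];
			}

			$order = Orders::create([
				'status' => 0,
				// some details
			]);

			return ['status' => 'success'];
		});
	}
	catch (\Illuminate\Contracts\Cache\LockTimeoutException $e){
		// Another request for the same user currently holds the lock.
		return ['status' => 'error'];
	}
}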
