How to Fix Race Conditions in Laravel
Hey there! So, race conditions—they can be a real pain, right? If you’ve dealt with them before, you know how unpredictable they can make your app, especially when things start getting busy. Don’t worry, though; I’m going to walk you through what a race condition is, why it happens, and how we solved it in one of our Laravel projects.
What’s a Race Condition, Anyway?
A race condition pops up when two or more processes are fighting to use the same resource at the same time, and the result depends on who gets there first. They’re tricky because the problem might not happen all the time—just when two requests hit at the same time and mess with the data flow.
In databases, this is even worse because it can lead to things like duplicate entries, data not saving correctly, or other unexpected bugs. It’s not easy to catch until something goes wrong in production.
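As a quick illustration (a made-up example, not from our app): imagine two requests both trying to add one credit to the same user. The `User` model and `credits` column here are purely hypothetical.

```php
// Request A and request B run this at almost the same time.
$user = User::find(1);             // hypothetical model, purely for illustration
$credits = $user->credits;         // A reads 10 ... and so does B
$user->credits = $credits + 1;     // both compute 11
$user->save();                     // both write 11, so one increment is lost

// Pushing the arithmetic into the database makes the update atomic:
$user->increment('credits');       // UPDATE users SET credits = credits + 1 WHERE id = 1
```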
Our Problem
We ran into a race condition recently in one of our Laravel apps, which generates short URLs from long ones. These short URLs are used for sending SMS notifications (because no one likes receiving an essay-length link in a text, right?). We used the ashallendesign/short-url package, and while it worked great for a while, things started failing once traffic spiked. Suddenly, we’d get an error saying:
```
Integrity constraint violation: 1062 Duplicate entry '5Tp2Dr' for key 'short_urls.short_urls_url_key_unique'
```
Okay, so what was going on here? Here’s a simplified version of what was happening:
```php
class Builder
{
    protected function getLastInsertedID(): int
    {
        if ($lastInserted = ShortURL::latest()->select('id')->first()) {
            return $lastInserted->id;
        }

        return 0;
    }

    protected function generateRandom(): string
    {
        $ID = $this->getLastInsertedID();

        do {
            $ID++;
            $key = $this->hashids->encode($ID);
        } while (ShortURL::where('url_key', $key)->exists());

        return $key;
    }

    public function make(): ShortURL
    {
        $data = [
            'destination_url' => $this->destinationUrl,
            'url_key'         => $this->generateRandom(),
            // other attributes
        ];

        return ShortURL::create($data);
    }
}

$builder = new Builder();
$shortURLObject = $builder->destinationUrl('https://laravel.com/docs')->make();
$shortURL = $shortURLObject->default_short_url; // Short URL: https://webapp.com/short-key
```
At first glance, this looks fine. The code fetches the last inserted ID from the database, increments it, and uses that to generate a short URL key with Hashids. But when multiple requests hit at the same time, they both grab the same last inserted ID, generate the same key, and bam: a duplicate entry error. Not great.
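To picture the failure, here’s a hypothetical timeline (the IDs are made up; '5Tp2Dr' is the colliding key from the error above):

```php
// Suppose the last inserted ID is 41 when both requests arrive:
//
// Request A: getLastInsertedID()        -> 41
// Request B: getLastInsertedID()        -> 41  (A hasn't inserted its row yet)
// Request A: $this->hashids->encode(42) -> '5Tp2Dr'
// Request B: $this->hashids->encode(42) -> '5Tp2Dr'
// Request A: ShortURL::create([...])    -> row saved with url_key '5Tp2Dr'
// Request B: ShortURL::create([...])    -> QueryException: 1062 Duplicate entry '5Tp2Dr'
```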
Retrying Until It Works
One simple way to fix this is to add a retry mechanism. Essentially, we’ll try to generate the URL again if something goes wrong. Think of it as your backup plan. If the first attempt fails, we’ll try again, up to three times. Here’s how we did it:
```php
public function make(): ShortURL
{
    $maxRetries = 3;

    for ($retryCount = 1; $retryCount <= $maxRetries; $retryCount++) {
        try {
            $data = [
                'destination_url' => $this->destinationUrl,
                'url_key'         => $this->generateRandom(),
            ];

            return ShortURL::create($data);
        } catch (\Illuminate\Database\QueryException $e) {
            if ($retryCount >= $maxRetries) {
                throw new ShortURLException("Failed to create a new short URL.", 0, $e);
            }
        }
    }
}
```
This works, but it still doesn’t address the underlying issue: we check whether a key already exists in one query and create the record in a later one, so the check and the insert aren’t atomic.
Solving It Right: Using Database Locks
To fix the problem, we needed to stop multiple requests from grabbing the same ID simultaneously. That’s where database-level locking comes in. It locks the record so only one process can update it at a time.
In Laravel, you’ve got two options for database locking: `sharedLock()` and `lockForUpdate()`.

- `sharedLock()` lets multiple processes read a record at once, but blocks any changes to it until the locking transaction commits.
- `lockForUpdate()` is stricter: it prevents the selected record from being modified, or from being selected with another shared lock, until the transaction completes (plain, non-locking reads still go through).

In our case, we used `lockForUpdate()`, which holds the lock on the record until the process is done updating it. The lock is automatically released when the database transaction completes, so you don’t need to worry about manually unlocking it.
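Just to show the locking calls in isolation, here’s a minimal sketch (not the final builder) that assumes the same `ShortURL` model and runs inside `DB::transaction()`:

```php
use Illuminate\Support\Facades\DB;

DB::transaction(function () {
    // SELECT ... FOR UPDATE: any other transaction that tries to lock or
    // modify this row waits here until our transaction commits.
    $last = ShortURL::query()
        ->select('id')
        ->latest('id')
        ->lockForUpdate()
        ->first();

    // ... generate the key from $last?->id and insert the new row ...
});

// sharedLock() is the gentler variant: other transactions can still read the
// row (and take their own shared locks), but none can modify it until commit.
// e.g. ShortURL::where('url_key', $key)->sharedLock()->first();
```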
Here’s what the updated code looks like:
```php
class Builder
{
    protected function getLastInsertedID(): int
    {
        // SELECT ... FOR UPDATE: concurrent transactions queue up here until
        // the one holding the lock commits. Must run inside a transaction.
        $lastInserted = ShortURL::select('id')->latest('id')->lockForUpdate()->first();

        return $lastInserted ? $lastInserted->id : 0;
    }

    protected function generateRandom(): string
    {
        $ID = $this->getLastInsertedID();

        do {
            $ID++;
            $key = $this->hashids->encode($ID);
        } while (ShortURL::where('url_key', $key)->exists());

        return $key;
    }

    protected function retryTransaction(callable $callback): mixed
    {
        $startTime = microtime(true);
        $maxTime = 1_000_000; // give up after 1 second (in microseconds)
        $maxDelay = 50_000;   // wait up to 50ms between attempts

        // microtime(true) returns seconds, so convert the elapsed time to microseconds.
        while ((microtime(true) - $startTime) * 1_000_000 <= $maxTime) {
            try {
                return DB::transaction($callback);
            } catch (\Exception $e) {
                // Another request won the race; back off for a random delay and retry.
                usleep(random_int(0, $maxDelay));
            }
        }

        throw new ShortURLException("Transaction could not be completed in time.");
    }

    public function make(): ShortURL
    {
        return $this->retryTransaction(function () {
            $data = [
                'destination_url' => $this->destinationUrl,
                'url_key'         => $this->generateRandom(),
            ];

            return ShortURL::create($data);
        });
    }
}
```
Wrapping It Up
In a nutshell, by using `lockForUpdate()`, we’re ensuring that only one request can get the last inserted ID at a time. Once the transaction completes, the lock is automatically released, allowing other requests to proceed smoothly. Combined with a retry mechanism, this means we can safely generate unique short URLs even when traffic spikes.
The random `usleep()` delay helps space out retries, making sure they don’t all hit the database at once. This reduces the chances of requests getting in each other’s way again, which makes the whole process smoother.
And there you have it! A simple, reliable way to handle race conditions in your Laravel app. With this setup, your app will be more robust, and your users won’t run into those nasty duplicate errors. Until next time, happy coding! 👋