Processing Transactions Without Duplicates in Laravel

Learn how to avoid lost or duplicated Laravel jobs and keep your application reliable and easy to maintain

As a software developer, working in a startup comes with a lot of stress and excitement: you get to experience new things and deal with problems you never thought existed. It becomes even more interesting and fun when you work in a fintech startup that processes both fiat and crypto payments.

The crypto space, or Web3 as software developers like to call it, is still a new path, and there are a lot of uncharted territories and surprises yet to be discovered.

The developers at my company had a rather shocking experience during the week when a customer lodged a complaint that bitcoin transferred into his wallet was not showing. We had never had this sort of complaint before; our system is well designed and processes thousands of transactions per week, both fiat and crypto, without downtime or loss of customers' funds.

Let me give you a glimpse into how the system is designed. Everything that has to do with payments is triggered by an event and kept private, and these events are in turn processed by jobs that are prioritized onto either sync queues or delayed queues. This lets customers know when to expect their funds, either instantly or with a delay, depending on the type of transaction and the traffic on the system. With all of this in place, everyone, customers and developers alike, is assured of the delivery of funds.
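
To make that concrete, here is a minimal sketch of the pattern, using a hypothetical ProcessCryptoDeposit job and queue name (the actual classes in our system are different):

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

// Hypothetical job that credits a customer's wallet after a deposit event.
class ProcessCryptoDeposit implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public $deposit;

    public function __construct($deposit)
    {
        $this->deposit = $deposit;
    }

    public function handle()
    {
        // Credit the wallet, record the transaction, notify the customer, etc.
    }
}

// In an event listener, the job is pushed onto a prioritized or delayed queue:
// ProcessCryptoDeposit::dispatch($deposit)->onQueue('payments');
// ProcessCryptoDeposit::dispatch($deposit)->delay(now()->addMinutes(5));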

It was therefore no surprise that when the complaint was lodged, the support team asked the customer to wait a little while for the funds to appear in his wallet. After waiting for some time, the customer reached out again and was asked to provide evidence of the transaction. The evidence was provided and the complaint was passed on to the tech team.

True to his claim, bitcoin had been transferred to his wallet, but it was not reflected in the system. We ran tests on the system and everything passed, and no other customer had the same complaint. The missing funds were deposited into the customer's wallet to close the ticket, but we were still faced with understanding the bug and preventing future occurrences.

After an hour or two, we discovered that the crypto had indeed entered the system, but two payments had been made to the customer's address at the exact same time (down to the second) by different external addresses. What is the chance of that happening? The second transaction overrode the first, leaving no record of the first transaction.

After writing different algorithms and going through several pages of Stack Overflow and Google without finding the answer, we decided to look at the Laravel documentation, and to our surprise, we found the answer to our question. P.S. Always look at your framework's documentation when trying to solve a problem; 90% of the time the answer is already there.

Now, to the main part of this post. I am sorry it took this long to get here, but I couldn't help explaining what brought me here.

According to the Laravel documentation, and depending on your system architecture, there are two ways to avoid or fix the bug we encountered at my company. It is very important to follow good design patterns; this solution would not have been easy if they had not been followed.

  1. Using Unique Jobs

A Laravel job class can implement the ShouldBeUnique interface. This interface does not require you to define any additional methods on your class.

<?php

use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Contracts\Queue\ShouldBeUnique;

class UpdateSearchIndex implements ShouldQueue, ShouldBeUnique
{
    ...
}

Adding this interface to the job class ensures that another instance of the job will not be dispatched while one is already on the queue. You can also provide a uniqueId that determines what makes the job unique, as well as a uniqueFor property that sets how long the unique lock is held before it expires.

<?php

use App\Product;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Contracts\Queue\ShouldBeUnique;

class UpdateSearchIndex implements ShouldQueue, ShouldBeUnique
{
    /**
     * The product instance.
     *
     * @var \App\Product
     */
    public $product;

    /**
     * The number of seconds after which the job's unique lock will be released.
     *
     * @var int
     */
    public $uniqueFor = 3600;

    /**
     * The unique ID of the job.
     *
     * @return string
     */
    public function uniqueId()
    {
        return $this->product->id;
    }
}

The uniqueId ensures that only one instance of the job with a given ID can be on the queue at a time; further dispatches with the same ID are ignored until the lock is released.
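
For example, with the UpdateSearchIndex job above, a second dispatch for the same product is simply dropped while the first is still pending (a small sketch, assuming the job lives in App\Jobs):

use App\Jobs\UpdateSearchIndex;

// The first dispatch acquires a unique lock keyed by the product's ID.
UpdateSearchIndex::dispatch($product);

// This dispatch is ignored because a job with the same uniqueId is already
// queued; the lock is released when the job completes or when the
// $uniqueFor window (3600 seconds) expires.
UpdateSearchIndex::dispatch($product);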

  2. Using the Laravel Job Middleware

Laravel includes an Illuminate\Queue\Middleware\WithoutOverlapping middleware that prevents jobs sharing an arbitrary key, in our case the same user, from overlapping, so that two jobs cannot update the same resource at the same time. This approach was the most effective in our case.

use Illuminate\Queue\Middleware\WithoutOverlapping;

/**
 * Get the middleware the job should pass through.
 *
 * @return array
 */
public function middleware()
{
    return [new WithoutOverlapping($this->user->id)];
}
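
In our deposit scenario, the overlap key can be anything that identifies the resource being updated. Below is a sketch, assuming a hypothetical job with a $deposit property that exposes the receiving wallet address; releaseAfter tells Laravel to put an overlapping job back onto the queue and retry it after 60 seconds rather than running it concurrently:

use Illuminate\Queue\Middleware\WithoutOverlapping;

/**
 * Get the middleware the job should pass through.
 *
 * @return array
 */
public function middleware()
{
    // Only one deposit job per wallet address runs at a time; an overlapping
    // job is released back onto the queue and retried after 60 seconds.
    return [
        (new WithoutOverlapping($this->deposit->wallet_address))->releaseAfter(60),
    ];
}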

You can read more about this and its other uses in the Laravel documentation.