LARAVEL

Laravel Queues & Jobs Processing: Complete Guide

March 1, 2024

Introduction

Laravel queues provide a unified API for processing time-consuming tasks in the background. This improves response times for users by moving heavy processing out of the request lifecycle.

Creating Jobs

Generate Job Class

php artisan make:job ProcessOrder

// app/Jobs/ProcessOrder.php
class ProcessOrder implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public $order;
    public $tries = 3;
    public $timeout = 120;

    public function __construct(Order $order)
    {
        $this->order = $order;
    }

    public function handle(): void
    {
        // Process the order
        $this->order->process();
    }
}

Job Properties

class ProcessOrder implements ShouldQueue
{
    // Number of retry attempts
    public $tries = 3;
    
    // Seconds to wait before retrying
    public $backoff = [10, 60, 300];
    
    // Maximum seconds job should run
    public $timeout = 120;
    
    // Silently discard the job if its serialized models no longer exist
    public $deleteWhenMissingModels = true;
    
    // Queue name
    public $queue = 'orders';
}
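As an alternative to a fixed `$tries` count, a job can put a time limit on retries via the `retryUntil()` hook; the job is retried on failure until the returned time passes. A minimal sketch on the same hypothetical ProcessOrder job:

```php
use DateTime;

class ProcessOrder implements ShouldQueue
{
    // Keep retrying until 10 minutes after the job was first dispatched,
    // instead of counting attempts with $tries
    public function retryUntil(): DateTime
    {
        return now()->addMinutes(10);
    }
}
```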

Dispatching Jobs

Basic Dispatch

// Dispatch immediately
ProcessOrder::dispatch($order);

// Dispatch after the HTTP response has been sent to the browser
ProcessOrder::dispatch($order)->afterResponse();

// Delayed dispatch
ProcessOrder::dispatch($order)->delay(now()->addMinutes(10));

// On specific queue
ProcessOrder::dispatch($order)->onQueue('high-priority');

// On specific connection
ProcessOrder::dispatch($order)->onConnection('redis');
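A job can also be run immediately in the current process, bypassing the queue entirely, which is handy in tests and console commands:

```php
// Execute the job synchronously in the current process (Laravel 8+);
// on older versions, dispatchNow() serves the same purpose
ProcessOrder::dispatchSync($order);
```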

Conditional Dispatch

// Dispatch conditionally
ProcessOrder::dispatchIf($condition, $order);
ProcessOrder::dispatchUnless($condition, $order);

// Unique jobs are configured on the job class itself (not at the
// dispatch call site) by implementing the ShouldBeUnique interface
class ProcessOrder implements ShouldQueue, ShouldBeUnique
{
    // Seconds to hold the unique lock before a duplicate may dispatch
    public $uniqueFor = 3600;

    // Lock key; pending duplicates with the same ID are not dispatched
    public function uniqueId(): string
    {
        return (string) $this->order->id;
    }
}

Queue Connections

Configuration

// config/queue.php
'connections' => [
    'sync' => [
        'driver' => 'sync',
    ],
    
    'database' => [
        'driver' => 'database',
        'table' => 'jobs',
        'queue' => 'default',
        'retry_after' => 90,
    ],
    
    'redis' => [
        'driver' => 'redis',
        'connection' => 'default',
        'queue' => 'default',
        'retry_after' => 90,
        'block_for' => null,
    ],
    
    'sqs' => [
        'driver' => 'sqs',
        'key' => env('AWS_ACCESS_KEY_ID'),
        'secret' => env('AWS_SECRET_ACCESS_KEY'),
        'prefix' => env('SQS_PREFIX', 'https://sqs.us-east-1.amazonaws.com/your-account-id'),
        'queue' => env('SQS_QUEUE', 'default'),
        'region' => env('AWS_DEFAULT_REGION', 'us-east-1'),
    ],
],
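The connection used by a plain `dispatch()` call is the `default` key in `config/queue.php`, which reads from the environment. The database driver additionally needs its `jobs` table migrated before workers can run:

```shell
# .env — selects the connection dispatch() uses by default
# QUEUE_CONNECTION=redis

# For the database driver, create and run the jobs table migration
php artisan queue:table
php artisan migrate
```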

Starting Workers

// Process default queue
php artisan queue:work

// Process specific queue
php artisan queue:work --queue=orders

// queue:work already runs as a long-lived daemon (the legacy --daemon
// flag has been removed); restart workers after each deploy
php artisan queue:work

// Reload the code on every job without restarting (slower; local dev only)
php artisan queue:listen

// Process with supervisor
// /etc/supervisor/conf.d/laravel-worker.conf
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/html/artisan queue:work --sleep=3 --tries=3
autostart=true
autorestart=true
stopasgroup=true
killasgroup=true
user=www-data
numprocs=8
redirect_stderr=true
stdout_logfile=/var/log/laravel-worker.log
stopwaitsecs=3600
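Because `queue:work` keeps the booted application in memory, running workers never see newly deployed code. A typical deploy step signals them to exit gracefully so Supervisor's `autorestart` brings them back up on the new release:

```shell
# Signal all workers to exit after finishing their current job
php artisan queue:restart

# Start and inspect the Supervisor-managed worker pool
sudo supervisorctl start "laravel-worker:*"
sudo supervisorctl status
```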

Job Chaining

Chain Jobs

// Chain multiple jobs
ProcessOrder::withChain([
    new NotifyCustomer($order),
    new UpdateInventory($order),
    new GenerateInvoice($order),
])->dispatch($order);

// With options
ProcessOrder::withChain([
    new NotifyCustomer($order),
    new UpdateInventory($order),
])->onConnection('redis')
  ->onQueue('orders')
  ->dispatch($order);
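In newer Laravel versions the documented way to chain jobs is the `Bus` facade, which reads a little more naturally (the `withChain` form above still works). A sketch using the same hypothetical jobs; if any job in the chain fails, the remaining jobs are not run:

```php
use Illuminate\Support\Facades\Bus;

Bus::chain([
    new ProcessOrder($order),
    new NotifyCustomer($order),
    new UpdateInventory($order),
])->onQueue('orders')->dispatch();
```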

Batch Processing

use Illuminate\Bus\Batch;
use Illuminate\Support\Facades\Bus;
use Throwable;

$batch = Bus::batch([
    new ImportUsers($file1),
    new ImportUsers($file2),
    new ImportUsers($file3),
])->then(function (Batch $batch) {
    // All jobs completed
})->catch(function (Batch $batch, Throwable $e) {
    // First observed job failure in the batch
})->finally(function (Batch $batch) {
    // Batch complete
})->dispatch();

return $batch->id;
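Batching requires the `job_batches` table (`php artisan queue:batches-table` followed by `php artisan migrate`). The returned ID can later be used to poll progress, for example from a status endpoint:

```php
use Illuminate\Support\Facades\Bus;

$batch = Bus::findBatch($batchId);

$batch->totalJobs;       // jobs in the batch
$batch->processedJobs(); // jobs completed so far
$batch->progress();      // completion percentage (0-100)
$batch->finished();      // true once all jobs have run
```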

Failed Jobs

Handling Failures

// In job class
public function failed(\Throwable $exception): void
{
    // Send notification
    Notification::send($this->order->user, new JobFailed($this->order, $exception));
    
    // Log error
    Log::error('Job failed', [
        'order_id' => $this->order->id,
        'error' => $exception->getMessage(),
    ]);
}

Managing Failed Jobs

// List failed jobs
php artisan queue:failed

// Retry failed job
php artisan queue:retry 1

// Retry all failed jobs
php artisan queue:retry all

// Delete failed job
php artisan queue:forget 1

// Clear all failed jobs
php artisan queue:flush

Database Failed Jobs Table

php artisan queue:failed-table
php artisan migrate

Best Practices

  • Keep jobs small - Process only essential work in each job
  • Use unique jobs - Prevent duplicate processing
  • Implement proper timeouts - Set appropriate timeout values
  • Monitor job execution - Track job performance and failures
  • Use circuit breaker - Handle cascading failures
  • Implement retry logic - Use exponential backoff

Summary

Laravel queues provide powerful background processing capabilities. By properly implementing jobs, you can significantly improve your application's performance and user experience.

For more Laravel tutorials, check out Laravel Performance Optimization and Laravel Cache Optimization.