When dealing with time-consuming, resource-intensive tasks, most PHP developers are tempted to take the “quick hack route.” Don’t deny it! We’ve all used ini_set('max_execution_time', HUGE_INT); before, but it doesn’t have to be this way.

In today’s tutorial, I demonstrate how an application’s user experience can be improved (with minimal developer effort) by separating long-running tasks from the main request flow in Laravel. By making use of PHP’s ability to spawn separate processes that run in the background, the main script responds to user action faster, which manages user expectations far better than making users wait for ages (without feedback) for a request to finish.

Defer long running PHP tasks, don’t wait.

The base concept of this tutorial is deferment: taking tasks that run for too long (by Internet standards) and deferring their execution to a separate process that runs independently of the request. This deferment allows us to implement a notification system that shows the user the status of the task (X out of Y rows have been imported, for example) and alerts the user when the job is done.

Our tutorial is based on a real-life scenario that I’m sure you’ve encountered before: taking data from huge Excel spreadsheets and pushing it into a web application database. The full project is available on my GitHub.

Don’t make your users sit and wait on a long-running task. Defer.

Bootstrapping with Laravel

We will be using "laravel/framework": "5.2.*" and "maatwebsite/excel": "~2.1.0", a nice wrapper for the phpoffice/phpexcel package.

I chose to use Laravel for this particular task for the following reasons:

  1. Laravel comes with Artisan, which makes creating command line tasks a breeze. For those who don’t know Artisan, it’s the command line interface included in Laravel, driven by the powerful Symfony Console component
  2. Laravel has the Eloquent ORM for mapping our Excel data to table columns
  3. It is well maintained and it has a very thorough documentation
  4. Laravel is 100 percent ready for PHP 7; in fact, the Homestead box already runs PHP 7

While I chose to go with Laravel, the concept and code of this tutorial can be incorporated into any framework by using the symfony/process component (which you can install via Composer using composer require symfony/process).
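To illustrate that the component stands on its own outside any framework, here is a minimal sketch (long_running_task.php is a hypothetical script standing in for your own task):

```php
<?php
require __DIR__ . '/vendor/autoload.php';

use Symfony\Component\Process\Process;

// Spawn the task as a separate background process...
$process = new Process('php long_running_task.php');
$process->start(); // returns immediately, unlike run()

// ...and respond to the user without waiting for it to finish.
echo "Import started; we'll notify you when it's done.\n";
```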

To begin, fire up your Vagrant box based on Homestead (the standard when developing Laravel-based applications these days). If you don’t have Homestead set up, the official documentation provides a thorough step-by-step guide.

With Homestead installed, you will need to modify Homestead.yaml before starting your vagrant box in order to do two things:

  • Map your local development folder to a folder inside the virtual machine
  • Automatically provision NGINX so that accessing a URL, such as http://heavyimporter.app, will load your new project

Here’s what my configuration file looks like:

	folders:
	    - map: ~/public_html/toptal
	      to: /home/vagrant/toptal

	sites:
	    - map: heavyimporter.app
	      to: /home/vagrant/toptal/heavyimporter/public

	databases:
	    - heavyimporter

Now, save the file and run vagrant up && vagrant provision, which starts the VM and configures it accordingly. If all went well, you can now log into your virtual machine with vagrant ssh, and start a new Laravel project. (If all didn’t go well, refer to Hashicorp’s Vagrant documentation for help.)

cd /home/vagrant/toptal && composer create-project --prefer-dist laravel/laravel heavyimporter

After creating the project, you will need to set up some configuration variables by editing the .env file in the project root. You should also secure your installation by running php artisan key:generate.

Here’s what the relevant parts of the .env file look like on my end (these are Homestead’s default database credentials; adjust them to your own setup):

	APP_ENV=local
	APP_DEBUG=true
	DB_HOST=localhost
	DB_DATABASE=heavyimporter
	DB_USERNAME=homestead
	DB_PASSWORD=secret
Now add the maatwebsite/excel package by executing composer require maatwebsite/excel:~2.1.0.

You also need to add the service provider and the facade/alias in your config/app.php file.

Service providers are the core of a Laravel application; everything in Laravel is bootstrapped through a service provider, while facades are simple static interfaces that allow easier access to those service providers. In other words, instead of accessing the database (a service provider) with Illuminate\Database\DatabaseManager … you can just use DB::staticmethod().
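In code, the two approaches are interchangeable; the facade is just shorter. A quick sketch (using 'db', the container binding that the DB facade resolves to):

```php
// Resolving the service from the container explicitly...
$users = app('db')->table('users')->get();

// ...is equivalent to going through the facade:
$users = DB::table('users')->get();
```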

For us, our service provider is Maatwebsite\Excel\ExcelServiceProvider and our facade is 'Excel'=>'Maatwebsite\Excel\Facades\Excel'.

app.php should now look like this:

	'providers' => [
	    // ... framework providers ...
	    Maatwebsite\Excel\ExcelServiceProvider::class,
	],

	'aliases' => [
	    // ... framework aliases ...
	    'Excel' => Maatwebsite\Excel\Facades\Excel::class,
	],

Setting up the Database with PHP Artisan

Let’s set up our database migrations for two tables: one that holds a flag with the status of the import, which we’ll call flag_table, and one that holds the actual Excel data, which we’ll call data.

If you intend to include a progress indicator to track the status of the import task, you need to add two more columns to the flag_table: rows_imported and total_rows. These two variables will allow us to calculate and deliver the percentage completed in the event that we want to show progress to the user.
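With those two columns in place, computing the completion percentage is a one-liner (a sketch; the zero check guards freshly created flags whose rows haven’t been counted yet):

```php
$percent = $flag->total_rows > 0
    ? round($flag->rows_imported / $flag->total_rows * 100)
    : 0;
```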

First run php artisan make:migration CreateFlagTable and php artisan make:migration CreateDataTable to actually create these tables. Then, open the newly created files from database/migrations and fill the up and down methods with the table structure.

class CreateFlagTable extends Migration
{
    public function up()
    {
        Schema::create('flag_table', function (Blueprint $table) {
            $table->increments('id');
            $table->string('file_name');
            $table->tinyInteger('imported')->default(0);
            $table->integer('rows_imported')->default(0);
            $table->integer('total_rows')->default(0);
            $table->timestamps();
        });
    }

    public function down()
    {
        Schema::drop('flag_table');
    }
}

class CreateDataTable extends Migration
{
    public function up()
    {
        Schema::create('data', function (Blueprint $table) {
            $table->increments('id');
            $table->string('A', 20);
            $table->string('B', 20);
        });
    }

    public function down()
    {
        Schema::drop('data');
    }
}
Before we actually write the import code, let’s create empty models for our database tables. This is achieved through Artisan by running two simple commands: php artisan make:model Flag and php artisan make:model Data, then going into each newly created file and adding the table name as a protected property of that class, like this:

	//file: app/Flag.php
	namespace App;

	use Illuminate\Database\Eloquent\Model;

	class Flag extends Model
	{
	    protected $table = 'flag_table';
	    protected $guarded = []; //this will give us the ability to mass assign properties to the model
	}

	//file: app/Data.php
	namespace App;

	use Illuminate\Database\Eloquent\Model;

	class Data extends Model
	{
	    protected $table = 'data';
	    protected $guarded = [];
	    public $timestamps = false; //disable timestamps for this model (note: $timestamps must be public in Eloquent)
	}


Routes are the eyes of a Laravel application: they observe the HTTP request and point it to the proper controller. That being said, we first need a POST route that assigns the task of uploading our Excel file to the import method in the controller. The file will be uploaded somewhere on the server so that we can grab it later when we execute the command line task. Be sure to place all your routes (even the default one) into the web middleware route group so that you benefit from session state and CSRF protection. The routes file will look like this:

	Route::group(['middleware' => ['web']], function () {
	    Route::get('/', ['as'=>'home', function () {
	        return view('welcome');
	    }]);
	    //upload route
	    Route::post('/import', ['as'=>'import', 'uses'=>'MainController@import']);
	});

The Task Logic

Now let’s turn our attention to the main controller, which will hold the core of our logic in a method that is responsible for the following:

  • Making the necessary validations related to the file type that’s being uploaded
  • Uploading the file to the server and adding an entry into the flag_table (which will be updated by the command line process once the task executes with the total number of rows and the current status of the upload)
  • Starting the import process (which will call the Artisan task) then return to let the user know that the process has been initiated

This is the code for the main controller:

	//file: app/Http/Controllers/MainController.php
	namespace App\Http\Controllers;

	use Maatwebsite\Excel\Facades\Excel;
	use Symfony\Component\Process\Process;
	use Symfony\Component\Process\Exception\ProcessFailedException;
	use Illuminate\Http\Request;
	use Validator;
	use Redirect;
	use Config;
	use Session;
	use DB;
	use App\Flag;

	class MainController extends Controller
	{
	    public function import(Request $request)
	    {
	        $excel_file = $request->file('excel_file');

	        $validator = Validator::make($request->all(), [
	            'excel_file' => 'required'
	        ]);

	        $validator->after(function($validator) use ($excel_file) {
	            if ($excel_file->guessClientExtension() !== 'xlsx') {
	                $validator->errors()->add('field', 'File type is invalid - only xlsx is allowed');
	            }
	        });

	        if ($validator->fails()) {
	            return Redirect::to(route('home'))->withErrors($validator);
	        }

	        try {
	            $fname = md5(rand()) . '.xlsx';
	            $full_path = Config::get('filesystems.disks.local.root');
	            $excel_file->move($full_path, $fname);
	            $flag_table = Flag::firstOrNew(['file_name' => $fname]);
	            $flag_table->imported = 0; //file was not imported yet
	            $flag_table->save();
	        } catch (\Exception $e) {
	            return Redirect::to(route('home'))
	                        ->withErrors($e->getMessage()); //don't use this in production ok ?
	        }

	        //and now the interesting part
	        $process = new Process('php ../artisan import:excelfile');
	        $process->start();

	        Session::flash('message', 'Hold on tight. Your file is being processed');
	        return Redirect::to(route('home'));
	    }
	}

The process-related lines above do something really cool. They use the symfony/process package to spawn a child process that runs independently of the request. This means the running script will not wait for the import to finish; instead, it redirects right away with a message asking the user to wait until the import has completed. This way you can display an “import pending” status message to the user. Alternatively, you can send Ajax requests every X seconds to update the status.
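The key detail is start() versus run(), both part of symfony/process. A sketch of the distinction:

```php
$process = new Process('php ../artisan import:excelfile');

// run() would block until artisan exits -- exactly what we're avoiding.
// $process->run();

// start() kicks the command off and returns control immediately.
$process->start();
```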

Using only vanilla PHP, the same effect can be achieved with the following code, but of course this relies on exec, which is often disabled (on shared hosting, for instance).

	function somefunction() {
		exec("php dosomething.php > /dev/null &");
		//do something else without waiting for the above to finish
	}
The functionality that symfony/process provides is more extensive than a simple exec, so if you are not using the Symfony package, you can tweak the PHP script further after having a look at the symfony/process source code.

Using the Symfony package, you can spawn a separate PHP process that runs independently of the request.

The Import Code

Now let’s write a php artisan command file that handles the import. Start by creating the command class file with php artisan make:console ImportManager, then reference it in the $commands property in app/Console/Kernel.php, like this:

   protected $commands = [
       Commands\ImportManager::class,
   ];
Running the artisan command will create a file named ImportManager.php in the /app/Console/Commands folder. We will write our code as part of the handle() method.

Our import code will first update the flag_table with the total number of rows to be imported, then it will iterate through each Excel row, insert it in the database, and update the status.

To avoid out-of-memory issues with exceptionally large Excel files, it’s a good idea to process bite-sized chunks of the data set rather than thousands of rows at once, which would cause lots of problems beyond memory exhaustion.

For this Excel-based example, we will adapt the ImportManager::handle() method to fetch only a small set of rows at a time until the entire sheet has been imported. This also helps with keeping track of the task’s progress: after each chunk is processed, we update the flag_table by incrementing the rows_imported column by the size of the chunk.

Note: There is no need to paginate manually because Maatwebsite\Excel handles chunking for you, as described in its documentation.

Here’s what the final ImportManager class looks like:

namespace App\Console\Commands;

use Illuminate\Console\Command;

use DB;
use Validator;
use Config;
use Maatwebsite\Excel\Facades\Excel;

use App\Flag;

class ImportManager extends Command
{
    protected $signature = 'import:excelfile';
    protected $description = 'This imports an excel file';
    protected $chunkSize = 100;

    public function handle()
    {
        $file = Flag::where('imported', '=', '0')
                    ->orderBy('created_at', 'DESC')
                    ->first();

        if (empty($file)) {
            return; //nothing to import
        }

        $file_path = Config::get('filesystems.disks.local.root') . '/' . $file->file_name;

        //let's first count the total number of rows
        Excel::load($file_path, function($reader) use ($file) {
            $objWorksheet = $reader->getActiveSheet();
            $file->total_rows = $objWorksheet->getHighestRow() - 1; //exclude the heading
            $file->save();
        });

        //now let's import the rows, chunk by chunk, while keeping track of the progress
        Excel::filter('chunk')
            ->load($file_path)
            ->chunk($this->chunkSize, function($result) use ($file) {
                $rows = $result->toArray();
                //let's do more processing (change values in cells) here as needed
                $counter = 0;
                foreach ($rows as $k => $row) {
                    foreach ($row as $c => $cell) {
                        $rows[$k][$c] = $cell . ':)'; //altered value :)
                    }
                    DB::table('data')->insert($rows[$k]);
                    $counter++;
                }
                $file = $file->fresh(); //reload from the database
                $file->rows_imported = $file->rows_imported + $counter;
                $file->save();
            });

        $file->imported = 1;
        $file->save();
    }
}

Recursive Progress Notification System

Let’s move on to the front-end part of our project, user notification. We can send Ajax requests to a status-reporting route in our application to notify the user of progress or alert them when the import is done.

Here’s a simple jQuery script that will send requests to the server until it receives a message stating that the job is done:

	'use strict';
	function statusUpdater() {
		$.ajax({
			url: '/status',
			dataType: 'json'
		}).done(function(r) {
			if (r.msg === 'done') {
				console.log( "The import is completed. Your data is now available for viewing ... " );
			} else {
				//get the total number of imported rows
				console.log("Status is: " + r.msg);
				console.log( "The job is not yet done... Hold your horses, it takes a while :)" );
				setTimeout(statusUpdater, 3000); //ask again in 3 seconds - hence "recursive"
			}
		}).fail(function() {
			console.log( "An error has occurred... We could ask Neo about what happened, but he's taken the red pill and he's at home sleeping" );
		});
	}
	statusUpdater();

Back on the server, add a GET route called status, which will call a method that reports the current status of the import task as either done, or the number of rows imported out of X.

	Route::get('/status', ['as'=>'status', 'uses'=>'MainController@status']);

    public function status(Request $request)
    {
        $flag_table = DB::table('flag_table')
                        ->orderBy('created_at', 'desc')
                        ->first();

        if (empty($flag_table)) {
            return response()->json(['msg' => 'done']); //nothing to do
        }

        if ($flag_table->imported == 1) {
            return response()->json(['msg' => 'done']);
        } else {
            $status = $flag_table->rows_imported . ' excel rows have been imported out of a total of ' . $flag_table->total_rows;
            return response()->json(['msg' => $status]);
        }
    }
Send Ajax requests to a status-reporting route to notify the user of progress.

Cron Job Deferment

Another approach, when data retrieval is not time sensitive, is to handle the import at a later time when the server is idle; say, at midnight. This can be done using cron jobs that execute the php artisan import:excelfile command at the desired time interval.

On Ubuntu servers, it is as simple as this:

crontab -e
#and add this line
@midnight cd path/to/project && /usr/bin/php artisan import:excelfile >> /my/log/folder/import.log
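If you would rather keep the schedule itself in version control, Laravel’s built-in scheduler can wrap the same command; a sketch for app/Console/Kernel.php:

```php
// app/Console/Kernel.php
protected function schedule(Schedule $schedule)
{
    $schedule->command('import:excelfile')
             ->dailyAt('00:00')
             ->withoutOverlapping(); //never run two imports at once
}
```

With this in place, the only crontab entry you need is `* * * * * php /path/to/project/artisan schedule:run`.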

What’s Your Experience?

Do you have other suggestions for further improving performance and user experience in similar cases? I would be eager to know how you’ve dealt with them.

About the author

Daniel Gheorghe, Romania
member since January 6, 2016
Daniel is a full-stack web developer and Zend Certified PHP Engineer. He started coding PHP in 2005 and, since then, he has worked on tens of successful web development projects. With a decade of experience as the lead software engineer for multiple US web development companies, Daniel can take on a wide range of projects while maintaining great communication and providing valuable expertise.


Daniel Gheorghe
Let me know if you guys have any questions / feedback.
Out of curiosity, what made you use Symfony Process vs. the Laravel Job Queue?
Radosław Górny
Have you heard of queues? I think they fit the need of asynchronous tasks better.
Piero Recchia
Great article, I used a similar approach but using RabbitMQ to queue the process and socket.io to notify the user. Your handling of the Excel file is better than mine though, thanks for sharing this.
Daniel Gheorghe
Thanks Piero! Do you have it published anywhere ? I would love to see it. Thanks,
Daniel Gheorghe
Hi Radoslaw, Thanks for reading this and for your comment. Why do you say they're better ? Just curious. As far as I can see queues make use of Symphony/Process - https://github.com/illuminate/queue/blob/0b20072bf8e27ab39e3a1ec4e8daa7430c0cbcbe/Listener.php For me just creating the queue job wouldn't have been enough as it wouldn't have illustrated how this can be achieved using another framework. Thanks,
Daniel Gheorghe
Hi Tully, Thanks for reading this. Just creating the queue job wouldn't have been enough as it wouldn't have illustrated how this can be achieved using another framework and/or no framework. Thanks,
Radosław Górny
Queues are great, because you have greater control over running those tasks. You can give them priorities, delay them etc. They also support several providers, including Redis.
... but Laravel is built on top of Symfony, and this is a specific example for Laravel (as in the title: Handling Intensive Tasks with Laravel). Nothing in this is that useful outside Laravel. Just wondering because the Laravel Job queue is the Laravel way to handle this and it isn't referenced anywhere in the article I could see. I'd have liked to see an example of a chunk import utilizing the Job queue rather than the Symfony method. Perhaps in an update to the article?
Daniel Gheorghe
Yep! Great idea for a follow up article. Thanks!
One more suggestion: the AJAX polling method of getting status updates, while quick and dirty as provided and will get the job done, are perhaps better handled using Laravel's built in event broadcasting and something like Pusher. Perhaps another update :D
Also i would not pull that often ;)
I would idd use queues. I'dd also would use Laravel event with Redis/SocketIO to handle the status event you now have in jQuery. https://laravel.com/docs/5.2/events#broadcasting-events
> cd path/to/project && /usr/bin/php artisan import:excelfile this is overly verbose. since php is in user's path and since directory script resides in is current for the script, this can be simplified to just: > php /path/to/project/artisan import:excelfile
as a followup, adding tasks directly to crontab is discouraged as they live outside of version control this way. add `* * * * * php artisan schedule:run` instead and use laravel's built-in task scheduler. it also provides `->withoutOverlapping()` method that prevents two copies of a command from running simultaneously.
Can you share or explain, how you have done it. -thanks
Did you read the link? It's quite straight forward I might say. Create an Event in Laravel. Make sure it uses the BroadCast interface. Fire this event when your task completes like this: Event::fire(new \App\Events\ViewEvent($id)); In this case, my event is called 'ViewEvent' (Like the ga page view)
Muhammad Said El Zahlan
How did you manage to prevent php from killing the process after redirect ?
Deric Lima
Great article Daniel!! Thanks for sharing this o//
Syed Ali Hamza Bukhari
My controller directory is "/Controllers/Orders/Import/ImportController" and I am using command in new process as $process = new Process('php ../../../artisan import:SyncOrders '.$channel_id.' '.$notification->id); but its not executing the code but if i manually run php artisan import:SyncOrders $channel_id $notification_id then it works. What I am doing wrong?
Wellington NaGaTo
Really cool, works perfecly! ;D