
Building a PHP Command Line App with Docker

Over the past few months, I’ve started using Docker for all my local development work. Previously I had experimented with VMs, but because we deal in microservices, running six or more VMs on my MacBook Pro was getting cumbersome.

Docker has clear advantages in system utilization, but I also like that my local configuration can easily be used on servers and in continuous integration environments as well. I’m not the only developer discovering the advantages of containerized workflows; Docker adoption has been growing rapidly over the past year and a half, and search volume has been increasing steadily.

While most of my work with Docker has been in local development, we recently had our first opportunity to ship a new production service with it. In Part 1 of this tutorial, I’ll walk you through the process of setting up a Laravel PHP command-line application to run in Docker. Look for Part 2 next week, when I’ll go over the process for continuous integration and deployment using Codeship Pro.

Project Overview

At The Graide Network, we needed to automate the reminder emails and text messages we send to our users. Every hour, we wanted to go into the database, find any reminders that needed to be sent, and then queue up jobs to send each of them.

Since a lot of the code in our project is specific to what we do, I am simplifying it for this tutorial. This guide will cover the key parts of our system and hopefully give you a starting point for developing and deploying a Dockerized PHP command-line app.

Here’s what we’re going to cover in this tutorial:

  • Setting up a Laravel PHP command-line application
  • Writing a cron job to run our command every hour within a Docker container
  • Adding an acceptance test to verify that our command-line job works

We will use a network of Docker containers to host a typical Laravel application stack: Linux, MySQL, and PHP. Because this app is triggered by the command line only and doesn’t require an HTTP interface, I won’t cover running Apache or NGINX, but that is certainly a good next step if you’re looking to use this in your own application.

If you’d like to see the finished product, I’ve made it available on GitHub. Okay, let’s get started!

Prerequisites

I’m going to assume you have Docker (now available for Mac, Windows, or Linux) installed. While many developers probably have PHP installed locally, you don’t actually need it to run this application. One of the best things about developing with Docker is that you don’t have to install or update all your languages and tools locally. The containers will take care of that for you.

If you choose to use the GitHub repo and have Node.js installed, you can opt to use the NPM commands. This is not required, but as you will see, it can make your life at the command line slightly easier.

Creating the Laravel Application

We’re going to create a simple Laravel command-line application with Docker to run on our local machine.

Laravel is a modern PHP framework with components for routing HTTP requests, CLI commands, queues and workers, an ORM, and much more. Because Laravel includes most of the features for our application out of the box, the code we have to write is pretty simple.

Installing Laravel using Docker

First, let’s create a new Laravel application. But instead of using a locally installed version of Composer, we’re going to use the Docker container available on Docker Hub:

$ docker run --rm -v $(pwd):/app composer/composer create-project --prefer-dist laravel/laravel docker-php-cli-example

This command will download the latest build of the Composer container, create a new Laravel project called docker-php-cli-example in your current directory, and install all the Composer dependencies.

Creating an Artisan command

Next, we need to create an Artisan command. Artisan is Laravel’s command-line helper, and any Artisan command you create in your app can be run from the application’s root directory with php artisan YOUR_COMMAND_NAME.

To create a new Artisan command using a Docker container (rather than your local version of PHP), navigate to the new folder we created and run:

$ docker run --rm -v $(pwd):/app -w /app php:cli php artisan make:command UpdateNextItem

This will download the latest PHP CLI image, put your Laravel app in a volume, and then create a new command in the folder app/Console/Commands.

Because Artisan commands and Composer updates are very common, I usually create some NPM scripts to wrap these long Docker commands. If you’d like, you can see some of them in the example project repository I set up on GitHub.
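As a sketch, the wrappers can live in your package.json (the script names here are my own convention; the example repo may organize them differently):

```json
{
  "scripts": {
    "composer": "docker run --rm -v $(pwd):/app composer/composer",
    "artisan": "docker run --rm -v $(pwd):/app -w /app php:cli php artisan"
  }
}
```

You could then run `npm run artisan -- make:command UpdateNextItem` instead of typing out the full docker run command each time.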

Next, add the newly created command to the app/Console/Kernel.php class:

<?php namespace App\Console;

use App\Console\Commands\UpdateNextItem;
use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    /**
     * The Artisan commands provided by your application.
     *
     * @var array
     */
    protected $commands = [
        UpdateNextItem::class,
    ];

    // ... the rest of the generated Kernel class stays the same
}

And finally, let’s put some code into the UpdateNextItem command we just created:

<?php namespace App\Console\Commands;

use App\Item;
use Carbon\Carbon;
use Illuminate\Console\Command;

class UpdateNextItem extends Command
{
    /**
     * The name and signature of the console command.
     *
     * @var string
     */
    protected $signature = 'item:update';
    
    /**
     * The console command description.
     *
     * @var string
     */
    protected $description = 'Gets the next item in the database and updates it.';
    
    /**
     * Execute the console command.
     *
     * @return mixed
     */
    public function handle()
    {
        // Get the least recently checked item
        $nextItem = Item::orderBy('checked_at', 'asc')->first();

        // Guard against an empty table
        if (!$nextItem) {
            $this->error('No items found to update.');
            return;
        }

        // Update the checked_at time and save the model
        $nextItem->checked_at = Carbon::now()->toDateTimeString();
        $nextItem->save();

        // Display a message to the command line
        $this->info("Item #{$nextItem->id} updated");
    }
}

At this point, our command won’t run because we haven’t actually created the data model or database migration, so let’s do that next.

Creating a database migration, seeder, and model

Laravel also makes it easy to create a database migration and model, so let’s run a command in Docker to generate that:

$ docker run --rm -v $(pwd):/app -w /app php:cli php artisan make:model Item -m

And now, update the new migration’s up() method to contain a checked_at column like so:

public function up()
{
    Schema::create('items', function (Blueprint $table) {
        $table->increments('id');
        $table->timestamp('checked_at');
        $table->timestamps();
    });
}

You should also remove the default Laravel migrations, as we won’t use users or authentication for this app.

Next, we want to allow the checked_at column to be updated by our command, so add this array to the Item.php model that we just generated:

/**
 * The attributes that are mass assignable.
 *
 * @var array
 */
protected $fillable = [
    'checked_at',
];

Finally, in order to test this new model, we will want to seed some data. Open up the database/seeds/DatabaseSeeder.php file, add `use Carbon\Carbon;` at the top, and update the run() method with some randomly generated data:

/**
 * Run the items database seeder.
 *
 * @return void
 */
public function run()
{
    DB::table('items')->truncate();
    $numberOfItems = rand(10, 100);
    
    for ($i = 0; $i < $numberOfItems; $i++) {
    
        // Create a datetime in the past 100 minutes
        $lastChecked = Carbon::now()
            ->subMinutes(rand(1, 100))
            ->toDateTimeString();
            
        // Insert a new record in the database
        DB::table('items')->insert([
            'checked_at' => $lastChecked,
        ]);
    }
}

Now our Laravel application is mostly complete, but we need a way to quickly spin up a database container and our PHP code at the same time and link them together. So far, we’ve just been using single docker run commands, which run a single process and then shut down. While you can run your database and application containers like this as well, I’ve found it much easier to use Docker Compose.

Running the Application with Docker Compose

Most real applications require several containers running at the same time to work properly, and Docker Compose helps you manage multiple linked Docker containers. This application only needs two containers to run: a PHP command line container and a database container.

In this tutorial, we’ll create one custom image to run our application code, and use MariaDB’s standard image for our database. Then we’ll use Docker Compose to run them as containers.

Adding a cron job

Before we build the Dockerfile for our application, we need to add a simple cron job to run our artisan command automatically. Create a new folder called docker in your project, and a file within it called crontab. The crontab file should contain this one command, followed by an empty line (cron requires the trailing newline):

0 * * * * root /usr/local/bin/php /app/artisan item:update

This runs the artisan command item:update at the top of every hour once the container is running.
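The five fields at the start of the crontab line control the schedule: minute, hour, day of month, month, and day of week. For example, to run the same command every fifteen minutes instead, the line would look like this:

```
*/15 * * * * root /usr/local/bin/php /app/artisan item:update
```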

Creating a Dockerfile for our application

Next, we’ll create a Dockerfile for our PHP application. The Dockerfile is a series of commands that will be used to build an image, and that image will be run as a container. This Dockerfile should go in the root directory of your project:

FROM php:cli

# Install mysql and cron
RUN apt-get update && apt-get install -y \
    libpq-dev \
    cron \
    mysql-client
RUN docker-php-ext-install pdo pdo_mysql

# Add crontab file in the cron directory
ADD docker/crontab /etc/cron.d/cron

# Give execution rights on the cron job
RUN chmod 0644 /etc/cron.d/cron

ADD ./ /app

WORKDIR /app

When built, this Dockerfile starts from the PHP CLI image as a base, installs the MySQL client and drivers along with cron, and adds the crontab file we created in the last step. It also copies the project directory into the container.

Updating the .env file

Both Laravel and Docker support .env files, so in order to keep things simple, we’re going to create a .env file for both:

APP_ENV=local
APP_KEY=YOUR_LARAVEL_APP_KEY
APP_DEBUG=true
APP_LOG_LEVEL=debug
APP_URL=http://localhost

DB_CONNECTION=mysql
DB_HOST=database
DB_PORT=3306
DB_DATABASE=laravel
DB_USERNAME=root
DB_PASSWORD=a_good_strong_password
MYSQL_ROOT_PASSWORD=a_good_strong_password

The most important part is the database connection details. We also need to set a MySQL root password. If you’re planning on using this project in production, it would be a good idea to modify it to create a new MySQL user instead of running as root, but we’re going to keep it simple for the time being.
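As a sketch of that hardening step, creating a dedicated user in MySQL/MariaDB would look something like this (the `laravel_user` name, password, and grants here are illustrative assumptions, not part of the tutorial repo):

```sql
CREATE USER 'laravel_user'@'%' IDENTIFIED BY 'another_strong_password';
GRANT SELECT, INSERT, UPDATE, DELETE ON laravel.* TO 'laravel_user'@'%';
FLUSH PRIVILEGES;
```

You would then update DB_USERNAME and DB_PASSWORD in your .env file to match.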

The docker-compose.yml file

Now that we have a Dockerfile and our configuration files set up, we can use a Docker Compose file to bring the whole thing up. At the root of your directory, create a file called docker-compose.yml:

version: "2.0"
services:
  # PHP Application
  app:
    build: .
    links:
      - database
    env_file: .env
    command: cron -f
  # Database
  database:
    image: mariadb
    env_file: .env

This Compose file basically does three things:

  1. Creates the database container from the MariaDB image on Docker Hub.
  2. Creates the application container from our local Dockerfile that we just created.
  3. Links them together. It also gives our application container a command to run at startup (cron -f), which will initialize the cron job that runs every hour.

I should also note that this Compose file is the bare minimum to get running. There are a ton of configuration options available including mounting volumes for your code or data, restarting your containers automatically, and connecting to existing Docker networks. In a real application, you may have multiple Compose files for different environments, but hopefully this one will get you started.
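For instance, a slightly expanded version of the app service might mount your code as a volume and restart automatically on failure (a sketch; adjust the paths to your own project):

```yaml
  app:
    build: .
    links:
      - database
    env_file: .env
    command: cron -f
    # Mount the project directory so code changes show up without rebuilding
    volumes:
      - .:/app
    # Restart the container automatically if it crashes
    restart: unless-stopped
```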

Manually testing it out

You are now ready to try your application out and see if everything works.

Since this is the first time running it, we want to see what’s going on. Open one terminal window and run $ docker-compose up --build. This will build the local application image, pull down the latest image for MariaDB, and run the containers in the terminal window. Usually you will want to run the “up” command with the -d flag to make it run in the background, but since it’s your first time, it might be helpful to see what’s going on.

Next, you can use another terminal window to see which containers are running: $ docker ps.

You should see something like this:

CONTAINER ID        IMAGE                     COMMAND                  CREATED              STATUS              PORTS               NAMES
597ab6be26d6        dockerphpcliexample_app   "docker-php-entryp..."   About a minute ago   Up About a minute                       dockerphpcliexample_app_1
d61f7f36d0ba        mariadb                   "docker-entrypoint..."   4 minutes ago        Up About a minute   3306/tcp            dockerphpcliexample_database_1

This means that both of your containers started properly and are running.

Every time you bring up a container, it’s like you installed everything fresh, so you’ll need to configure your database and run the migrations. To create your database:

$ docker exec dockerphpcliexample_app_1 mysql -h'database' -p'a_good_strong_password' -e 'CREATE DATABASE IF NOT EXISTS laravel'

It can take from a few seconds to a minute or two to bring up MariaDB. If you get an error like Can't connect to MySQL server on 'database' (111) the first time you try this command, wait a minute and try again.
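One way to avoid guessing at the wait is a small retry loop in your shell. This helper is my own sketch, not part of the tutorial repo:

```shell
#!/bin/sh
# retry ATTEMPTS DELAY COMMAND...: run COMMAND until it succeeds,
# giving up after ATTEMPTS tries with DELAY seconds between them.
retry() {
  attempts=$1; shift
  delay=$1; shift
  n=0
  until "$@"; do
    n=$((n + 1))
    if [ "$n" -ge "$attempts" ]; then
      return 1
    fi
    sleep "$delay"
  done
  return 0
}

# Against the database container, usage would look like (not run here):
# retry 30 2 docker exec dockerphpcliexample_app_1 \
#   mysql -h'database' -p'a_good_strong_password' -e 'CREATE DATABASE IF NOT EXISTS laravel'
```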

To run the migration we created:

$ docker exec dockerphpcliexample_app_1 php artisan migrate

To seed the database:

$ docker exec dockerphpcliexample_app_1 php artisan db:seed

And finally, if all those commands went through okay, you can try running the Artisan command we created way back at the beginning of this tutorial:

$ docker exec dockerphpcliexample_app_1 php artisan item:update

You should get a result like Item #12 updated each time you run the command. When you’re done, bring down your Docker containers by running docker-compose down.

Automated testing

Continuous integration isn’t much use without some tests to run, so we’re going to add a single acceptance test that verifies that our application’s artisan command runs when the database is set up and seeded. In real life, your test suite will likely be more complicated than this, but for the sake of simplicity, let’s just add one test in the tests/Feature/ExampleTest.php file:

<?php namespace Tests\Feature;

use Illuminate\Support\Facades\Artisan;
use Tests\TestCase;

class ExampleTest extends TestCase
{
    public function setUp()
    {
        // Seed the database before each run
        parent::setUp();
        Artisan::call('db:seed', array('--class'=>'DatabaseSeeder'));
    }
    /**
     * Run the artisan command
     *
     * @return void
     */
    public function testItCanRunItemUpdate()
    {
        Artisan::call('item:update');
        $this->assertRegExp("/Item #[0-9]* updated\\n/", Artisan::output());
    }
}

Now, to run the test, we need to bring our containers back up, create the database, run the migrations, and then finally run the test suite:

$ docker-compose up -d --build
## You may need to wait a few seconds for the database container to come up
$ docker exec dockerphpcliexample_app_1 mysql -h'database' -p'a_good_strong_password' -e 'CREATE DATABASE IF NOT EXISTS laravel'
$ docker exec dockerphpcliexample_app_1 php artisan migrate
$ docker exec dockerphpcliexample_app_1 vendor/bin/phpunit
PHPUnit 5.7.19 by Sebastian Bergmann and contributors.

.                                                                   1 / 1 (100%)

Time: 1.04 seconds, Memory: 12.00MB

OK (1 test, 1 assertion)

Now we have a working PHP command-line application with tests, but as you can see, these Docker commands are pretty long and annoying to type out over and over again. I like to wrap them in NPM scripts (see the example project repository on GitHub), but this is totally optional. There’s a valid argument against abstracting away Docker commands, as it can prevent new developers from knowing what’s going on behind the scenes. But once you do know how each command works, there’s no harm in making your life easier.

Be sure to read Part 2 of this tutorial to learn how to take advantage of Codeship Pro (freshly updated with a new UI) for continuous integration and deployment. If you have any problems running this tutorial, feel free to add an issue to the GitHub repository I set up or find me on Twitter.

Reference: Building a PHP Command Line App with Docker from our WCG partner Karl Hughes at the Codeship Blog.

Karl Hughes

Karl Hughes is a startup fanatic, engineering team lead, and CTO at The Graide Network.