GushPHP.org Site Now Bears Documentation: Contribute by Linking Other Commands!

Tonight I spent some time on the https://github.com/gushphp/gush-site repository, which holds the site source code served at http://gushphp.org.

The site is built on Sculpin, and for a long time I struggled with getting the command documentation onto that page in a more automated way.

So I connected the documentation link on the top navigation bar to a documentation page.


The documentation page currently lists a couple of linked commands. When you click on one of them, you land on that command's page. These pages still need some contributions to beautify them, but the essence is there:

http://gushphp.org/commands/branch_changelog/


The generation takes advantage of a script located in the gushphp/gush repo, generate_docu. This script writes .md files formatted so that they are ready to be placed under the source/commands folder of the gushphp/gush-site repository.

Soon we will connect all the commands. Right now we need help making the pages prettier and completing the links. Will you help us?

Thanks!

Doctrine Migrations with the Schema API without Symfony: a Symfony CMF SeoBundle and Sylius Example

Usually when we have a Symfony project we rely on the Doctrine Migrations bundle. The problem with this approach is that it is often too magic and invites misunderstandings about how to do proper migrations. Some time ago @beberlei rewrote the doctrine/migrations package. Even though the rewrite is still a pending PR to this day, I believe several other frameworks and communities can benefit from using the component standalone with Doctrine DBAL or otherwise.

Here I will show how I used this component to do my migrations using the Schema API from Doctrine. Let’s start!

First let’s require the puppy:

    // inside your composer.json, add this and then run: composer update doctrine/migrations
    "doctrine/migrations": "dev-Rewrite@dev"

Let’s dive into the command that will run things for us. Yes, it is a Symfony command, and if you don’t want it to be container aware you can easily inject the migrations service instead. However, for those locked up in the den, here is a coupled version:

<?php
 
namespace Vendor\DieBundle\Command;
 
use Doctrine\Migrations\Migrations;
use Symfony\Bundle\FrameworkBundle\Command\ContainerAwareCommand;
use Symfony\Component\Console\Input\InputArgument;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
 
class MigrationCommand extends ContainerAwareCommand
{
    const COMMAND_SUCCESS = 0;
    const COMMAND_FAILURE = 1;
 
    /** @var  Migrations */
    protected $migrations;
 
    protected function configure()
    {
        $this
            ->setName('doctrine:migrations')
            ->setDescription('migrates database')
            ->addArgument('step', InputArgument::REQUIRED, 'step')
            ->setHelp(<<<EOT
This command has a mandatory argument that needs to be either: repair, boot, migrate or info.
 
EOT
            )
        ;
    }
 
    protected function execute(InputInterface $input, OutputInterface $output)
    {
        $this->migrations = $this->getContainer()->get('vendor_migrations');
 
        switch ($step = $input->getArgument('step')) {
            case 'repair':
                $this->migrations->repair();
                break;
            case 'boot':
                try {
                    $this->migrations->initializeMetadata();
                } catch (\Exception $e) {
                    $output->writeln('Already booted/initialized.');
                }
                break;
            case 'migrate':
                $this->migrations->migrate();
                break;
            case 'info':
            default:
                $status = $this->migrations->getInfo();
                $output->writeln($status->isInitialized() ? 'is initialized' : 'is not initialized');
                $output->writeln('Executed migrations:');
                foreach ($status->getExecutedMigrations() as $migration) {
                    /** @var \Doctrine\Migrations\MigrationInfo $migration */
                    $output->writeln(sprintf('  - %s', $migration->getVersion()));
                }
                $output->writeln('Outstanding migrations:');
                foreach ($status->getOutstandingMigrations() as $migration) {
                    $output->writeln(sprintf('  - %s', $migration->getVersion()));
                }
                break;
        }
 
        $output->writeln(sprintf('Executed step %s', $step));
 
        return self::COMMAND_SUCCESS;
    }
}

This command uses certain services coming from the package that we have not seen yet. It allows us to run the migrate command. The initialization step just creates a table for migrations tracking. The info step nicely outputs a list of migrations from this table, showing executed migration versions as well as outstanding ones.

Let’s take a deeper look at the extension that sets up these services for us (this is, of course, code I added to accomplish the setup):

    // inside your extension class
class VendorDieExtension extends Extension
{
    /**
     * {@inheritDoc}
     */
    public function load(array $configs, ContainerBuilder $container)
    {
        // ...
        $this->defineMigrationsServiceAndConfiguration($container);
    }
 
    private function defineMigrationsServiceAndConfiguration(ContainerBuilder $container)
    {
        $migrationsArray = array(
            'db' => array(
                'driver' => $container->getParameter('database_driver'),
                'host' => $container->getParameter('database_host'),
                'port' => $container->getParameter('database_port'),
                'dbname' => $container->getParameter('database_name'),
                'user' => $container->getParameter('database_user'),
                'password' => $container->getParameter('database_password')
            ),
            'migrations' => array(
                'script_directory' => $container->getParameter('kernel.root_dir').'/Migrations/',
                'allow_init_on_migrate' => true,
                'validate_on_migrate' => true,
                'allow_out_of_order_migrations' => false
            )
        );
 
        $container->setParameter('vendor_migrations_array', $migrationsArray);
 
        $factory = new Definition('Doctrine\Migrations\DBAL\Factory');
 
        $container->setDefinition('vendor_migrations_factory', $factory);
 
        $migrations = new Definition('Doctrine\Migrations\Migrations');
        $migrations
            ->setFactoryService('vendor_migrations_factory')
            ->setFactoryMethod('createFromArray')
            ->addArgument($migrationsArray)
        ;
 
        $container->setDefinition('vendor_migrations', $migrations);
    }
}

With this we have created the vendor_migrations service, which we fetch inside our command. Everything is set, and we have specified where our migrations will live in the project. So let’s take a look at a sample migration:

<?php
 
use Doctrine\DBAL\Schema\Column;
use Doctrine\DBAL\Schema\ForeignKeyConstraint;
use Doctrine\DBAL\Schema\Table;
use Doctrine\DBAL\Schema\TableDiff;
use Doctrine\DBAL\Types\Type;
use Doctrine\Migrations\DBAL\DBALMigration;
use Doctrine\DBAL\Connection;
use Doctrine\DBAL\Schema\Index;
 
class V0002_from_prod_23092014_to_seo implements DBALMigration
{
    public function migrate(Connection $connection)
    {
        $schemaManager = $connection->getSchemaManager();
 
        $seoTable = new Table('cmf_seo_metadata');
        $seoTable->addColumn('id', 'integer', array('length' => 10000, 'notnull' => true, 'autoincrement' => true));
        $seoTable->addIndex(array('id'), 'id');
        $seoTable->addColumn('title', 'string', array('length' => 255, 'notnull' => false));
        $seoTable->addColumn('metaDescription', 'string', array('length' => 255, 'notnull' => false));
        $seoTable->addColumn('metaKeywords', 'string', array('length' => 255, 'notnull' => false));
        $seoTable->addColumn('originalUrl', 'string', array('length' => 255, 'notnull' => false));
        $seoTable->addColumn('extraNames', 'string', array('length' => 1000, 'notnull' => false, 'comment' => '(DC2Type:array)'));
        $seoTable->addColumn('extraProperties', 'string', array('length' => 1000, 'notnull' => false, 'comment' => '(DC2Type:array)'));
        $seoTable->addColumn('extraHttp', 'string', array('length' => 1000, 'notnull' => false, 'comment' => '(DC2Type:array)'));
        $seoTable->setPrimaryKey(array('id'));
 
        $schemaManager->createTable($seoTable);
 
        $tableDiff = new TableDiff('vendor_media_items', array(new Column('seo_metadata', Type::getType('integer'), array('notnull' => false))));
        $schemaManager->alterTable($tableDiff);
 
        $keyConstraint = new ForeignKeyConstraint(array('seo_metadata'), 'cmf_seo_metadata', array('id'), 'FK_ADE411B7AEB39536');
        $schemaManager->createConstraint($keyConstraint, 'vendor_media_items');
 
        $schemaManager->createIndex(new Index('UNIQ_ADE411B7AEB39536', array('seo_metadata'), true), 'vendor_media_items');
 
        $tableDiff = new TableDiff('sylius_product', array(new Column('seo_metadata', Type::getType('integer'), array('notnull' => false))));
        $schemaManager->alterTable($tableDiff);
 
        $keyConstraint = new ForeignKeyConstraint(array('seo_metadata'), 'cmf_seo_metadata', array('id'), 'FK_677B9B74AEB39536');
        $schemaManager->createConstraint($keyConstraint, 'sylius_product');
 
        $schemaManager->createIndex(new Index('UNIQ_677B9B74AEB39536', array('seo_metadata'), true), 'sylius_product');
 
        $tableDiff = new TableDiff('sylius_taxon', array(
                new Column('seo_metadata', Type::getType('integer'), array('notnull' => false)),
                new Column('meta_title', Type::getType('string'), array('notnull' => false)),
                new Column('meta_description', Type::getType('string'), array('notnull' => false)),
            )
        );
        $schemaManager->alterTable($tableDiff);
 
        $keyConstraint = new ForeignKeyConstraint(array('seo_metadata'), 'cmf_seo_metadata', array('id'), 'FK_CFD811CAAEB39536');
        $schemaManager->createConstraint($keyConstraint, 'sylius_taxon');
 
        $schemaManager->createIndex(new Index('UNIQ_CFD811CAAEB39536', array('seo_metadata'), true), 'sylius_taxon');
 
        // forgotten change from before
        $tableDiff = new TableDiff('sylius_variant', array(), array(), array(new Column('revision', Type::getType('integer'))));
        $schemaManager->alterTable($tableDiff);
    }
}

The migration is from a project using Sylius and the Symfony CMF SeoBundle; that is why the names look familiar :). It is a real working example!

Don’t be scared: the code above is a translation of the SQL output obtained from app/console doctrine:schema:update --dump-sql, digested through the Doctrine Schema API. I actually find it makes more sense to write migrations like this as we develop or add model persistence to a project, verifying them against the dumper. I hardly found any good documentation on the API, but after some troubleshooting I finally feel a lot more comfortable with it, and with the new version of the doctrine/migrations library.
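The migration class above follows a version-prefixed naming convention (V0002_...). Here is a minimal sketch of how such prefixes give a natural execution order, assuming (and this is my assumption, not the library's actual code) that scripts are ordered by the numeric prefix:

```php
<?php

// Illustrative sketch: order migration class names by their numeric
// version prefix (V0001_, V0002_, ...). Not the library's own code.
function sortMigrationsByVersion(array $classNames)
{
    usort($classNames, function ($a, $b) {
        preg_match('/^V(\d+)_/', $a, $ma);
        preg_match('/^V(\d+)_/', $b, $mb);

        return (int) $ma[1] - (int) $mb[1];
    });

    return $classNames;
}

$ordered = sortMigrationsByVersion(array(
    'V0002_from_prod_23092014_to_seo',
    'V0001_initial_schema', // hypothetical earlier migration
));

// V0001_initial_schema comes first
```

Keeping the date of the production snapshot in the class name, as above, also documents which state of the schema the migration departs from.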

The migration runs like a charm, and having tested it in development we can be fairly confident it will work well in production. There is a suggestion to use tools external to PHP, however that is a no-go, since projects like WordPress and others would want their plugins to run these migrations themselves. So this is a good option!

All encouragement is welcome, and please retweet to support my writing.

Thanks!

Testing with the StreamWrapper File System Libraries

I was working on Bldr, the nice task-runner tool that automates development, and some days ago I ran into a problem. Usually when you work with bldr-io, it has you create several files in the project root directory. We have the bldr.json to specify the external dependencies.

For instance in my case I want to use gush-bldr-block so I have:

{
    "require": {
        "bldr-io/gush-block": "dev-master@dev"
    },
    "config": {
        "vendor-dir": "app/build/vendor",
        "block-loader": "app/build/blocks.yml"
    }
}

Before this change, I couldn’t point block-loader to a custom location successfully: Bldr would not find the blocks it had downloaded under the app/build/vendor folder. Feeling confused? Let me rewind. Bldr has Composer built in. This means you run bldr install to read the bldr.json and install additional dependencies alongside the native composer.json project dependencies. Bldr uses your composer.json, plus the bldr.json, to create its autoload. The idea is that its autoload will load blocks, third-party packages that will be used to build your project.

Why would you need to add more dependencies to your project? Because packages that build projects can be reused: common maintenance tasks and the like can be packaged this way. If every such package were bundled with Bldr itself, including ones you don't like, the tool would bloat unnecessarily. By splitting them out, Bldr allows customization of the same tool. Other tools choose extensions, but then you have to plug their dependencies into your composer.json. Bldr's embedded Composer keeps these dependencies separate, yet ensures they remain compatible with your project's other package requirements.

imports:
    - { resource: app/build/profiles.yml }
    - { resource: app/build/tasks.yml }

bldr:
    name: vendor/project
    description: some description

Bldr needs files like bldr.yml and a .bldr folder containing tasks.yml and profiles.yml, if you import those from bldr.yml as above. In addition it generates a bldr.lock because of the bldr.json. That is a lot of files. Since they are all configuration, I think it is better to store them under app/build, say on a project with an app folder that mostly stores configuration.

So instead of having all these files spread out across the root of the project, we move them there and it looks very clean.

You can keep creating your custom blocks and rerun bldr install.

So we saw the need for a PR. Now let’s look at how I wrote the PR and the test for it. The main change was needed because Bldr had hardcoded the path for finding the third-party blocks:

     public function getThirdPartyBlocks()
     {
         /** @var Application $application */
         $application = $this->get('application');
-        $blockFile   = $application->getEmbeddedComposer()->getExternalRootDirectory().'/.bldr/blocks.yml';
+        $embeddedComposer = $application->getEmbeddedComposer();
+        $config = $embeddedComposer->getExternalComposerConfig();
+        $loadBlock = $config->has('block-loader') ? $config->get('block-loader') : '.bldr/blocks.yml';
+        $blockFile = $embeddedComposer->getExternalRootDirectory().DIRECTORY_SEPARATOR.$loadBlock;

Instead of hardcoding it to /.bldr/blocks.yml, we now ask the embedded Composer instance, which has already read the bldr.json, to give us that information under the block-loader key. The path is in turn constructed from the root directory, also provided by the embedded Composer.
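The fallback logic in the patch can be illustrated in isolation. Here is a sketch with a plain array standing in for Composer's Config object (an assumption made for brevity; the real code calls has()/get() on that object):

```php
<?php

// Sketch of the patched lookup: prefer the configured 'block-loader'
// path from bldr.json, else fall back to the old hard-coded default.
function resolveBlockFile(array $config, $rootDir)
{
    $loadBlock = isset($config['block-loader'])
        ? $config['block-loader']
        : '.bldr/blocks.yml';

    return $rootDir.DIRECTORY_SEPARATOR.$loadBlock;
}

// With the bldr.json shown earlier:
$custom = resolveBlockFile(array('block-loader' => 'app/build/blocks.yml'), '/project');

// Without the key, the legacy location wins:
$legacy = resolveBlockFile(array(), '/project');
```

The important design point is that the old path stays as the default, so existing projects keep working without any bldr.json change.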

Because the ContainerBuilder class we were testing checks for the existence of the blocks.yml file under certain folders, we had to come up with a cleaner way to test for file existence without polluting real folders with fixtures.

         "phpunit/phpunit":           "~4.2.0",
+        "mikey179/vfsStream":        "~1.4.0",

Welcome to the vfsStream package! This little thing is very good for testing using the stream wrapper feature of PHP. Let’s see the final test:

<?php
 
/**
 * This file is part of Bldr.io
 *
 * (c) Aaron Scherer <aequasi@gmail.com>
 *
 * This source file is subject to the MIT license that is bundled
 * with this source code in the file LICENSE
 */
 
namespace Bldr\Test\DependencyInjection;
 
use Bldr\DependencyInjection\ContainerBuilder;
use org\bovigo\vfs\vfsStream;
 
/**
 * @author Luis Cordova <cordoval@gmail.com>
 */
class ContainerBuilderTest extends \PHPUnit_Framework_TestCase
{
    /**
     * @var ContainerBuilder
     */
    protected $containerBuilder;
    protected $application;
    protected $input;
    protected $output;
    protected $root;
 
    public function setUp()
    {
        $this->application = $this->getMockBuilder('Bldr\Application') ...
        // ...
 
        $this->root = vfsStream::setup();
    }
 
    public function testGetThirdPartyBlocks()
    {
        $embeddedComposer = $this->getMockBuilder('Dflydev\EmbeddedComposer\Core\EmbeddedComposer')
        // ...
 
        $config = $this->getMockBuilder('Composer\Config')
        // ...
 
        $config->expects($this->once())
            ->method('has')
            ->with('block-loader')
            ->willReturn(true)
        ;
        $config->expects($this->once())
            ->method('get')
            ->with('block-loader')
            ->willReturn('build/blocks.yml')
        ;
        $embeddedComposer
            ->expects($this->once())
            ->method('getExternalComposerConfig')
            ->willReturn($config)
        ;
        $embeddedComposer
            ->expects($this->once())
            ->method('getExternalRootDirectory')
            ->willReturn(vfsStream::url('root'))
        ;
        $this->application
            ->expects($this->once())
            ->method('getEmbeddedComposer')
            ->willReturn($embeddedComposer)
        ;
 
        $bldrFolder = vfsStream::newDirectory('build')->at($this->root);
        vfsStream::newFile('blocks.yml')
            ->withContent('[ \stdClass, \stdClass ]')
            ->at($bldrFolder)
        ;
 
        $this->containerBuilder = new ContainerBuilder(
            $this->application,
            $this->input,
            $this->output
        );
 
        $this->assertCount(2, $this->containerBuilder->getThirdPartyBlocks());
    }
}

I have tried to shorten the mocking part, but you can see clearly that to match a file_exists($file) call in the code, we have to create in the vfs a root project directory and then the folders, using the vfsStream:: static API. It is pretty simple. One thing I struggled with was that for the root project folder you always have to obtain the path through the wrapper, i.e. vfsStream::url('root').

Now we see that no real file is created at all. Everything is done via the little vfsStream library, which plays well with PHPUnit.
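vfsStream is built on PHP's native stream-wrapper API, the same mechanism you could use directly. A much-simplified sketch of a custom in-memory wrapper (illustrative only; vfsStream does far more, and the 'mem' scheme here is my own invention):

```php
<?php

// A tiny in-memory stream wrapper: just enough to make file_exists()
// and file_get_contents() work against a fake 'mem://' scheme.
class InMemoryWrapper
{
    public $context;                 // populated by PHP automatically
    private static $files = array(); // path => contents
    private $buffer = '';
    private $pos = 0;

    public static function put($path, $contents)
    {
        self::$files[$path] = $contents;
    }

    public function stream_open($path, $mode, $options, &$openedPath)
    {
        if (!isset(self::$files[$path])) {
            return false;
        }
        $this->buffer = self::$files[$path];
        $this->pos = 0;

        return true;
    }

    public function stream_read($count)
    {
        $chunk = substr($this->buffer, $this->pos, $count);
        $this->pos += strlen($chunk);

        return $chunk;
    }

    public function stream_eof()
    {
        return $this->pos >= strlen($this->buffer);
    }

    public function stream_stat()
    {
        return $this->statArray(strlen($this->buffer));
    }

    // url_stat() is the hook that file_exists() consults
    public function url_stat($path, $flags)
    {
        if (!isset(self::$files[$path])) {
            return false;
        }

        return $this->statArray(strlen(self::$files[$path]));
    }

    private function statArray($size)
    {
        $stat = array_fill(0, 13, 0);
        $stat[7] = $stat['size'] = $size;

        return $stat;
    }
}

stream_wrapper_register('mem', 'InMemoryWrapper');
InMemoryWrapper::put('mem://build/blocks.yml', "[ block_a, block_b ]\n");

var_dump(file_exists('mem://build/blocks.yml')); // bool(true)
var_dump(file_exists('mem://missing.yml'));      // bool(false)
```

This is exactly why the test above works: the code under test calls file_exists() and friends as usual, and PHP routes those calls through the wrapper instead of the real file system.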

Hope it helps! Please retweet, and if you want to see the code, the link to the PR is at the top of this blog post. Thanks for reading!

Arrivederci Ciao Chau Symfony … Welcome Standalone Dumper & C Extension

You start by requiring the dependency on the component like this:

"patchwork/dumper": "~1.2@dev",

The component also does not get tagged very often, so you will need master to get the Twig function explained below. You can also install it alongside ladybug to compare :).

After adding the above:

"raulfraile/ladybug": "~1.0.8",
"patchwork/dumper": "~1.2@dev",

You can require the bundle from ladybug too if you want its Twig function. You can also get the dumper's Twig function by plugging in the bundle that already comes with patchwork/dumper:

if ('dev' === $this->getEnvironment()) {
    // ...
    $bundles[] = new Symfony\Bundle\DebugBundle\DebugBundle();
    $bundles[] = new Symfony\Bundle\WebProfilerBundle\WebProfilerBundle();
    // ...
}

Of course, as usual I do not recommend loading bundles; what I do instead is plug the Twig extension straight into the kernel where possible. Now it is time to focus on the component. Once the DebugBundle is plugged in, you can invoke it from your view too.

The dump function dumps the variable var into the same view:

{{ dump(var) }}

Whereas the dump tag only presents a miniature screenshot in the Symfony WDT (Web Debug Toolbar) and waits for you to click there, which takes you to the expandable white sidebar view:

{% dump var %}

I much prefer dumping directly into the view, since it wastes less time. It also works the same as in the CLI, so there is nothing new to learn. I tested it side by side with ladybug, and I think ladybug still shows some information that can be expanded further, and it is what I am more accustomed to. However, ladybug often breaks, whereas this component is very solid, integrates nicely with Symfony, and has a dark background which I like very much.

Some examples borrowed from the Symfony PR follow. Notice the beautiful dumping of the container:

Source: patchwork dumper component on github.

Now a side-by-side comparison of dump vs var_dump:

Source: patchwork dumper component on GitHub and my desktop screenshot.

Ladybug's colors fall a bit short, but it is still a good option. However, the reach of the dumper component goes way beyond, since it comes with a C extension that can be compiled:

phpize
./configure
make
sudo make install

However, honestly, until this is as easy to install as:

brew install php56-symfony_debug

I will not be able to use it, nor do I want to, since I would rather fall back to the PHP implementation.

The ladybug view is expandable in a nice way:

Source: Github LadybugBundle.

Notice that ladybug also has a bundle, but I would rather not use it, though sometimes it comes in handy for a dump in the view. The thing is, I think many other packages would like to use this component totally independently from Symfony; however, depending on it currently pulls in several other dependencies right away:

"require": {
    "php": ">=5.3.3",
    "symfony/config": "~2.4",
    "symfony/dependency-injection": "~2.2",
    "symfony/event-dispatcher": "~2.3",
    "symfony/http-kernel": "~2.3"
},
"require-dev": {
    "twig/twig": "~1.12"
},

And really a bunch more, since the dependency-injection component and the others will pull in a load of their own.

What I will use in my projects is the component itself, trying to untie as much as possible and wiring things up with, say, Aura DI, or at least trying to. The component in itself should just be invokable via:

use Symfony\Component\VarDumper\VarDumper as Dumper;
 
Dumper::dump($var);

And here is the implementation detail:

    public static function dump($var)
    {
        if (null === self::$handler) {
            $cloner = extension_loaded('symfony_debug') ? new ExtCloner() : new PhpCloner();
            $dumper = 'cli' === PHP_SAPI ? new CliDumper() : new HtmlDumper();
            self::$handler = function ($var) use ($cloner, $dumper) {
                $dumper->dump($cloner->cloneVar($var));
            };
        }
 
        return call_user_func(self::$handler, $var);
    }
 
    public static function setHandler($callable)
    {
        if (null !== $callable && !is_callable($callable, true)) {
            throw new \InvalidArgumentException('Invalid PHP callback.');
        }
 
        $prevHandler = self::$handler;
        self::$handler = $callable;
 
        return $prevHandler;
    }

Let the reader understand a bit: basically it uses a cloner within the handler, which finally gets called to dump the variable $var. It switches between the HTML and CLI dumpers accordingly, and depending on whether the C extension is loaded it chooses the corresponding cloner implementation. The high-level handler callable then uses those implementations to do the final job.
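The lazy-handler pattern itself is worth isolating. Here is a simplified sketch (my own reduction, not the component's actual code) showing why setHandler() hands back the previous handler:

```php
<?php

// Simplified sketch of the static lazy-handler pattern used above.
class MiniDumper
{
    private static $handler;

    public static function dump($var)
    {
        if (null === self::$handler) {
            // Default picked lazily on first use, like the cloner/dumper pair
            self::$handler = function ($var) {
                return var_export($var, true);
            };
        }

        return call_user_func(self::$handler, $var);
    }

    public static function setHandler($callable)
    {
        if (null !== $callable && !is_callable($callable, true)) {
            throw new \InvalidArgumentException('Invalid PHP callback.');
        }

        $prevHandler = self::$handler;
        self::$handler = $callable;

        // Returning the previous handler lets callers restore it later
        return $prevHandler;
    }
}

$default  = MiniDumper::dump(array('a' => 1)); // lazy default kicks in
$previous = MiniDumper::setHandler(function ($var) {
    return json_encode($var);
});
$swapped  = MiniDumper::dump(array('a' => 1)); // now JSON output
MiniDumper::setHandler($previous);             // restore the old handler
```

This is what makes the component friendly to test suites and frameworks alike: anyone can temporarily swap the output strategy and put the original back afterwards.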

That is it! So even before 2.6 you can take advantage of this nice component and plug it into, say, your Aura project or any other. It will rock, and you will not have to wait for the 2.6 release nor for the big merge of the PR that folds it into the bulk of symfony/symfony.

If someone packages the C extension for brew, please let me know!

Look Mama: No Console Coupling! Bye Symfony, Welcome Aura Cli!

First, don’t lose sight of the dependency injection part of any project! It is the tip of the iceberg. If you have buried it under the gravel of a foreign abstraction or a framework-imposed layer, then beware: you are losing a bit of control, and that may be dangerous for decoupling. The best situation is when the DI and the other components are truly decoupled. Your services are meant to be decoupled; that is what makes DI great! If that is not the case, then blame your framework in the full sense of the word.

Let me give you a clear example: how we can build an easily decoupled console app with only three dependencies that are themselves completely decoupled.

Suppose that in our composer.json we require a DI container, a CLI component that can extract context values from the prompt, and a dispatcher so we can decouple this first extraction step from the actual implementation of the services, which will be our commands. Welcome to the AuraPHP 2.0 components!

    "require": {
        "aura/di": "2.*@dev",
        "aura/cli": "2.*@dev",
        "aura/dispatcher": "2.*@dev"
    },

Let’s now start sketching our console.php script:

<?php
 
use Aura\Di\ContainerBuilder;
use Cordoval\Console\Config;
use Cordoval\Console\Runner;
 
require __DIR__.'/vendor/autoload.php';
 
exit((new ContainerBuilder())
    ->newInstance([], [Config::class])
    ->newInstance(Runner::class)
    ->run()
);

Yes, it is very simple and looks beautiful. Let’s explain what is happening under the hood. Remember I said to keep the DI layer on top by all means? Well, that is the container!

To build the services inside our container we pass a Config class, where we define our services using the DI component:

<?php
 
namespace Cordoval\Console;
 
use Aura\Cli\_Config\Common;
use Aura\Cli\Context;
use Aura\Cli\Stdio;
use Aura\Di\Container;
use Aura\Dispatcher\Dispatcher;
use Cordoval\Console\Task\Help;
 
class Config extends Common
{
    public function define(Container $di)
    {
        parent::define($di);
 
        $di->set('cordoval:stdio', $di->lazyNew(Stdio::class));
        $di->set('cordoval:context', $di->lazyNew(Context::class));
        $di->set(
            'cordoval:dispatcher',
            $di->lazyNew(Dispatcher::class, ['object_param' => 'command'])
        );
 
        $di->params[Runner::class] = [
            'dispatcher' => $di->lazyGet('cordoval:dispatcher'),
            'context' => $di->lazyGet('cordoval:context'),
        ];
 
        $di->params[Help::class] = [
            'context' => $di->lazyGet('cordoval:context'),
            'stdio' => $di->lazyGet('cordoval:stdio'),
        ];
    }
 
    public function modify(Container $di)
    {
        $di->get('cordoval:dispatcher')
            ->setObject('help', $di->lazyNew(Help::class))
        ;
    }
}

Notice the beauty of the service definition and lazy loading. The define() method is called first, before the container is frozen; modify() is then all about modifying services. The base class we extend defines the basic building blocks from the global variables. This is the context information, and it is important for CLI operation.

<?php
namespace Aura\Cli\_Config;
 
use Aura\Di\Config;
use Aura\Di\Container;
 
class Common extends Config
{
    public function define(Container $di)
    {
        /**
         * Aura\Cli\Context\Argv
         */
        $di->params['Aura\Cli\Context\Argv'] = array(
            'values' => (isset($_SERVER['argv']) ? $_SERVER['argv'] : array()),
        );
 
        /**
         * Aura\Cli\Context\Env
         */
        $di->params['Aura\Cli\Context\Env'] = array(
            'values' => $_ENV,
        );
 
        /**
         * Aura\Cli\Context\Server
         */
        $di->params['Aura\Cli\Context\Server'] = array(
            'values' => $_SERVER,
        );
 
        /**
         * Aura\Cli\Stdio
         */
        $di->params['Aura\Cli\Stdio'] = array(
            'stdin' => $di->lazyNew('Aura\Cli\Stdio\Handle', array(
                'name' => 'php://stdin',
                'mode' => 'r',
            )),
            'stdout' => $di->lazyNew('Aura\Cli\Stdio\Handle', array(
                'name' => 'php://stdout',
                'mode' => 'w+',
            )),
            'stderr' => $di->lazyNew('Aura\Cli\Stdio\Handle', array(
                'name' => 'php://stderr',
                'mode' => 'w+',
            )),
            'formatter' => $di->lazyNew('Aura\Cli\Stdio\Formatter'),
        );
    }
}

Now let’s rewind back to our script from the beginning:

exit((new ContainerBuilder())
    ->newInstance([], [Config::class])
    ->newInstance(Runner::class)
    ->run()
);

We already know that we created a container and plugged services into it. The services are all the ones we have seen above: things useful for turning the prompt context into parameters and the name of the command to call, and things useful for dispatching to that command, namely the dispatcher. And of course the command itself, to which we now turn. The DI container instantiates the Runner class, our service defined in the Cordoval\Console namespace. In essence it is a service that takes in the Context service and the Dispatcher, uses them to gather the parameters and the command name, and dispatches to the registered callable, here the help command. We execute the run() method on the Runner, as you can see:

<?php
 
namespace Cordoval\Console;
 
use Aura\Dispatcher\Dispatcher;
use Aura\Cli\Context;
 
/**
 * It loads cli context from prompt and dispatches to registered callable commands
 */
final class Runner
{
    private $dispatcher;
    private $context;
 
    public function __construct(Dispatcher $dispatcher, Context $context)
    {
        $this->dispatcher = $dispatcher;
        $this->context = $context;
    }
 
    public function run()
    {
        list($params, $command) = $this->loadContext();
 
        return (int) $this->dispatcher->__invoke($params, $command);
    }
 
    public function loadContext()
    {
        $params = $this->context->argv->get();
        array_shift($params);
        $command = array_shift($params);
 
        return [$params, $command];
    }
}

So this becomes very simple to read. Basically we load the context parameters and decipher the command name invoked; that is, php console.php help will invoke our help command.
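The context extraction boils down to plain argv slicing. Here it is as a standalone sketch with a hand-rolled argv array in place of the Context service:

```php
<?php

// What Runner::loadContext() does, reduced to plain array operations.
$argvValues = array('console.php', 'help', '--verbose');

array_shift($argvValues);            // drop the script name
$command = array_shift($argvValues); // first real token names the command
$params  = $argvValues;              // the rest is handed to the dispatcher
```

Running php console.php help --verbose would therefore dispatch to the object registered under 'help', with the remaining tokens passed along as parameters.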

        $di->params[Help::class] = [
            'context' => $di->lazyGet('cordoval:context'),
            'stdio' => $di->lazyGet('cordoval:stdio'),
        ];

Notice that we define our class's parameters in the DI container and lazy-load its two dependencies to be injected. That is the service, period. The DI container is amazingly versatile in doing the heavy lifting for us. The second part, in the modify() method, binds the callable Help::class so it can be invoked by the name 'help':

        $di->get('cordoval:dispatcher')
            ->setObject('help', $di->lazyNew(Help::class))
        ;

It is wonderful now :) Our command will be invoked whenever there is a dispatch on that name for that callable. Just picture command handlers here and you will start to grin.

Here is our command, Help::class:

<?php
 
namespace Cordoval\Console\Task;
 
use Aura\Cli\Stdio;
use Aura\Cli\Context;
use Aura\Cli\Status;
 
class Help
{
    private $context;
    private $stdio;
 
    public function __construct(Context $context, Stdio $stdio)
    {
        $this->context = $context;
        $this->stdio = $stdio;
    }
 
    public function __invoke()
    {
        $this->stdio->outln('all good boss');
 
        return Status::USAGE;
    }
}

Dead simple; let it just be a callable. It is like an interactor, ready to be tested like one, completely decoupled. The important thing to see here is that you are not relying on magical links between the console part (aka the Runner class) and the command. The input/output concerns are encapsulated at the level they should be. There is no messy confusion; everything is crystal clear for you to adapt or modify. In fact, I did just that, because AuraPHP already provides examples in Aura.Cli_Kernel and Aura.Cli_Project which tie in some dependencies, including a logger and other minor things. But in reality those are just educational, to help you bootstrap a project quicker and better understand the internals of AuraPHP 2.0.

However, once you understand the essence of this, there is no need for them. You can do away with the complexity and keep it simple. You can decouple your commands from any framework; the DI from AuraPHP is just a sauce on top of the real domain of your commands. Your code is free from strange interfaces, beyond basic final-like classes such as Context and Stdio and the ones you decide to keep injecting.

As a matter of fact, we are trying to rewrite GushPHP with libraries like this, because it is much healthier for long-term maintenance and inner quality. Symfony has its place, but when it comes to decoupling, AuraPHP 2.0 is superior. We could still use Symfony components, though I see no real need for now. Furthermore, you can keep using tools like PhpSpec to concentrate on your classes, the collaborators of your commands, rather than on a TestCommand class or what not, or on functional tests, which don't make much sense when your commands can benefit far more from a decoupled, components-first approach.

Enjoy and retweet please! Thanks!