Creating A Public/Private Multi-Monorepo For PHP Projects — Smashing Magazine
Quick summary ↬

Let’s see how to use a “multi-monorepo” approach to make the development experience faster, yet keep your PHP packages private. This solution can be especially helpful for PRO plugin creators.

To make the development experience faster, I moved all the PHP packages required by my projects to a monorepo. When each package is hosted on its own repo (the “multirepo” approach), it needs to be developed and tested on its own, and then published to Packagist before I can install it in other packages via Composer. With the monorepo, because all packages are hosted together, they can be developed, tested, versioned and released at the same time.

The monorepo hosting my PHP packages is public, accessible to anyone on GitHub. Git repos cannot grant different access to different assets; it’s all either public or private. As I plan to release a PRO WordPress plugin, I need its packages to be kept private, meaning they can’t be added to the public monorepo.

The solution I found is to use a “multi-monorepo” approach, comprising two monorepos: one public and one private, with the private monorepo embedding the public one as a Git submodule, allowing it to access its files. The public monorepo can be considered the “upstream”, and the private monorepo the “downstream”.

Architecture of a multi-monorepo. (Large preview)

As I kept iterating on my code, the repo set-up I needed to use at each stage of my project also needed to be upgraded. Hence, I didn’t arrive at the multi-monorepo approach on day 1; it was a process that spanned several years and took its fair amount of effort, going from a single repo, to multiple repos, to the monorepo, to finally the multi-monorepo.

In this article I’ll describe how I set up my multi-monorepo using the Monorepo builder, which works for PHP projects based on Composer.


Reusing Code In The Multi-Monorepo

The public monorepo leoloso/PoP is where I keep all my PHP projects.

This monorepo contains the workflow generate_plugins.yml, which generates multiple WordPress plugins for distribution when creating a new release on GitHub:

Generating plugins when creating a release. (Large preview)

The workflow configuration is not hard-coded within the YAML but injected via PHP code:

  - id: output_data
    run: |
      echo "::set-output name=plugin_config_entries::$(vendor/bin/monorepo-builder plugin-config-entries-json)"

And the configuration is provided via a custom PHP class:

class PluginDataSource
{
  public function getPluginConfigEntries(): array
  {
    return [
      // GraphQL API for WordPress
      [
        'path' => 'layers/GraphQLAPIForWP/plugins/graphql-api-for-wp',
        'zip_file' => 'graphql-api.zip',
        'main_file' => 'graphql-api.php',
        'dist_repo_organization' => 'GraphQLAPI',
        'dist_repo_name' => 'graphql-api-for-wp-dist',
      ],
      // GraphQL API - Extension Demo
      [
        'path' => 'layers/GraphQLAPIForWP/plugins/extension-demo',
        'zip_file' => 'graphql-api-extension-demo.zip',
        'main_file' => 'graphql-api-extension-demo.php',
        'dist_repo_organization' => 'GraphQLAPI',
        'dist_repo_name' => 'extension-demo-dist',
      ],
    ];
  }
}

Generating multiple WordPress plugins all together, and configuring the workflow via PHP, has reduced the amount of time needed to manage the project. The workflow currently handles two plugins (the GraphQL API and its extension demo), but it could handle 200 without additional effort on my side.

It is this set-up that I want to reuse for my private monorepo leoloso/GraphQLAPI-PRO, so that the PRO plugins can also be generated without effort.

The code to reuse will comprise:

The private monorepo can then generate the PRO WordPress plugins, simply by triggering the workflows from the public monorepo and overriding their configuration in PHP.

Linking Monorepos Via Git Submodules

To embed the public repo within the private one, we use Git submodules:

git submodule add <public repo URL>

I embedded the public repo under the subfolder submodules of the private monorepo, allowing me to add more upstream monorepos in the future, if needed. In GitHub, the folder displays the submodule’s specific commit, and clicking on it will take me to that commit on leoloso/PoP:

Embedding the public monorepo within the private monorepo. (Large preview)

Since it contains submodules, to clone the private repo we must provide the --recursive option:

git clone --recursive <private repo URL>

Reusing The GitHub Actions Workflows

GitHub Actions only loads workflows from under .github/workflows. Because the public workflows in the downstream monorepo are under submodules/PoP/.github/workflows, they must be duplicated into the expected location.

In order to keep the upstream workflows as the single source of truth, we can limit ourselves to copying the files downstream under .github/workflows, but never editing them there. If there is any change to be done, it must be done in the upstream monorepo and then copied over.

As a side note, notice how this means that the multi-monorepo leaks: the upstream monorepo is not fully autonomous, and it will need to be adapted to suit the downstream monorepo.

In my first iteration to copy the workflows, I created a simple Composer script:


{
  "scripts": {
    "copy-workflows": [
      "php -r \"copy('submodules/PoP/.github/workflows/generate_plugins.yml', '.github/workflows/generate_plugins.yml');\"",
      "php -r \"copy('submodules/PoP/.github/workflows/split_monorepo.yaml', '.github/workflows/split_monorepo.yaml');\""
    ]
  }
}

Then, after editing the workflows in the upstream monorepo, I would copy them downstream by executing:

composer copy-workflows

But then I realized that just copying the workflows is not enough: they must also be modified in the process. This is because checking out the downstream monorepo requires the option --recurse-submodules, so as to also check out the submodules.

In GitHub Actions, the checkout for downstream is done like this:

  - uses: actions/checkout@v2
    with:
      submodules: recursive

So checking out the downstream repo needs the input submodules: recursive, but the upstream one doesn’t, and they both use the same source file.

The solution I found is to provide the value for the input submodules via an environment variable CHECKOUT_SUBMODULES, which is by default empty for the upstream repo:

env:
  CHECKOUT_SUBMODULES: ""

jobs:
  provide_data:
    steps:
      - uses: actions/checkout@v2
        with:
          submodules: ${{ env.CHECKOUT_SUBMODULES }}

Then, when copying the workflows from upstream to downstream, the value of CHECKOUT_SUBMODULES is replaced with "recursive":

env:
  CHECKOUT_SUBMODULES: "recursive"

When modifying the workflow, it’s a good idea to use a regex, so that it works for different formats in the source file (such as CHECKOUT_SUBMODULES: "" or CHECKOUT_SUBMODULES:'' or CHECKOUT_SUBMODULES:), so as not to create bugs from this kind of assumed-to-be-harmless change.
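For instance, here is a quick sketch (in Python, purely for illustration; the pattern below is my own hypothetical one, not the project’s actual regex) of a replacement that tolerates all three of those formats:

```python
import re

# Hypothetical regex tolerating the three formats mentioned above:
# CHECKOUT_SUBMODULES: "", CHECKOUT_SUBMODULES:'' and a bare CHECKOUT_SUBMODULES:
pattern = re.compile(r'''CHECKOUT_SUBMODULES:(\s*(""|''))?''')
replacement = 'CHECKOUT_SUBMODULES: "recursive"'

for source in ('CHECKOUT_SUBMODULES: ""', "CHECKOUT_SUBMODULES:''", 'CHECKOUT_SUBMODULES:'):
    # each variant normalizes to CHECKOUT_SUBMODULES: "recursive"
    print(pattern.sub(replacement, source))
```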

As a result, the copy-workflows Composer script seen above is not good enough to handle this complexity.

In my next iteration, I created a PHP command CopyUpstreamMonorepoFilesCommand, to be executed via the Monorepo builder:

vendor/bin/monorepo-builder copy-upstream-monorepo-files

This command uses a custom service FileCopierSystem to copy all files from a source folder to the indicated destination, while optionally replacing their contents:

namespace PoP\GraphQLAPIPRO\Extensions\Symplify\MonorepoBuilder\SmartFile;

use Nette\Utils\Strings;
use Symplify\SmartFileSystem\Finder\SmartFinder;
use Symplify\SmartFileSystem\SmartFileSystem;

final class FileCopierSystem
{
  public function __construct(
    private SmartFileSystem $smartFileSystem,
    private SmartFinder $smartFinder,
  ) {
  }

  /**
   * @param array $patternReplacements a regex pattern to search, and its replacement
   */
  public function copyFilesFromFolder(
    string $fromFolder,
    string $toFolder,
    array $patternReplacements = []
  ): void {
    $smartFileInfos = $this->smartFinder->find([$fromFolder], '*');

    foreach ($smartFileInfos as $smartFileInfo) {
      $fromFile = $smartFileInfo->getRealPath();
      $fileContent = $this->smartFileSystem->readFile($fromFile);

      foreach ($patternReplacements as $pattern => $replacement) {
        $fileContent = Strings::replace($fileContent, $pattern, $replacement);
      }

      $toFile = $toFolder . substr($fromFile, strlen($fromFolder));
      $this->smartFileSystem->dumpFile($toFile, $fileContent);
    }
  }
}

When invoking this method to copy all workflows downstream, I also replace the value of CHECKOUT_SUBMODULES:

/**
 * Copy all workflows to `.github/`, and convert:
 *   `CHECKOUT_SUBMODULES: ""`
 * into:
 *   `CHECKOUT_SUBMODULES: "recursive"`
 */
$regexReplacements = [
  '#CHECKOUT_SUBMODULES:(\s+".*")?#' => 'CHECKOUT_SUBMODULES: "recursive"',
];
(new FileCopierSystem())->copyFilesFromFolder(
  'submodules/PoP/.github/workflows',
  '.github/workflows',
  $regexReplacements
);

Workflow generate_plugins.yml needs an additional replacement. When the WordPress plugin is generated, its code is downgraded from PHP 8.0 to 7.1 by invoking the script ci/downgrade/downgrade_code.sh:

  - name: Downgrade code for production (to PHP 7.1)
    run: ci/downgrade/downgrade_code.sh "${{ matrix.pluginConfig.rector_downgrade_config }}" "" "${{ matrix.pluginConfig.path }}" "${{ matrix.pluginConfig.additional_rector_configs }}"

In the downstream monorepo, this file will be located under submodules/PoP/ci/downgrade/downgrade_code.sh. Then, we have the downstream workflow point to the right path with this replacement:

$regexReplacements = [
  // ...
  '#(ci/downgrade/downgrade_code\.sh)#' => 'submodules/PoP/$1',
];
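Applied to a simplified workflow line (the line below is illustrative, not the actual workflow content), the capture group keeps the script path intact while prepending the submodule folder:

```python
import re

# Illustrative line only; the real workflow invocation passes more arguments.
line = 'run: ci/downgrade/downgrade_code.sh "$CONFIG"'
fixed = re.sub(r'(ci/downgrade/downgrade_code\.sh)', r'submodules/PoP/\1', line)
print(fixed)  # run: submodules/PoP/ci/downgrade/downgrade_code.sh "$CONFIG"
```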

Configuring Packages In Monorepo Builder

File monorepo-builder.php — located at the root of the monorepo — holds the configuration for the Monorepo builder. In it we must indicate where the packages (and plugins, clients, or anything else) are located:

use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;
use Symplify\MonorepoBuilder\ValueObject\Option;

return static function (ContainerConfigurator $containerConfigurator): void {
  $parameters = $containerConfigurator->parameters();
  $parameters->set(Option::PACKAGE_DIRECTORIES, [
    __DIR__ . '/packages',
    __DIR__ . '/plugins',
  ]);
};

The private monorepo must have access to all code: its own packages, plus those from the public monorepo. Hence, it must define all packages from both monorepos in the config file. The ones from the public monorepo are located under "/submodules/PoP":

return static function (ContainerConfigurator $containerConfigurator): void {
  $parameters = $containerConfigurator->parameters();
  $parameters->set(Option::PACKAGE_DIRECTORIES, [
    // public code
    __DIR__ . '/submodules/PoP/packages',
    __DIR__ . '/submodules/PoP/plugins',
    // private code
    __DIR__ . '/packages',
    __DIR__ . '/plugins',
    __DIR__ . '/clients',
  ]);
};

As can be seen, the configurations for upstream and downstream are pretty much the same, with the difference that the downstream one will:

  • Change the path to the public packages.
  • Add the private packages.

Hence, it makes sense to rewrite the configuration using object-oriented programming, so that we make the code DRY (“don’t repeat yourself”) by having a PHP class in the public repo be extended in the private repo.

Recreating The Configuration Via OOP

Let’s refactor the configuration. In the public repo, file monorepo-builder.php will simply reference a new class ContainerConfigurationService, where all the action will happen:

use PoP\PoP\Config\Symplify\MonorepoBuilder\Configurators\ContainerConfigurationService;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

return static function (ContainerConfigurator $containerConfigurator): void {
  $containerConfigurationService = new ContainerConfigurationService(
    $containerConfigurator,
    __DIR__
  );
  $containerConfigurationService->configureContainer();
};

The __DIR__ param points to the root of the monorepo. It will be needed to obtain the full path to the package directories.

Class ContainerConfigurationService is now in charge of producing the configuration:

namespace PoP\PoP\Config\Symplify\MonorepoBuilder\Configurators;

use PoP\PoP\Config\Symplify\MonorepoBuilder\DataSources\PackageOrganizationDataSource;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;
use Symplify\MonorepoBuilder\ValueObject\Option;

class ContainerConfigurationService
{
  public function __construct(
    protected ContainerConfigurator $containerConfigurator,
    protected string $rootDirectory,
  ) {
  }

  public function configureContainer(): void
  {
    $parameters = $this->containerConfigurator->parameters();
    if ($packageOrganizationConfig = $this->getPackageOrganizationDataSource()) {
      $parameters->set(
        Option::PACKAGE_DIRECTORIES,
        $packageOrganizationConfig->getPackageDirectories()
      );
    }
  }

  protected function getPackageOrganizationDataSource(): ?PackageOrganizationDataSource
  {
    return new PackageOrganizationDataSource($this->rootDirectory);
  }
}

The configuration can be split across several classes. In this case, ContainerConfigurationService retrieves the package configuration through class PackageOrganizationDataSource, which has this implementation:

namespace PoP\PoP\Config\Symplify\MonorepoBuilder\DataSources;

class PackageOrganizationDataSource
{
  public function __construct(protected string $rootDir)
  {
  }

  public function getPackageDirectories(): array
  {
    return array_map(
      fn (string $packagePath) => $this->rootDir . '/' . $packagePath,
      $this->getRelativePackagePaths()
    );
  }

  public function getRelativePackagePaths(): array
  {
    return [
      'packages',
      'plugins',
    ];
  }
}

Overriding The Configuration In The Downstream Monorepo

Now that the configuration in the public monorepo is set up via OOP, we can extend it to suit the needs of the private monorepo.

In order to allow the private monorepo to autoload the PHP code from the public monorepo, we must first configure the downstream composer.json to reference the source code from the upstream, which is under path submodules/PoP/src:

{
  "autoload": {
    "psr-4": {
      "PoP\\GraphQLAPIPRO\\": "src",
      "PoP\\PoP\\": "submodules/PoP/src"
    }
  }
}
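With that PSR-4 mapping, Composer resolves classes under the upstream prefix to files inside the submodule. A rough sketch of the resolution rule (in Python for illustration; the helper function and the class name are hypothetical — Composer’s autoloader does the real work):

```python
# Hypothetical helper showing how a PSR-4 prefix maps a class name to a file path.
def psr4_path(class_name: str, prefix: str, base_dir: str) -> str:
    relative = class_name[len(prefix):].lstrip("\\")
    return base_dir + "/" + relative.replace("\\", "/") + ".php"

print(psr4_path("PoP\\PoP\\Config\\SomeClass", "PoP\\PoP", "submodules/PoP/src"))
# → submodules/PoP/src/Config/SomeClass.php
```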

Below is the file monorepo-builder.php for the private monorepo. Notice that the referenced class ContainerConfigurationService in the upstream repo belongs to the PoP\PoP namespace, but now it switched to the PoP\GraphQLAPIPRO namespace. This class must receive the additional input $upstreamRelativeRootPath (with value "submodules/PoP") so as to recreate the full path to the public packages:

use PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\Configurators\ContainerConfigurationService;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

return static function (ContainerConfigurator $containerConfigurator): void {
  $containerConfigurationService = new ContainerConfigurationService(
    $containerConfigurator,
    __DIR__,
    'submodules/PoP'
  );
  $containerConfigurationService->configureContainer();
};

The downstream class ContainerConfigurationService overrides which PackageOrganizationDataSource class is used in the configuration:

namespace PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\Configurators;

use PoP\PoP\Config\Symplify\MonorepoBuilder\Configurators\ContainerConfigurationService as UpstreamContainerConfigurationService;
use PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\DataSources\PackageOrganizationDataSource;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

class ContainerConfigurationService extends UpstreamContainerConfigurationService
{
  public function __construct(
    ContainerConfigurator $containerConfigurator,
    string $rootDirectory,
    protected string $upstreamRelativeRootPath
  ) {
    parent::__construct(
      $containerConfigurator,
      $rootDirectory
    );
  }

  protected function getPackageOrganizationDataSource(): ?PackageOrganizationDataSource
  {
    return new PackageOrganizationDataSource(
      $this->rootDirectory,
      $this->upstreamRelativeRootPath
    );
  }
}

Finally, the downstream class PackageOrganizationDataSource contains the full path to both public and private packages:

namespace PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\DataSources;

use PoP\PoP\Config\Symplify\MonorepoBuilder\DataSources\PackageOrganizationDataSource as UpstreamPackageOrganizationDataSource;

class PackageOrganizationDataSource extends UpstreamPackageOrganizationDataSource
{
  public function __construct(
    string $rootDir,
    protected string $upstreamRelativeRootPath
  ) {
    parent::__construct($rootDir);
  }

  public function getRelativePackagePaths(): array
  {
    return array_merge(
      // Public packages - Prepend them with "submodules/PoP/"
      array_map(
        fn ($upstreamPackagePath) => $this->upstreamRelativeRootPath . '/' . $upstreamPackagePath,
        parent::getRelativePackagePaths()
      ),
      // Private packages
      [
        'packages',
        'plugins',
        'clients',
      ]
    );
  }
}

Injecting The Configuration From PHP Into GitHub Actions

Monorepo builder offers the command packages-json, which we can use to inject the package paths into the GitHub Actions workflow:

jobs:
  provide_data:
    steps:
      - id: output_data
        name: Calculate matrix for packages
        run: |
          echo "::set-output name=matrix::$(vendor/bin/monorepo-builder packages-json)"

    outputs:
      matrix: ${{ steps.output_data.outputs.matrix }}

This command produces a stringified JSON. In the workflow it must be converted to a JSON object via fromJson:

jobs:
  split_monorepo:
    needs: provide_data
    strategy:
      matrix:
        package: ${{ fromJson(needs.provide_data.outputs.matrix) }}

Unfortunately, command packages-json outputs the package names but not their paths. That works when all packages are under the same folder (such as packages/); it doesn’t work in our case, since public and private packages are located in different folders.

Fortunately, the Monorepo builder can be extended with custom PHP services, so I created a custom command package-entries-json (via class PackageEntriesJsonCommand) which does output the path to the package.

The workflow was then updated with the new command:

    run: |
      echo "::set-output name=matrix::$(vendor/bin/monorepo-builder package-entries-json)"

Executed on the public monorepo, it produces the following packages (among many others):

[
  {
    "name": "graphql-api-for-wp",
    "path": "layers/GraphQLAPIForWP/plugins/graphql-api-for-wp"
  },
  {
    "name": "extension-demo",
    "path": "layers/GraphQLAPIForWP/plugins/extension-demo"
  },
  {
    "name": "access-control",
    "path": "layers/Engine/packages/access-control"
  },
  {
    "name": "api",
    "path": "layers/API/packages/api"
  },
  {
    "name": "api-clients",
    "path": "layers/API/packages/api-clients"
  }
]

Executed on the private monorepo, it produces the following entries (among many others):

[
  {
    "name": "graphql-api-for-wp",
    "path": "submodules/PoP/layers/GraphQLAPIForWP/plugins/graphql-api-for-wp"
  },
  {
    "name": "extension-demo",
    "path": "submodules/PoP/layers/GraphQLAPIForWP/plugins/extension-demo"
  },
  {
    "name": "access-control",
    "path": "submodules/PoP/layers/Engine/packages/access-control"
  },
  {
    "name": "api",
    "path": "submodules/PoP/layers/API/packages/api"
  },
  {
    "name": "api-clients",
    "path": "submodules/PoP/layers/API/packages/api-clients"
  },
  {
    "name": "graphql-api-pro",
    "path": "layers/GraphQLAPIForWP/plugins/graphql-api-pro"
  },
  {
    "name": "convert-case-directives",
    "path": "layers/Schema/packages/convert-case-directives"
  },
  {
    "name": "export-directive",
    "path": "layers/GraphQLByPoP/packages/export-directive"
  }
]

As can be appreciated, it works well: the configuration for the downstream monorepo contains both public and private packages, and the paths to the public ones have been prepended with "submodules/PoP".

Skipping Public Packages In The Downstream Monorepo

So far, the downstream monorepo has included both public and private packages in its configuration. However, not every command needs to be executed on the public packages.

Take static analysis, for instance. The public monorepo already executes PHPStan on all public packages via the workflow phpstan.yml, as shown in this run. If the downstream monorepo ran PHPStan once again on the public packages, it would be a waste of computing time. So the phpstan.yml workflow needs to run on the private packages only.

That means that, depending on the command to execute in the downstream repo, we may want to include both public and private packages, or only private ones.

To decide whether or not to add the public packages to the downstream configuration, we adapt the downstream class PackageOrganizationDataSource to check this condition via the input $includeUpstreamPackages:

namespace PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\DataSources;

use PoP\PoP\Config\Symplify\MonorepoBuilder\DataSources\PackageOrganizationDataSource as UpstreamPackageOrganizationDataSource;

class PackageOrganizationDataSource extends UpstreamPackageOrganizationDataSource
{
  public function __construct(
    string $rootDir,
    protected string $upstreamRelativeRootPath,
    protected bool $includeUpstreamPackages
  ) {
    parent::__construct($rootDir);
  }

  public function getRelativePackagePaths(): array
  {
    return array_merge(
      // Add the public packages?
      $this->includeUpstreamPackages ?
        // Public packages - Prepend them with "submodules/PoP/"
        array_map(
          fn ($upstreamPackagePath) => $this->upstreamRelativeRootPath . '/' . $upstreamPackagePath,
          parent::getRelativePackagePaths()
        ) : [],
      // Private packages
      [
        'packages',
        'plugins',
        'clients',
      ]
    );
  }
}
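In other words, the list of relative paths that this method returns depends entirely on the flag. A small sketch of the same logic (in Python for illustration; the function name is mine):

```python
UPSTREAM_RELATIVE_ROOT_PATH = "submodules/PoP"

# Mirror of getRelativePackagePaths() above, for illustration only:
# upstream paths get the submodule prefix, private paths are always included.
def relative_package_paths(include_upstream_packages: bool) -> list:
    upstream = (
        [UPSTREAM_RELATIVE_ROOT_PATH + "/" + p for p in ("packages", "plugins")]
        if include_upstream_packages else []
    )
    return upstream + ["packages", "plugins", "clients"]

print(relative_package_paths(True))
print(relative_package_paths(False))
```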

Next, we need to provide the value $includeUpstreamPackages as either true or false, depending on the command to execute.

We can do this by replacing config file monorepo-builder.php with two other config files: monorepo-builder-with-upstream-packages.php (which passes $includeUpstreamPackages => true) and monorepo-builder-without-upstream-packages.php (which passes $includeUpstreamPackages => false):

// File monorepo-builder-without-upstream-packages.php
use PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\Configurators\ContainerConfigurationService;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

return static function (ContainerConfigurator $containerConfigurator): void {
  $containerConfigurationService = new ContainerConfigurationService(
    $containerConfigurator,
    __DIR__,
    'submodules/PoP',
    false, // This is $includeUpstreamPackages
  );
  $containerConfigurationService->configureContainer();
};

We then update ContainerConfigurationService to receive the parameter $includeUpstreamPackages and pass it along to PackageOrganizationDataSource:

namespace PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\Configurators;

use PoP\PoP\Config\Symplify\MonorepoBuilder\Configurators\ContainerConfigurationService as UpstreamContainerConfigurationService;
use PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\DataSources\PackageOrganizationDataSource;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

class ContainerConfigurationService extends UpstreamContainerConfigurationService
{
  public function __construct(
    ContainerConfigurator $containerConfigurator,
    string $rootDirectory,
    protected string $upstreamRelativeRootPath,
    protected bool $includeUpstreamPackages,
  ) {
    parent::__construct(
      $containerConfigurator,
      $rootDirectory,
    );
  }

  protected function getPackageOrganizationDataSource(): ?PackageOrganizationDataSource
  {
    return new PackageOrganizationDataSource(
      $this->rootDirectory,
      $this->upstreamRelativeRootPath,
      $this->includeUpstreamPackages,
    );
  }
}

Next, we must invoke the monorepo-builder with either config file, by providing the --config option:

jobs:
  provide_data:
    steps:
      - id: output_data
        name: Calculate matrix for packages
        run: |
          echo "::set-output name=matrix::$(vendor/bin/monorepo-builder package-entries-json --config=monorepo-builder-without-upstream-packages.php)"

However, as we saw earlier on, we want to keep the GitHub Actions workflows in the upstream monorepo as the single source of truth, and they clearly don’t need these changes.

The solution I found to this issue is to always provide a --config option in the upstream repo, with every command getting its own config file, such as the validate command receiving the validate.php config file:

  - name: Run validation
    run: vendor/bin/monorepo-builder validate --config=config/monorepo-builder/validate.php

Now, there are no config files in the upstream monorepo, since it doesn’t need them. But it will not break, because the Monorepo builder checks whether the config file exists and, if it doesn’t, loads the default config file instead. So we’ll either override the config, or nothing happens.
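The fallback can be pictured like this (a Python sketch of the “use the requested config if present, else the default” check; the function is hypothetical, not the Monorepo builder’s actual code):

```python
from pathlib import Path

# Hypothetical sketch of the fallback behavior described above.
def resolve_config(requested: str, default: str = "monorepo-builder.php") -> str:
    return requested if Path(requested).exists() else default
```

Calling `resolve_config("config/monorepo-builder/validate.php")` in a repo without that file simply falls back to the default config.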

The downstream repo does provide the config files for each command, specifying whether or not to add the upstream packages:

// File config/monorepo-builder/validate.php
return require_once __DIR__ . '/monorepo-builder-with-upstream-packages.php';

By the way, as a side note, this is another example of how the multi-monorepo leaks.

Overriding The Configuration

We’re almost done. By now, the downstream monorepo can override the configuration from the upstream monorepo, so all that’s left to do is to provide the new configuration.

In class PluginDataSource, I override the configuration of which WordPress plugins must be generated, providing the PRO ones instead:

namespace PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\DataSources;

use PoP\PoP\Config\Symplify\MonorepoBuilder\DataSources\PluginDataSource as UpstreamPluginDataSource;

class PluginDataSource extends UpstreamPluginDataSource
{
  public function getPluginConfigEntries(): array
  {
    return [
      // GraphQL API PRO
      [
        'path' => 'layers/GraphQLAPIForWP/plugins/graphql-api-pro',
        'zip_file' => 'graphql-api-pro.zip',
        'main_file' => 'graphql-api-pro.php',
        'dist_repo_organization' => 'GraphQLAPI-PRO',
        'dist_repo_name' => 'graphql-api-pro-dist',
      ],
      // GraphQL API Extensions
      // Google Translate
      [
        'path' => 'layers/GraphQLAPIForWP/plugins/google-translate',
        'zip_file' => 'graphql-api-google-translate.zip',
        'main_file' => 'graphql-api-google-translate.php',
        'dist_repo_organization' => 'GraphQLAPI-PRO',
        'dist_repo_name' => 'graphql-api-google-translate-dist',
      ],
      // Events Manager
      [
        'path' => 'layers/GraphQLAPIForWP/plugins/events-manager',
        'zip_file' => 'graphql-api-events-manager.zip',
        'main_file' => 'graphql-api-events-manager.php',
        'dist_repo_organization' => 'GraphQLAPI-PRO',
        'dist_repo_name' => 'graphql-api-events-manager-dist',
      ],
    ];
  }
}

Creating a new release on GitHub will trigger the generate_plugins.yml workflow and generate the PRO plugins in my private monorepo:

Generating PRO plugins. (Large preview)

Tadaaaaaaaa! 🎉

Conclusion

As always, there is no “best” solution, only solutions that may work better depending on the context. The multi-monorepo approach is not suitable for every kind of project or team. I believe the biggest beneficiaries are plugin creators who release public plugins to be upgraded to their PRO versions, and agencies customizing plugins for their clients.

In my case, I’m quite happy with this approach. It takes a bit of time and effort to get right, but it’s a one-off investment. Once the set-up is done, I can just focus on building my PRO plugins, and the time savings concerning project management can be huge.

Smashing Editorial
(yk)
