Creating A Public/Private Multi-Monorepo For PHP Projects


To make the development experience faster, I moved all of the PHP packages required by my projects to a monorepo. When each package is hosted on its own repo (the "multirepo" approach), it would need to be developed and tested on its own, and then published to Packagist before I could install it on other packages via Composer. With the monorepo, because all packages are hosted together, they can be developed, tested, versioned and released at the same time.

The monorepo hosting my PHP packages is public, accessible to anyone on GitHub. Git repos can't grant different access to different assets; it's all either public or private. As I plan to release a PRO WordPress plugin, I want its packages to be kept private, meaning they can't be added to the public monorepo.

The solution I found is to use a "multi-monorepo" approach, comprising two monorepos: one public and one private, with the private monorepo embedding the public one as a Git submodule, allowing it to access its files. The public monorepo can be considered the "upstream", and the private monorepo the "downstream".

As I kept iterating on my code, the repo set-up I needed to use at each stage of my project also needed to be upgraded. Hence, I didn't arrive at the multi-monorepo approach on day 1; it was a process that spanned several years and took its fair amount of effort, going from a single repo, to multiple repos, to the monorepo, to finally the multi-monorepo.

In this article I'll describe how I set up my multi-monorepo using the Monorepo builder, which works for PHP projects based on Composer.

Reusing Code In The Multi-Monorepo

The public monorepo leoloso/PoP is where I keep all my PHP projects.

This monorepo contains workflow generate_plugins.yml, which generates multiple WordPress plugins for distribution when creating a new release on GitHub:

The workflow configuration is not hard-coded within the YAML but injected via PHP code:

- id: output_data
  run: |
    echo "::set-output name=plugin_config_entries::$(vendor/bin/monorepo-builder plugin-config-entries-json)"

And the configuration is provided via a custom PHP class:

class PluginDataSource
{
    public function getPluginConfigEntries(): array
    {
        return [
            // GraphQL API for WordPress
            [
                'path' => 'layers/GraphQLAPIForWP/plugins/graphql-api-for-wp',
                'zip_file' => 'graphql-api.zip',
                'main_file' => 'graphql-api.php',
                'dist_repo_organization' => 'GraphQLAPI',
                'dist_repo_name' => 'graphql-api-for-wp-dist',
            ],
            // GraphQL API - Extension Demo
            [
                'path' => 'layers/GraphQLAPIForWP/plugins/extension-demo',
                'zip_file' => 'graphql-api-extension-demo.zip',
                'main_file' => 'graphql-api-extension-demo.php',
                'dist_repo_organization' => 'GraphQLAPI',
                'dist_repo_name' => 'extension-demo-dist',
            ],
        ];
    }
}

Generating multiple WordPress plugins all together, and configuring the workflow via PHP, has reduced the amount of time needed to manage the project. The workflow currently handles two plugins (the GraphQL API and its extension demo), but it could handle 200 without additional effort on my side.

It is this set-up that I want to reuse for my private monorepo leoloso/GraphQLAPI-PRO, so that the PRO plugins can be generated without effort.

The code to reuse will comprise:

The GitHub Actions workflows to generate the WordPress plugins (including scoping, downgrading the code from PHP 8.0 to 7.1, and uploading to the releases page).
The custom PHP services to configure the workflows.

The private monorepo can then generate the PRO WordPress plugins simply by triggering the workflows from the public monorepo, and overriding their configuration in PHP.

Linking Monorepos Via Git Submodules

To embed the public repo within the private one, we use Git submodules:

git submodule add <public repo URL>

I embedded the public repo under subfolder submodules of the private monorepo, allowing me to add more upstream monorepos in the future if needed. On GitHub, the folder displays the submodule's specific commit, and clicking on it will take me to that commit on leoloso/PoP:

Because it contains submodules, to clone the private repo we must provide the --recursive option:

git clone --recursive <private repo URL>
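For completeness, here is a sketch of the full linking flow as shell commands. The repo URLs below are only placeholders standing in for the actual ones, and "submodules/PoP" is the subfolder mentioned above:

```shell
# Sketch of the linking flow (repo URLs are placeholders):
PUBLIC_REPO_URL=https://github.com/leoloso/PoP
PRIVATE_REPO_URL=https://github.com/leoloso/GraphQLAPI-PRO

# Inside a clone of the private monorepo, embed the public one
# under the "submodules" subfolder:
git submodule add "$PUBLIC_REPO_URL" submodules/PoP
git commit -m "Embed the upstream monorepo"

# Clone the private repo together with its submodules:
git clone --recursive "$PRIVATE_REPO_URL"

# Or, if it was already cloned without --recursive, fetch the
# submodule contents afterwards:
git submodule update --init --recursive
```

The last command is handy in CI or for collaborators who forgot the --recursive flag: it populates the (otherwise empty) submodule folder at the commit the private repo points to.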

Reusing The GitHub Actions Workflows

GitHub Actions only loads workflows from under .github/workflows. Because the public workflows in the downstream monorepo are under submodules/PoP/.github/workflows, they must be duplicated into the expected location.

In order to keep the upstream workflows as the single source of truth, we can limit ourselves to copying the files downstream under .github/workflows, but never edit them there. If there is any change to be done, it must be done in the upstream monorepo and then copied over.

As a side note, notice how this means that the multi-monorepo leaks: the upstream monorepo is not fully autonomous, and will need to be adapted to suit the downstream monorepo.

In my first iteration to copy the workflows, I created a simple Composer script:

{
  "scripts": {
    "copy-workflows": [
      "php -r \"copy('submodules/PoP/.github/workflows/generate_plugins.yml', '.github/workflows/generate_plugins.yml');\"",
      "php -r \"copy('submodules/PoP/.github/workflows/split_monorepo.yaml', '.github/workflows/split_monorepo.yaml');\""
    ]
  }
}

Then, after editing the workflows in the upstream monorepo, I would copy them downstream by executing:

composer copy-workflows

But then I realized that just copying the workflows is not enough: they must also be modified in the process. This is because checking out the downstream monorepo requires option --recurse-submodules, so as to also check out the submodules.

In GitHub Actions, the checkout for downstream is done like this:

- uses: actions/checkout@v2
  with:
    submodules: recursive

So checking out the downstream repo needs input submodules: recursive, but the upstream one doesn't, and they both use the same source file.

The solution I found is to provide the value for input submodules via an environment variable CHECKOUT_SUBMODULES, which is by default empty for the upstream repo:

env:
  CHECKOUT_SUBMODULES: ""

jobs:
  provide_data:
    steps:
      - uses: actions/checkout@v2
        with:
          submodules: ${{ env.CHECKOUT_SUBMODULES }}

Then, when copying the workflows from upstream to downstream, the value of CHECKOUT_SUBMODULES is replaced with "recursive":

env:
  CHECKOUT_SUBMODULES: "recursive"

When modifying the workflow, it's a good idea to use a regex, so that it works for different formats in the source file (such as CHECKOUT_SUBMODULES: "" or CHECKOUT_SUBMODULES:'' or CHECKOUT_SUBMODULES:), so as not to create bugs from this kind of assumed-to-be-harmless change.
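To illustrate why the tolerant pattern matters, here is a quick sketch of the replacement on the command line. It uses sed -E with a POSIX character class standing in for the PCRE \s used by the PHP command:

```shell
# Sketch: a tolerant pattern rewrites the env var no matter how it is
# currently formatted in the source workflow.
printf '%s\n' 'CHECKOUT_SUBMODULES: ""' 'CHECKOUT_SUBMODULES:' \
  | sed -E 's/CHECKOUT_SUBMODULES:([[:space:]]+".*")?/CHECKOUT_SUBMODULES: "recursive"/'
# Both input lines come out as: CHECKOUT_SUBMODULES: "recursive"
```

Making the quoted-value group optional is what protects the copy step from an innocent reformatting of the upstream YAML.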

The copy-workflows Composer script seen above is not good enough to handle this complexity.

In my next iteration, I created a PHP command, CopyUpstreamMonorepoFilesCommand, to be executed via the Monorepo builder:

vendor/bin/monorepo-builder copy-upstream-monorepo-files

This command uses a custom service FileCopierSystem to copy all files from a source folder to the indicated destination, while optionally replacing their contents:

namespace PoP\GraphQLAPIPRO\Extensions\Symplify\MonorepoBuilder\SmartFile;

use Nette\Utils\Strings;
use Symplify\SmartFileSystem\Finder\SmartFinder;
use Symplify\SmartFileSystem\SmartFileSystem;

final class FileCopierSystem
{
    public function __construct(
        private SmartFileSystem $smartFileSystem,
        private SmartFinder $smartFinder,
    ) {
    }

    /**
     * @param array $patternReplacements a regex pattern to search, and its replacement
     */
    public function copyFilesFromFolder(
        string $fromFolder,
        string $toFolder,
        array $patternReplacements = []
    ): void {
        $smartFileInfos = $this->smartFinder->find([$fromFolder], '*');

        foreach ($smartFileInfos as $smartFileInfo) {
            $fromFile = $smartFileInfo->getRealPath();
            $fileContent = $this->smartFileSystem->readFile($fromFile);

            foreach ($patternReplacements as $pattern => $replacement) {
                $fileContent = Strings::replace($fileContent, $pattern, $replacement);
            }

            $toFile = $toFolder . substr($fromFile, strlen($fromFolder));
            $this->smartFileSystem->dumpFile($toFile, $fileContent);
        }
    }
}

When invoking this method to copy all workflows downstream, I also replace the value of CHECKOUT_SUBMODULES:

/**
 * Copy all workflows to `.github/`, and convert:
 * `CHECKOUT_SUBMODULES: ""`
 * into:
 * `CHECKOUT_SUBMODULES: "recursive"`
 */
$regexReplacements = [
    '#CHECKOUT_SUBMODULES:(\s+".*")?#' => 'CHECKOUT_SUBMODULES: "recursive"',
];
(new FileCopierSystem())->copyFilesFromFolder(
    'submodules/PoP/.github/workflows',
    '.github/workflows',
    $regexReplacements
);

Workflow generate_plugins.yml needs an additional replacement. When the WordPress plugin is generated, its code is downgraded from PHP 8.0 to 7.1 by invoking script ci/downgrade/downgrade_code.sh:

- name: Downgrade code for production (to PHP 7.1)
  run: ci/downgrade/downgrade_code.sh "${{ matrix.pluginConfig.rector_downgrade_config }}" "" "${{ matrix.pluginConfig.path }}" "${{ matrix.pluginConfig.additional_rector_configs }}"

In the downstream monorepo, this file will be located under submodules/PoP/ci/downgrade/downgrade_code.sh. Then, we have the downstream workflow point to the right path with this replacement:

$regexReplacements = [
    // ...
    '#(ci/downgrade/downgrade_code\.sh)#' => 'submodules/PoP/$1',
];

Configuring Packages In Monorepo Builder

File monorepo-builder.php — located at the root of the monorepo — holds the configuration for the Monorepo builder. In it, we must indicate where the packages (and plugins, clients, or anything else) are located:

use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;
use Symplify\MonorepoBuilder\ValueObject\Option;

return static function (ContainerConfigurator $containerConfigurator): void {
    $parameters = $containerConfigurator->parameters();
    $parameters->set(Option::PACKAGE_DIRECTORIES, [
        __DIR__ . '/packages',
        __DIR__ . '/plugins',
    ]);
};

The private monorepo must have access to all code: its own packages, plus those from the public monorepo. Then, it must define all packages from both monorepos in the config file. The ones from the public monorepo are located under "/submodules/PoP":

return static function (ContainerConfigurator $containerConfigurator): void {
    $parameters = $containerConfigurator->parameters();
    $parameters->set(Option::PACKAGE_DIRECTORIES, [
        // public code
        __DIR__ . '/submodules/PoP/packages',
        __DIR__ . '/submodules/PoP/plugins',
        // private code
        __DIR__ . '/packages',
        __DIR__ . '/plugins',
        __DIR__ . '/clients',
    ]);
};

As can be seen, the configuration for upstream and downstream is pretty much the same, with the difference that the downstream one will:

Change the path to the public packages.
Add the private packages.

Then, it makes sense to rewrite the configuration using object-oriented programming, so that we make the code DRY (don't repeat yourself) by having a PHP class in the public repo be extended in the private repo.

Recreating The Configuration Through OOP

Let's refactor the configuration. In the public repo, file monorepo-builder.php will simply reference a new class, ContainerConfigurationService, where all the action will happen:

use PoP\PoP\Config\Symplify\MonorepoBuilder\Configurators\ContainerConfigurationService;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

return static function (ContainerConfigurator $containerConfigurator): void {
    $containerConfigurationService = new ContainerConfigurationService(
        $containerConfigurator,
        __DIR__
    );
    $containerConfigurationService->configureContainer();
};

The __DIR__ param points to the root of the monorepo. It will be needed to obtain the full path to the package directories.

Class ContainerConfigurationService is now responsible for producing the configuration:

namespace PoP\PoP\Config\Symplify\MonorepoBuilder\Configurators;

use PoP\PoP\Config\Symplify\MonorepoBuilder\DataSources\PackageOrganizationDataSource;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;
use Symplify\MonorepoBuilder\ValueObject\Option;

class ContainerConfigurationService
{
    public function __construct(
        protected ContainerConfigurator $containerConfigurator,
        protected string $rootDirectory,
    ) {
    }

    public function configureContainer(): void
    {
        $parameters = $this->containerConfigurator->parameters();
        if ($packageOrganizationConfig = $this->getPackageOrganizationDataSource()) {
            $parameters->set(
                Option::PACKAGE_DIRECTORIES,
                $packageOrganizationConfig->getPackageDirectories()
            );
        }
    }

    protected function getPackageOrganizationDataSource(): ?PackageOrganizationDataSource
    {
        return new PackageOrganizationDataSource($this->rootDirectory);
    }
}

The configuration can be split across multiple classes. In this case, ContainerConfigurationService retrieves the package configuration through class PackageOrganizationDataSource, which has this implementation:

namespace PoP\PoP\Config\Symplify\MonorepoBuilder\DataSources;

class PackageOrganizationDataSource
{
    public function __construct(protected string $rootDir)
    {
    }

    public function getPackageDirectories(): array
    {
        return array_map(
            fn (string $packagePath) => $this->rootDir . '/' . $packagePath,
            $this->getRelativePackagePaths()
        );
    }

    public function getRelativePackagePaths(): array
    {
        return [
            'packages',
            'plugins',
        ];
    }
}

Overriding The Configuration In The Downstream Monorepo

Now that the configuration in the public monorepo is set up via OOP, we can extend it to suit the needs of the private monorepo.

In order to allow the private monorepo to autoload the PHP code from the public monorepo, we must first configure the downstream composer.json to reference the source code from the upstream, which is under path submodules/PoP/src:

{
  "autoload": {
    "psr-4": {
      "PoP\\GraphQLAPIPRO\\": "src",
      "PoP\\PoP\\": "submodules/PoP/src"
    }
  }
}

Below is file monorepo-builder.php for the private monorepo. Notice that the referenced class ContainerConfigurationService in the upstream repo belonged to the PoP\PoP namespace, but now it has switched to the PoP\GraphQLAPIPRO namespace. This class must receive the additional input $upstreamRelativeRootPath (with value "submodules/PoP") so as to recreate the full path to the public packages:

use PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\Configurators\ContainerConfigurationService;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

return static function (ContainerConfigurator $containerConfigurator): void {
    $containerConfigurationService = new ContainerConfigurationService(
        $containerConfigurator,
        __DIR__,
        'submodules/PoP'
    );
    $containerConfigurationService->configureContainer();
};

The downstream class ContainerConfigurationService overrides which PackageOrganizationDataSource class is used in the configuration:

namespace PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\Configurators;

use PoP\PoP\Config\Symplify\MonorepoBuilder\Configurators\ContainerConfigurationService as UpstreamContainerConfigurationService;
use PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\DataSources\PackageOrganizationDataSource;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

class ContainerConfigurationService extends UpstreamContainerConfigurationService
{
    public function __construct(
        ContainerConfigurator $containerConfigurator,
        string $rootDirectory,
        protected string $upstreamRelativeRootPath
    ) {
        parent::__construct(
            $containerConfigurator,
            $rootDirectory
        );
    }

    protected function getPackageOrganizationDataSource(): ?PackageOrganizationDataSource
    {
        return new PackageOrganizationDataSource(
            $this->rootDirectory,
            $this->upstreamRelativeRootPath
        );
    }
}

Finally, downstream class PackageOrganizationDataSource contains the full path to both public and private packages:

namespace PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\DataSources;

use PoP\PoP\Config\Symplify\MonorepoBuilder\DataSources\PackageOrganizationDataSource as UpstreamPackageOrganizationDataSource;

class PackageOrganizationDataSource extends UpstreamPackageOrganizationDataSource
{
    public function __construct(
        string $rootDir,
        protected string $upstreamRelativeRootPath
    ) {
        parent::__construct($rootDir);
    }

    public function getRelativePackagePaths(): array
    {
        return array_merge(
            // Public packages - Prepend them with "submodules/PoP/"
            array_map(
                fn ($upstreamPackagePath) => $this->upstreamRelativeRootPath . '/' . $upstreamPackagePath,
                parent::getRelativePackagePaths()
            ),
            // Private packages
            [
                'packages',
                'plugins',
                'clients',
            ]
        );
    }
}

Injecting The Configuration From PHP Into GitHub Actions

Monorepo builder offers command packages-json, which we can use to inject the package paths into the GitHub Actions workflow:

jobs:
  provide_data:
    steps:
      - id: output_data
        name: Calculate matrix for packages
        run: |
          echo "::set-output name=matrix::$(vendor/bin/monorepo-builder packages-json)"

    outputs:
      matrix: ${{ steps.output_data.outputs.matrix }}

This command produces a stringified JSON. In the workflow, it must be converted to a JSON object via fromJson:

jobs:
  split_monorepo:
    needs: provide_data
    strategy:
      matrix:
        package: ${{ fromJson(needs.provide_data.outputs.matrix) }}

Unfortunately, command packages-json outputs the package names but not their paths, which works when all packages are under the same folder (such as packages/). It doesn't work in our case, since public and private packages are located in different folders.

Fortunately, the Monorepo builder can be extended with custom PHP services. So I created a custom command, package-entries-json (via class PackageEntriesJsonCommand), which does output the path to the package.

The workflow was then updated with the new command:

run: |
  echo "::set-output name=matrix::$(vendor/bin/monorepo-builder package-entries-json)"

Executed on the public monorepo, it produces the following packages (among many others):

[
  {
    "name": "graphql-api-for-wp",
    "path": "layers/GraphQLAPIForWP/plugins/graphql-api-for-wp"
  },
  {
    "name": "extension-demo",
    "path": "layers/GraphQLAPIForWP/plugins/extension-demo"
  },
  {
    "name": "access-control",
    "path": "layers/Engine/packages/access-control"
  },
  {
    "name": "api",
    "path": "layers/API/packages/api"
  },
  {
    "name": "api-clients",
    "path": "layers/API/packages/api-clients"
  }
]

Executed on the private monorepo, it produces the following entries (among many others):

[
  {
    "name": "graphql-api-for-wp",
    "path": "submodules/PoP/layers/GraphQLAPIForWP/plugins/graphql-api-for-wp"
  },
  {
    "name": "extension-demo",
    "path": "submodules/PoP/layers/GraphQLAPIForWP/plugins/extension-demo"
  },
  {
    "name": "access-control",
    "path": "submodules/PoP/layers/Engine/packages/access-control"
  },
  {
    "name": "api",
    "path": "submodules/PoP/layers/API/packages/api"
  },
  {
    "name": "api-clients",
    "path": "submodules/PoP/layers/API/packages/api-clients"
  },
  {
    "name": "graphql-api-pro",
    "path": "layers/GraphQLAPIForWP/plugins/graphql-api-pro"
  },
  {
    "name": "convert-case-directives",
    "path": "layers/Schema/packages/convert-case-directives"
  },
  {
    "name": "export-directive",
    "path": "layers/GraphQLByPoP/packages/export-directive"
  }
]

As can be appreciated, it works well: the configuration for the downstream monorepo contains both public and private packages, and the paths to the public ones were prepended with "submodules/PoP".

Skipping Public Packages In The Downstream Monorepo

So far, the downstream monorepo has included both public and private packages in its configuration. However, not every command needs to be executed on the public packages.

Take static analysis, for instance. The public monorepo already executes PHPStan on all public packages via workflow phpstan.yml, as shown in this run. If the downstream monorepo runs PHPStan once again on the public packages, it is a waste of computing time. Then, the phpstan.yml workflow needs to run on the private packages only.

That means that, depending on the command to execute in the downstream repo, we may want to either include both public and private packages, or only private ones.

To decide whether to add public packages to the downstream configuration, we adapt downstream class PackageOrganizationDataSource to check this condition via input $includeUpstreamPackages:

namespace PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\DataSources;

use PoP\PoP\Config\Symplify\MonorepoBuilder\DataSources\PackageOrganizationDataSource as UpstreamPackageOrganizationDataSource;

class PackageOrganizationDataSource extends UpstreamPackageOrganizationDataSource
{
    public function __construct(
        string $rootDir,
        protected string $upstreamRelativeRootPath,
        protected bool $includeUpstreamPackages
    ) {
        parent::__construct($rootDir);
    }

    public function getRelativePackagePaths(): array
    {
        return array_merge(
            // Add the public packages?
            $this->includeUpstreamPackages ?
                // Public packages - Prepend them with "submodules/PoP/"
                array_map(
                    fn ($upstreamPackagePath) => $this->upstreamRelativeRootPath . '/' . $upstreamPackagePath,
                    parent::getRelativePackagePaths()
                ) : [],
            // Private packages
            [
                'packages',
                'plugins',
                'clients',
            ]
        );
    }
}

Next, we need to provide value $includeUpstreamPackages as either true or false, depending on the command to execute.

We can do this by replacing config file monorepo-builder.php with two other config files: monorepo-builder-with-upstream-packages.php (which passes $includeUpstreamPackages => true) and monorepo-builder-without-upstream-packages.php (which passes $includeUpstreamPackages => false):

// File monorepo-builder-without-upstream-packages.php
use PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\Configurators\ContainerConfigurationService;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

return static function (ContainerConfigurator $containerConfigurator): void {
    $containerConfigurationService = new ContainerConfigurationService(
        $containerConfigurator,
        __DIR__,
        'submodules/PoP',
        false, // This is $includeUpstreamPackages
    );
    $containerConfigurationService->configureContainer();
};

We then update ContainerConfigurationService to receive parameter $includeUpstreamPackages and pass it along to PackageOrganizationDataSource:

namespace PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\Configurators;

use PoP\PoP\Config\Symplify\MonorepoBuilder\Configurators\ContainerConfigurationService as UpstreamContainerConfigurationService;
use PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\DataSources\PackageOrganizationDataSource;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

class ContainerConfigurationService extends UpstreamContainerConfigurationService
{
    public function __construct(
        ContainerConfigurator $containerConfigurator,
        string $rootDirectory,
        protected string $upstreamRelativeRootPath,
        protected bool $includeUpstreamPackages,
    ) {
        parent::__construct(
            $containerConfigurator,
            $rootDirectory,
        );
    }

    protected function getPackageOrganizationDataSource(): ?PackageOrganizationDataSource
    {
        return new PackageOrganizationDataSource(
            $this->rootDirectory,
            $this->upstreamRelativeRootPath,
            $this->includeUpstreamPackages,
        );
    }
}

Next, we should invoke the monorepo-builder with either config file, by providing the --config option:

jobs:
  provide_data:
    steps:
      - id: output_data
        name: Calculate matrix for packages
        run: |
          echo "::set-output name=matrix::$(vendor/bin/monorepo-builder package-entries-json --config=monorepo-builder-without-upstream-packages.php)"

However, as we saw earlier on, we want to keep the GitHub Actions workflows in the upstream monorepo as the single source of truth, and they clearly do not need these changes.

The solution I found for this issue is to always provide a --config option in the upstream repo, with each command getting its own config file, such as the validate command receiving the validate.php config file:

- name: Run validation
  run: vendor/bin/monorepo-builder validate --config=config/monorepo-builder/validate.php

Now, there are no config files in the upstream monorepo, since it doesn't need them. But it will not break, because the Monorepo builder checks if the config file exists and, if it does not, it loads the default config file instead. So we will either override the config, or nothing happens.

The downstream repo does provide the config files for each command, specifying whether to add the upstream packages or not:

// File config/monorepo-builder/validate.php
return require_once __DIR__ . '/monorepo-builder-with-upstream-packages.php';

By the way, as a side note: this is another example of how the multi-monorepo leaks.

Overriding The Configuration

We are almost done. By now, the downstream monorepo can override the configuration from the upstream monorepo. So all that's left to do is to provide the new configuration.

In class PluginDataSource, I override the configuration of which WordPress plugins must be generated, providing the PRO ones instead:

namespace PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\DataSources;

use PoP\PoP\Config\Symplify\MonorepoBuilder\DataSources\PluginDataSource as UpstreamPluginDataSource;

class PluginDataSource extends UpstreamPluginDataSource
{
    public function getPluginConfigEntries(): array
    {
        return [
            // GraphQL API PRO
            [
                'path' => 'layers/GraphQLAPIForWP/plugins/graphql-api-pro',
                'zip_file' => 'graphql-api-pro.zip',
                'main_file' => 'graphql-api-pro.php',
                'dist_repo_organization' => 'GraphQLAPI-PRO',
                'dist_repo_name' => 'graphql-api-pro-dist',
            ],
            // GraphQL API Extensions
            // Google Translate
            [
                'path' => 'layers/GraphQLAPIForWP/plugins/google-translate',
                'zip_file' => 'graphql-api-google-translate.zip',
                'main_file' => 'graphql-api-google-translate.php',
                'dist_repo_organization' => 'GraphQLAPI-PRO',
                'dist_repo_name' => 'graphql-api-google-translate-dist',
            ],
            // Events Manager
            [
                'path' => 'layers/GraphQLAPIForWP/plugins/events-manager',
                'zip_file' => 'graphql-api-events-manager.zip',
                'main_file' => 'graphql-api-events-manager.php',
                'dist_repo_organization' => 'GraphQLAPI-PRO',
                'dist_repo_name' => 'graphql-api-events-manager-dist',
            ],
        ];
    }
}

Creating a new release on GitHub will trigger the generate_plugins.yml workflow and generate the PRO plugins on my private monorepo:

Tadaaaaaaaa! 🎉

Conclusion

As always, there is no "best" solution, only solutions that may work better depending on the context. The multi-monorepo approach is not suitable for every kind of project or team. I believe the biggest beneficiaries are plugin creators who release public plugins to be upgraded to their PRO versions, and agencies customizing plugins for their clients.

In my case, I'm quite happy with this approach. It takes a bit of time and effort to get right, but it's a one-off investment. Once the set-up is done, I can just focus on building my PRO plugins, and the time savings regarding project management can be huge.
