To make the development experience faster, I moved all of the PHP packages required by my projects to a monorepo. When each package is hosted in its own repo (the "multirepo" approach), it must be developed and tested on its own, and then published to Packagist before it can be installed in other packages via Composer. With the monorepo, because all packages are hosted together, they can be developed, tested, versioned and released at the same time.
The monorepo hosting my PHP packages is public, accessible to anyone on GitHub. Git repos cannot grant different access to different assets; everything is either public or private. Since I plan to release a PRO WordPress plugin, I need its packages to be kept private, which means they cannot be added to the public monorepo.
The solution I found is to use a "multi-monorepo" approach, comprising two monorepos: one public and one private, with the private monorepo embedding the public one as a Git submodule, allowing it to access its files. The public monorepo can be considered the "upstream", and the private monorepo the "downstream".
As I kept iterating on my code, the repo set-up I needed at each stage of my project also had to evolve. Hence, I didn't arrive at the multi-monorepo approach on day 1; it was a process that spanned several years and took its fair amount of effort, going from a single repo, to multiple repos, to the monorepo, and finally to the multi-monorepo.
In this article I'll describe how I set up my multi-monorepo using the Monorepo builder, which works for PHP projects based on Composer.
Reusing Code In The Multi-Monorepo
The public monorepo leoloso/PoP is where I keep all my PHP projects.
This monorepo contains the workflow generate_plugins.yml, which generates multiple WordPress plugins for distribution when creating a new release on GitHub:
The workflow configuration is not hard-coded within the YAML but injected via PHP code:
- id: output_data
  run: |
    echo "::set-output name=plugin_config_entries::$(vendor/bin/monorepo-builder plugin-config-entries-json)"
And the configuration is provided via a custom PHP class:
class PluginDataSource
{
  public function getPluginConfigEntries(): array
  {
    return [
      // GraphQL API for WordPress
      [
        'path' => 'layers/GraphQLAPIForWP/plugins/graphql-api-for-wp',
        'zip_file' => 'graphql-api.zip',
        'main_file' => 'graphql-api.php',
        'dist_repo_organization' => 'GraphQLAPI',
        'dist_repo_name' => 'graphql-api-for-wp-dist',
      ],
      // GraphQL API - Extension Demo
      [
        'path' => 'layers/GraphQLAPIForWP/plugins/extension-demo',
        'zip_file' => 'graphql-api-extension-demo.zip',
        'main_file' => 'graphql-api-extension-demo.php',
        'dist_repo_organization' => 'GraphQLAPI',
        'dist_repo_name' => 'extension-demo-dist',
      ],
    ];
  }
}
Generating multiple WordPress plugins all together, and configuring the workflow via PHP, has reduced the amount of time needed to manage the project. The workflow currently handles two plugins (the GraphQL API and its extension demo), but it could handle 200 without extra effort on my side.
It is this set-up that I want to reuse for my private monorepo leoloso/GraphQLAPI-PRO, so that the PRO plugins can also be generated without effort.
The code to reuse comprises:
The GitHub Actions workflows to generate the WordPress plugins (including scoping, downgrading the code from PHP 8.0 to 7.1 and uploading to the releases page).
The custom PHP services to configure the workflows.
The private monorepo can then generate the PRO WordPress plugins simply by triggering the workflows from the public monorepo, and overriding their configuration in PHP.
Linking Monorepos Through Git Submodules
To embed the public repo within the private one, we use Git submodules:
git submodule add <public repo URL>
I embedded the public repo under the subfolder submodules of the private monorepo, allowing me to add more upstream monorepos in the future if needed. In GitHub, the folder displays the submodule's specific commit, and clicking on it takes me to that commit on leoloso/PoP:
Since it contains submodules, to clone the private repo we must provide the --recursive option:
git clone --recursive <private repo URL>
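The embedding step can be sketched end-to-end with two throwaway local repos (all paths and repo names here are illustrative, not the real ones; recent Git versions also require explicitly allowing file-protocol URLs for local submodules):

```shell
set -e
rm -rf /tmp/mm-demo && mkdir -p /tmp/mm-demo

# "upstream" stands in for the public monorepo
git init -q /tmp/mm-demo/upstream
git -C /tmp/mm-demo/upstream -c user.email=demo@example.com -c user.name=demo \
  commit -q --allow-empty -m "initial commit"

# "downstream" stands in for the private monorepo
git init -q /tmp/mm-demo/downstream
cd /tmp/mm-demo/downstream

# Embed the upstream repo under submodules/
git -c protocol.file.allow=always submodule add /tmp/mm-demo/upstream submodules/upstream

# .gitmodules now records the submodule's path and URL
cat .gitmodules
```

After this, committing in the downstream repo pins the exact upstream commit that the submodule folder points to.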
Reusing The GitHub Actions Workflows
GitHub Actions only loads workflows from under .github/workflows. Because the public workflows in the downstream monorepo are under submodules/PoP/.github/workflows, they must be duplicated into the expected location.
In order to keep the upstream workflows as the single source of truth, we can limit ourselves to copying the files downstream under .github/workflows, but never editing them there. If any change must be done, it must be done in the upstream monorepo, and then copied over.
As a side note, notice how this means that the multi-monorepo leaks: the upstream monorepo is not fully autonomous, and will need to be adapted to suit the downstream monorepo.
In my first iteration to copy the workflows, I created a simple Composer script:
{
  "scripts": {
    "copy-workflows": [
      "php -r \"copy('submodules/PoP/.github/workflows/generate_plugins.yml', '.github/workflows/generate_plugins.yml');\"",
      "php -r \"copy('submodules/PoP/.github/workflows/split_monorepo.yaml', '.github/workflows/split_monorepo.yaml');\""
    ]
  }
}
Then, after editing the workflows in the upstream monorepo, I would copy them downstream by executing:
composer copy-workflows
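What the Composer script does can be sketched with a plain file copy (the folder layout and workflow contents below are illustrative): the upstream workflow, living inside the submodule, is duplicated into the location that GitHub Actions actually reads.

```shell
set -e
rm -rf /tmp/copy-demo
mkdir -p /tmp/copy-demo/submodules/PoP/.github/workflows /tmp/copy-demo/.github/workflows
cd /tmp/copy-demo

# A stand-in for the upstream workflow file inside the submodule
printf 'name: Generate plugins\n' > submodules/PoP/.github/workflows/generate_plugins.yml

# Duplicate it into the location GitHub Actions loads workflows from
cp submodules/PoP/.github/workflows/generate_plugins.yml .github/workflows/

cat .github/workflows/generate_plugins.yml
```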
But then I realized that just copying the workflows is not enough: they must also be modified in the process. This is because checking out the downstream monorepo requires the option --recurse-submodules, so as to also check out the submodules.
In GitHub Actions, the checkout for downstream is done like this:
- uses: actions/checkout@v2
  with:
    submodules: recursive
So checking out the downstream repo needs input submodules: recursive, but the upstream one does not, and they both use the same source file.
The solution I found is to provide the value for input submodules via an environment variable CHECKOUT_SUBMODULES, which is empty by default for the upstream repo:
env:
  CHECKOUT_SUBMODULES: ""

jobs:
  provide_data:
    steps:
      - uses: actions/checkout@v2
        with:
          submodules: ${{ env.CHECKOUT_SUBMODULES }}
Then, when copying the workflows from upstream to downstream, the value of CHECKOUT_SUBMODULES is replaced with "recursive":
env:
  CHECKOUT_SUBMODULES: "recursive"
When modifying the workflow, it is a good idea to use a regex, so that it works for different formats in the source file (such as CHECKOUT_SUBMODULES: "", CHECKOUT_SUBMODULES:"" or CHECKOUT_SUBMODULES:), and no bugs are created from this kind of assumed-to-be-harmless change.
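The tolerant-regex idea can be sketched with sed on a couple of sample lines (sed here plays the role of the PHP replacement shown later; the pattern is an illustration, not the exact one used):

```shell
# Both variants collapse to the one canonical form
printf 'CHECKOUT_SUBMODULES: ""\nCHECKOUT_SUBMODULES:\n' \
  | sed -E 's/CHECKOUT_SUBMODULES:( *".*")?/CHECKOUT_SUBMODULES: "recursive"/'
# → CHECKOUT_SUBMODULES: "recursive"
# → CHECKOUT_SUBMODULES: "recursive"
```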
The copy-workflows Composer script seen above is then not sufficient to handle this complexity.
In my next iteration, I created a PHP command CopyUpstreamMonorepoFilesCommand, to be executed via the Monorepo builder:
vendor/bin/monorepo-builder copy-upstream-monorepo-files
This command uses a custom service FileCopierSystem to copy all files from a source folder to the indicated destination, while optionally replacing their contents:
namespace PoP\GraphQLAPIPRO\Extensions\Symplify\MonorepoBuilder\SmartFile;

use Nette\Utils\Strings;
use Symplify\SmartFileSystem\Finder\SmartFinder;
use Symplify\SmartFileSystem\SmartFileSystem;

final class FileCopierSystem
{
  public function __construct(
    private SmartFileSystem $smartFileSystem,
    private SmartFinder $smartFinder,
  ) {
  }

  /**
   * @param array $patternReplacements a regex pattern to search, and its replacement
   */
  public function copyFilesFromFolder(
    string $fromFolder,
    string $toFolder,
    array $patternReplacements = []
  ): void {
    $smartFileInfos = $this->smartFinder->find([$fromFolder], '*');
    foreach ($smartFileInfos as $smartFileInfo) {
      $fromFile = $smartFileInfo->getRealPath();
      $fileContent = $this->smartFileSystem->readFile($fromFile);
      foreach ($patternReplacements as $pattern => $replacement) {
        $fileContent = Strings::replace($fileContent, $pattern, $replacement);
      }
      $toFile = $toFolder . substr($fromFile, strlen($fromFolder));
      $this->smartFileSystem->dumpFile($toFile, $fileContent);
    }
  }
}
When invoking this method to copy all workflows downstream, I also replace the value of CHECKOUT_SUBMODULES:
/**
 * Copy all workflows to `.github/`, and convert:
 * `CHECKOUT_SUBMODULES: ""`
 * into:
 * `CHECKOUT_SUBMODULES: "recursive"`
 */
$regexReplacements = [
  '#CHECKOUT_SUBMODULES:(\s+".*")?#' => 'CHECKOUT_SUBMODULES: "recursive"',
];
(new FileCopierSystem())->copyFilesFromFolder(
  'submodules/PoP/.github/workflows',
  '.github/workflows',
  $regexReplacements
);
Workflow generate_plugins.yml needs an additional replacement. When the WordPress plugin is generated, its code is downgraded from PHP 8.0 to 7.1 by invoking the script ci/downgrade/downgrade_code.sh:
- name: Downgrade code for production (to PHP 7.1)
  run: ci/downgrade/downgrade_code.sh "${{ matrix.pluginConfig.rector_downgrade_config }}" "" "${{ matrix.pluginConfig.path }}" "${{ matrix.pluginConfig.additional_rector_configs }}"
In the downstream monorepo, this file will be located under submodules/PoP/ci/downgrade/downgrade_code.sh. Then, we make the downstream workflow point to the right path with this replacement:
$regexReplacements = [
  // ...
  '#(ci/downgrade/downgrade_code\.sh)#' => 'submodules/PoP/$1',
];
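This replacement can also be sketched with sed (again standing in for the PHP service; the sample line is illustrative): the captured script path is prefixed with the submodule folder.

```shell
echo 'run: ci/downgrade/downgrade_code.sh "config"' \
  | sed -E 's#(ci/downgrade/downgrade_code\.sh)#submodules/PoP/\1#'
# → run: submodules/PoP/ci/downgrade/downgrade_code.sh "config"
```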
Configuring Packages In Monorepo Builder
File monorepo-builder.php, located at the root of the monorepo, holds the configuration for the Monorepo builder. In it we must indicate where the packages (and plugins, clients, or anything else) are located:
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;
use Symplify\MonorepoBuilder\ValueObject\Option;

return static function (ContainerConfigurator $containerConfigurator): void {
  $parameters = $containerConfigurator->parameters();
  $parameters->set(Option::PACKAGE_DIRECTORIES, [
    __DIR__ . '/packages',
    __DIR__ . '/plugins',
  ]);
};
The private monorepo must have access to all code: its own packages, plus those from the public monorepo. Thus, it must define all packages from both monorepos in the config file. The ones from the public monorepo are located under "/submodules/PoP":
return static function (ContainerConfigurator $containerConfigurator): void {
  $parameters = $containerConfigurator->parameters();
  $parameters->set(Option::PACKAGE_DIRECTORIES, [
    // public code
    __DIR__ . '/submodules/PoP/packages',
    __DIR__ . '/submodules/PoP/plugins',
    // private code
    __DIR__ . '/packages',
    __DIR__ . '/plugins',
    __DIR__ . '/clients',
  ]);
};
As can be seen, the configuration for upstream and downstream is pretty much the same, with the difference that the downstream one will:
Change the path to the public packages.
Add the private packages.
Then, it makes sense to rewrite the configuration using object-oriented programming, so that we make the code DRY (don't repeat yourself) by having a PHP class in the public repo be extended in the private repo.
Recreating The Configuration Through OOP
Let's refactor the configuration. In the public repo, file monorepo-builder.php will simply reference a new class ContainerConfigurationService, where all the action will happen:
use PoP\PoP\Config\Symplify\MonorepoBuilder\Configurators\ContainerConfigurationService;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

return static function (ContainerConfigurator $containerConfigurator): void {
  $containerConfigurationService = new ContainerConfigurationService(
    $containerConfigurator,
    __DIR__
  );
  $containerConfigurationService->configureContainer();
};
The __DIR__ param points to the root of the monorepo. It will be needed to obtain the full path to the package directories.
Class ContainerConfigurationService is now responsible for producing the configuration:
namespace PoP\PoP\Config\Symplify\MonorepoBuilder\Configurators;

use PoP\PoP\Config\Symplify\MonorepoBuilder\DataSources\PackageOrganizationDataSource;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;
use Symplify\MonorepoBuilder\ValueObject\Option;

class ContainerConfigurationService
{
  public function __construct(
    protected ContainerConfigurator $containerConfigurator,
    protected string $rootDirectory,
  ) {
  }

  public function configureContainer(): void
  {
    $parameters = $this->containerConfigurator->parameters();
    if ($packageOrganizationConfig = $this->getPackageOrganizationDataSource()) {
      $parameters->set(
        Option::PACKAGE_DIRECTORIES,
        $packageOrganizationConfig->getPackageDirectories()
      );
    }
  }

  protected function getPackageOrganizationDataSource(): ?PackageOrganizationDataSource
  {
    return new PackageOrganizationDataSource($this->rootDirectory);
  }
}
The configuration can be split across several classes. In this case, ContainerConfigurationService retrieves the package configuration through class PackageOrganizationDataSource, which has this implementation:
namespace PoP\PoP\Config\Symplify\MonorepoBuilder\DataSources;

class PackageOrganizationDataSource
{
  public function __construct(protected string $rootDir)
  {
  }

  public function getPackageDirectories(): array
  {
    return array_map(
      fn (string $packagePath) => $this->rootDir . '/' . $packagePath,
      $this->getRelativePackagePaths()
    );
  }

  public function getRelativePackagePaths(): array
  {
    return [
      'packages',
      'plugins',
    ];
  }
}
Overriding The Configuration In The Downstream Monorepo
Now that the configuration in the public monorepo is set up via OOP, we can extend it to suit the needs of the private monorepo.
In order to allow the private monorepo to autoload the PHP code from the public monorepo, we must first configure the downstream composer.json to reference the source code from the upstream, which is under path submodules/PoP/src:
{
  "autoload": {
    "psr-4": {
      "PoP\\GraphQLAPIPRO\\": "src",
      "PoP\\PoP\\": "submodules/PoP/src"
    }
  }
}
Below is file monorepo-builder.php for the private monorepo. Notice that the referenced class ContainerConfigurationService in the upstream repo belonged to the PoP\PoP namespace, but now it has switched to the PoP\GraphQLAPIPRO namespace. This class must receive the additional input $upstreamRelativeRootPath (with value "submodules/PoP") in order to recreate the full path to the public packages:
use PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\Configurators\ContainerConfigurationService;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

return static function (ContainerConfigurator $containerConfigurator): void {
  $containerConfigurationService = new ContainerConfigurationService(
    $containerConfigurator,
    __DIR__,
    'submodules/PoP'
  );
  $containerConfigurationService->configureContainer();
};
The downstream class ContainerConfigurationService overrides which PackageOrganizationDataSource class is used in the configuration:
namespace PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\Configurators;

use PoP\PoP\Config\Symplify\MonorepoBuilder\Configurators\ContainerConfigurationService as UpstreamContainerConfigurationService;
use PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\DataSources\PackageOrganizationDataSource;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

class ContainerConfigurationService extends UpstreamContainerConfigurationService
{
  public function __construct(
    ContainerConfigurator $containerConfigurator,
    string $rootDirectory,
    protected string $upstreamRelativeRootPath
  ) {
    parent::__construct(
      $containerConfigurator,
      $rootDirectory
    );
  }

  protected function getPackageOrganizationDataSource(): ?PackageOrganizationDataSource
  {
    return new PackageOrganizationDataSource(
      $this->rootDirectory,
      $this->upstreamRelativeRootPath
    );
  }
}
Finally, the downstream class PackageOrganizationDataSource contains the full path to both public and private packages:
namespace PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\DataSources;

use PoP\PoP\Config\Symplify\MonorepoBuilder\DataSources\PackageOrganizationDataSource as UpstreamPackageOrganizationDataSource;

class PackageOrganizationDataSource extends UpstreamPackageOrganizationDataSource
{
  public function __construct(
    string $rootDir,
    protected string $upstreamRelativeRootPath
  ) {
    parent::__construct($rootDir);
  }

  public function getRelativePackagePaths(): array
  {
    return array_merge(
      // Public packages - prepend them with "submodules/PoP/"
      array_map(
        fn ($upstreamPackagePath) => $this->upstreamRelativeRootPath . '/' . $upstreamPackagePath,
        parent::getRelativePackagePaths()
      ),
      // Private packages
      [
        'packages',
        'plugins',
        'clients',
      ]
    );
  }
}
Injecting The Configuration From PHP Into GitHub Actions
The Monorepo builder offers command packages-json, which we can use to inject the package paths into the GitHub Actions workflow:
jobs:
  provide_data:
    steps:
      - id: output_data
        name: Calculate matrix for packages
        run: |
          echo "::set-output name=matrix::$(vendor/bin/monorepo-builder packages-json)"
    outputs:
      matrix: ${{ steps.output_data.outputs.matrix }}
This command produces a stringified JSON. In the workflow it must be converted into a JSON object via fromJson:
jobs:
  split_monorepo:
    needs: provide_data
    strategy:
      matrix:
        package: ${{ fromJson(needs.provide_data.outputs.matrix) }}
Unfortunately, command packages-json outputs the package names but not their paths, which works when all packages are under the same folder (such as packages/). It doesn't work in our case, since public and private packages are located in different folders.
Fortunately, the Monorepo builder can be extended with custom PHP services. So I created a custom command package-entries-json (via class PackageEntriesJsonCommand), which does output the path to the package.
The workflow was then updated with the new command:
run: |
  echo "::set-output name=matrix::$(vendor/bin/monorepo-builder package-entries-json)"
Executed on the public monorepo, it produces the following packages (among many others):
[
  {
    "name": "graphql-api-for-wp",
    "path": "layers/GraphQLAPIForWP/plugins/graphql-api-for-wp"
  },
  {
    "name": "extension-demo",
    "path": "layers/GraphQLAPIForWP/plugins/extension-demo"
  },
  {
    "name": "access-control",
    "path": "layers/Engine/packages/access-control"
  },
  {
    "name": "api",
    "path": "layers/API/packages/api"
  },
  {
    "name": "api-clients",
    "path": "layers/API/packages/api-clients"
  }
]
Executed on the private monorepo, it produces the following entries (among many others):
[
  {
    "name": "graphql-api-for-wp",
    "path": "submodules/PoP/layers/GraphQLAPIForWP/plugins/graphql-api-for-wp"
  },
  {
    "name": "extension-demo",
    "path": "submodules/PoP/layers/GraphQLAPIForWP/plugins/extension-demo"
  },
  {
    "name": "access-control",
    "path": "submodules/PoP/layers/Engine/packages/access-control"
  },
  {
    "name": "api",
    "path": "submodules/PoP/layers/API/packages/api"
  },
  {
    "name": "api-clients",
    "path": "submodules/PoP/layers/API/packages/api-clients"
  },
  {
    "name": "graphql-api-pro",
    "path": "layers/GraphQLAPIForWP/plugins/graphql-api-pro"
  },
  {
    "name": "convert-case-directives",
    "path": "layers/Schema/packages/convert-case-directives"
  },
  {
    "name": "export-directive",
    "path": "layers/GraphQLByPoP/packages/export-directive"
  }
]
As can be appreciated, it works well: the configuration for the downstream monorepo contains both public and private packages, and the paths to the public ones have been prepended with "submodules/PoP".
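A quick way to sanity-check such output from the shell is to pull the paths out of the JSON (the matrix value below is sample data in the same shape, not real command output):

```shell
matrix='[{"name":"demo-plugin","path":"submodules/PoP/layers/demo-plugin"}]'
echo "$matrix" | grep -o '"path":"[^"]*"'
# → "path":"submodules/PoP/layers/demo-plugin"
```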
Skipping Public Packages In The Downstream Monorepo
So far, the downstream monorepo has included both public and private packages in its configuration. However, not every command needs to be executed on the public packages.
Take static analysis, for instance. The public monorepo already executes PHPStan on all public packages via workflow phpstan.yml, as shown in this run. If the downstream monorepo runs PHPStan once again on the public packages, it is a waste of computing time. The phpstan.yml workflow then needs to run on the private packages only.
That means that, depending on the command to execute in the downstream repo, we may want to either include both public and private packages, or only private ones.
To add the public packages or not to the downstream configuration, we adapt the downstream class PackageOrganizationDataSource to check this condition via input $includeUpstreamPackages:
namespace PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\DataSources;

use PoP\PoP\Config\Symplify\MonorepoBuilder\DataSources\PackageOrganizationDataSource as UpstreamPackageOrganizationDataSource;

class PackageOrganizationDataSource extends UpstreamPackageOrganizationDataSource
{
  public function __construct(
    string $rootDir,
    protected string $upstreamRelativeRootPath,
    protected bool $includeUpstreamPackages
  ) {
    parent::__construct($rootDir);
  }

  public function getRelativePackagePaths(): array
  {
    return array_merge(
      // Add the public packages?
      $this->includeUpstreamPackages ?
        // Public packages - prepend them with "submodules/PoP/"
        array_map(
          fn ($upstreamPackagePath) => $this->upstreamRelativeRootPath . '/' . $upstreamPackagePath,
          parent::getRelativePackagePaths()
        ) : [],
      // Private packages
      [
        'packages',
        'plugins',
        'clients',
      ]
    );
  }
}
Next, we need to provide the value $includeUpstreamPackages as either true or false, depending on the command to execute.
We can do this by replacing config file monorepo-builder.php with two other config files: monorepo-builder-with-upstream-packages.php (which passes $includeUpstreamPackages => true) and monorepo-builder-without-upstream-packages.php (which passes $includeUpstreamPackages => false):
// File monorepo-builder-without-upstream-packages.php
use PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\Configurators\ContainerConfigurationService;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

return static function (ContainerConfigurator $containerConfigurator): void {
  $containerConfigurationService = new ContainerConfigurationService(
    $containerConfigurator,
    __DIR__,
    'submodules/PoP',
    false, // This is $includeUpstreamPackages
  );
  $containerConfigurationService->configureContainer();
};
We then update ContainerConfigurationService to receive parameter $includeUpstreamPackages and pass it along to PackageOrganizationDataSource:
namespace PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\Configurators;

use PoP\PoP\Config\Symplify\MonorepoBuilder\Configurators\ContainerConfigurationService as UpstreamContainerConfigurationService;
use PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\DataSources\PackageOrganizationDataSource;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

class ContainerConfigurationService extends UpstreamContainerConfigurationService
{
  public function __construct(
    ContainerConfigurator $containerConfigurator,
    string $rootDirectory,
    protected string $upstreamRelativeRootPath,
    protected bool $includeUpstreamPackages,
  ) {
    parent::__construct(
      $containerConfigurator,
      $rootDirectory,
    );
  }

  protected function getPackageOrganizationDataSource(): ?PackageOrganizationDataSource
  {
    return new PackageOrganizationDataSource(
      $this->rootDirectory,
      $this->upstreamRelativeRootPath,
      $this->includeUpstreamPackages,
    );
  }
}
Next, we invoke the monorepo-builder with either config file, by providing the --config option:
jobs:
  provide_data:
    steps:
      - id: output_data
        name: Calculate matrix for packages
        run: |
          echo "::set-output name=matrix::$(vendor/bin/monorepo-builder package-entries-json --config=monorepo-builder-without-upstream-packages.php)"
However, as we saw earlier on, we want to keep the GitHub Actions workflows in the upstream monorepo as the single source of truth, and they clearly do not need these changes.
The solution I found for this issue is to always provide a --config option in the upstream repo, with each command getting its own config file, such as the validate command receiving the validate.php config file:
- name: Run validation
  run: vendor/bin/monorepo-builder validate --config=config/monorepo-builder/validate.php
Now, there aren’t any config recordsdata within the upstream monorepo, because it does not want them. But it surely won’t break, as a result of the Monorepo builder checks if the config file exists and, if it doesn’t, it hundreds the default config file as a substitute. So we’ll both override the config, or nothing occurs.
The downstream repo does present the config recordsdata for every command, specifying if so as to add the upstream packages or not:
Btw, as a aspect word, that is one other instance of how the multi-monorepo leaks.
// File config/monorepo-builder/validate.php
return require_once __DIR__ . '/monorepo-builder-with-upstream-packages.php';
Overriding The Configuration
We’re nearly finished. By now the downstream monorepo can override the configuration from the upstream monorepo. So all that is left to do is to supply the brand new configuration.
In school PluginDataSource I override the configuration of which WordPress plugins have to be generated, offering the PRO ones as a substitute:
namespace PoP\GraphQLAPIPRO\Config\Symplify\MonorepoBuilder\DataSources;

use PoP\PoP\Config\Symplify\MonorepoBuilder\DataSources\PluginDataSource as UpstreamPluginDataSource;

class PluginDataSource extends UpstreamPluginDataSource
{
  public function getPluginConfigEntries(): array
  {
    return [
      // GraphQL API PRO
      [
        'path' => 'layers/GraphQLAPIForWP/plugins/graphql-api-pro',
        'zip_file' => 'graphql-api-pro.zip',
        'main_file' => 'graphql-api-pro.php',
        'dist_repo_organization' => 'GraphQLAPI-PRO',
        'dist_repo_name' => 'graphql-api-pro-dist',
      ],
      // GraphQL API Extensions
      // Google Translate
      [
        'path' => 'layers/GraphQLAPIForWP/plugins/google-translate',
        'zip_file' => 'graphql-api-google-translate.zip',
        'main_file' => 'graphql-api-google-translate.php',
        'dist_repo_organization' => 'GraphQLAPI-PRO',
        'dist_repo_name' => 'graphql-api-google-translate-dist',
      ],
      // Events Manager
      [
        'path' => 'layers/GraphQLAPIForWP/plugins/events-manager',
        'zip_file' => 'graphql-api-events-manager.zip',
        'main_file' => 'graphql-api-events-manager.php',
        'dist_repo_organization' => 'GraphQLAPI-PRO',
        'dist_repo_name' => 'graphql-api-events-manager-dist',
      ],
    ];
  }
}
Creating a new release on GitHub will trigger the generate_plugins.yml workflow and generate the PRO plugins on my private monorepo:
Tadaaaaaaaa! 🎉
Conclusion
As at all times, there isn’t any “finest” resolution, solely options that will work higher relying on the context. The multi-monorepo method just isn’t appropriate to each form of venture or group. I consider the largest beneficiaries are plugin creators who launch public plugins to be upgraded to their PRO variations, and businesses customizing plugins for his or her shoppers.
In my case, I am fairly proud of this method. It takes a little bit of effort and time to get proper, however it’s a one-off funding. As soon as the set-up is over, I can simply give attention to constructing my PRO plugins, and the time financial savings regarding venture administration may be large.