feat: string replacements in deploy (#748)

mshanemc authored Oct 31, 2022
1 parent 625eb44 commit a23c6b3
Showing 46 changed files with 981 additions and 237 deletions.
18 changes: 14 additions & 4 deletions .github/workflows/test.yml
@@ -11,20 +11,30 @@ on:
 jobs:
   unit-tests:
     uses: salesforcecli/github-workflows/.github/workflows/unitTest.yml@main
+  nuts:
+    uses: salesforcecli/github-workflows/.github/workflows/nut.yml@main
+    secrets: inherit
+    strategy:
+      matrix:
+        os: [ubuntu-latest, windows-latest]
+      fail-fast: false
+    with:
+      os: ${{ matrix.os }}
+
   perf-scale-nuts-linux:
     uses: ./.github/workflows/perfScaleNut.yml
-    needs: unit-tests
+    needs: [unit-tests, nuts]
   perf-scale-nuts-windows:
     uses: ./.github/workflows/perfScaleNut.yml
-    needs: unit-tests
+    needs: [unit-tests, nuts]
     with:
       os: 'windows-latest'

   # run a quick nut on each OS to populate the cache
   # the following is highly duplicative to allow linux to start all the nuts without waiting for windows primer
   extNuts-primer-linux:
     name: extNUTs-linux-prime
-    needs: unit-tests
+    needs: [unit-tests, nuts]
     uses: salesforcecli/github-workflows/.github/workflows/externalNut.yml@main
     with:
       packageName: '@salesforce/source-deploy-retrieve'
@@ -38,7 +48,7 @@ jobs:

   extNuts-primer-windows:
     name: extNUTs-windows-prime
-    needs: unit-tests
+    needs: [unit-tests, nuts]
     uses: salesforcecli/github-workflows/.github/workflows/externalNut.yml@main
     with:
       packageName: '@salesforce/source-deploy-retrieve'
9 changes: 1 addition & 8 deletions HANDBOOK.md
@@ -20,7 +20,6 @@
 - [Overview](#overview-2)
 - [Converting metadata](#converting-metadata)
 - [The conversion pipeline](#the-conversion-pipeline)
-- [ComponentReader](#componentreader)
 - [ComponentConverter](#componentconverter)
 - [ComponentWriter](#componentwriter)
 - [ConvertContext](#convertcontext)
@@ -214,7 +213,7 @@ A `TreeContainer` is an encapsulation of a file system that enables I/O against

 Clients can implement new tree containers by extending the `TreeContainer` base class and expanding functionality. Not all methods of a tree container have to be implemented, but an error will be thrown if the container is being used in a context that requires particular methods.

-💡*The author, Brian, demonstrated the extensibility of tree containers for a side project by creating a* `GitTreeContainer`_. This enabled resolving components against a git object tree, allowing us to perform component diffs between git refs and analyze GitHub projects. See the [SFDX Badge Generator](https://sfdx-badge.herokuapp.com/). This could be expanded into a plugin of some sort._
+💡_The author, Brian, demonstrated the extensibility of tree containers for a side project by creating a_ `GitTreeContainer`_. This enabled resolving components against a git object tree, allowing us to perform component diffs between git refs and analyze GitHub projects. See the [SFDX Badge Generator](https://sfdx-badge.herokuapp.com/). This could be expanded into a plugin of some sort._

 #### Creating mock components with the VirtualTreeContainer

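To make the extension point above concrete, here is a minimal sketch of a custom tree container backed by an in-memory `Map` — not the real `GitTreeContainer`. The overridden member list (`exists`, `isDirectory`, `readDirectory`, `readFile`, `readFileSync`, `stream`) follows the handbook's description of the base class, so treat the exact signatures as assumptions and verify them against the shipped typings:

```typescript
import { Readable } from 'stream';
import { TreeContainer } from '@salesforce/source-deploy-retrieve';

// Hypothetical example (not GitTreeContainer): resolve components from an
// in-memory Map of file paths to contents, e.g. for tests or previews.
class InMemoryTreeContainer extends TreeContainer {
  public constructor(private readonly files: Map<string, Buffer>) {
    super();
  }

  public exists(fsPath: string): boolean {
    return this.files.has(fsPath) || this.isDirectory(fsPath);
  }

  public isDirectory(fsPath: string): boolean {
    // treat a path as a directory if any stored file lives underneath it
    const prefix = fsPath.endsWith('/') ? fsPath : `${fsPath}/`;
    return [...this.files.keys()].some((key) => key.startsWith(prefix));
  }

  public readDirectory(fsPath: string): string[] {
    const prefix = fsPath.endsWith('/') ? fsPath : `${fsPath}/`;
    const children = [...this.files.keys()]
      .filter((key) => key.startsWith(prefix))
      .map((key) => key.slice(prefix.length).split('/')[0]);
    return [...new Set(children)];
  }

  public readFile(fsPath: string): Promise<Buffer> {
    return Promise.resolve(this.readFileSync(fsPath));
  }

  public readFileSync(fsPath: string): Buffer {
    const data = this.files.get(fsPath);
    if (!data) {
      throw new Error(`file not found: ${fsPath}`);
    }
    return data;
  }

  public stream(fsPath: string): Readable {
    // wrap the buffer in an array so it is emitted as a single chunk
    return Readable.from([this.readFileSync(fsPath)]);
  }
}
```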
@@ -315,12 +314,6 @@ const converter = new MetadataConverter();

 When `convert` is called, the method prepares the inputs for setting up the conversion pipeline. The pipeline consists of chaining three custom NodeJS streams, one for each stage of the copy operation. To more deeply understand what is happening in the conversion process, it’s recommended to familiarize yourself with streaming concepts and the NodeJS API. See [Stream NodeJS documentation](https://nodejs.org/api/stream.html) and [Understanding Streams in NodeJS](https://nodesource.com/blog/understanding-streams-in-nodejs/).

-#### ComponentReader
-
-The reader is fairly simple, it takes a collection of source components and implements the stream API to push them out one-by-one.
-
-🧽 _When this aspect of the library was first written,_ `Readable.from(iterable)` _was not yet available. This simple API could probably replace the_ `ComponentReader`_._
-
 #### ComponentConverter

 Here is where file transformation is done, but without being written to the destination yet. Similar to how source resolution uses adapters to determine how to construct components for a type (see [The resolver constructs components based…](#resolving-from-metadata-files)), conversion uses `MetadataTransformer` implementations to describe the transformations. As you might guess, types are assigned a transformer, if they need one, in their metadata registry definition, otherwise the default one is used. Each transformer implements a `toSourceFormat` and a `toMetadataFormat` method, which are called by the `ComponentConverter` based on what the target format is. The methods will return a collection of `WriteInfo` objects, which as we’ve been touching on are “descriptions” of how to write a given file.
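As an illustration of the transformer contract described above, here is a hedged sketch of a pass-through transformer that "transforms" every file into a copy of itself. The `WriteInfo` shape shown (a readable `source` plus an `output` path) is inferred from this description rather than copied from the library, so check it against the real typings:

```typescript
import type { Readable } from 'stream';
import type { SourceComponent } from '@salesforce/source-deploy-retrieve';

// Assumed shape, based on the handbook's description of the contract.
type WriteInfo = { source: Readable; output: string };

// A hedged sketch of a pass-through transformer: each file of a component
// becomes a WriteInfo that copies it to the same relative path. A real
// transformer computes new destination paths and may reshape content.
class PassThroughTransformer {
  public toMetadataFormat(component: SourceComponent): Promise<WriteInfo[]> {
    return Promise.resolve(this.copyAll(component));
  }

  public toSourceFormat(component: SourceComponent): Promise<WriteInfo[]> {
    return Promise.resolve(this.copyAll(component));
  }

  private copyAll(component: SourceComponent): WriteInfo[] {
    // walkContent() lists the component's content files; the -meta.xml file
    // is included the same way
    return [component.xml, ...component.walkContent()]
      .filter((path): path is string => typeof path === 'string')
      .map((path) => ({ source: component.tree.stream(path), output: path }));
  }
}
```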
5 changes: 4 additions & 1 deletion package.json
@@ -34,6 +34,7 @@
     "graceful-fs": "^4.2.10",
     "ignore": "^5.2.0",
     "mime": "2.6.0",
+    "minimatch": "^5.1.0",
     "proxy-agent": "^5.0.0",
     "proxy-from-env": "^1.1.0",
     "unzipper": "0.10.11"
@@ -47,6 +48,7 @@
     "@types/archiver": "^5.3.1",
     "@types/deep-equal-in-any-order": "^1.0.1",
     "@types/mime": "2.0.3",
+    "@types/minimatch": "^5.1.2",
     "@types/proxy-from-env": "^1.0.1",
     "@types/shelljs": "^0.8.11",
     "@types/unzipper": "^0.10.5",
@@ -98,6 +100,7 @@
     "pretest": "sf-compile-test",
     "repl": "node --inspect ./scripts/repl.js",
     "test": "sf-test",
+    "test:nuts": "mocha \"test/nuts/local/**/*.nut.ts\" --timeout 500000",
     "test:nuts:scale": "mocha \"test/nuts/scale/eda.nut.ts\" --timeout 500000; mocha \"test/nuts/scale/lotsOfClasses.nut.ts\" --timeout 500000; mocha \"test/nuts/scale/lotsOfClassesOneDir.nut.ts\" --timeout 500000",
     "test:nuts:scale:record": "yarn test:nuts:scale && git add . && git commit -m \"test: record perf [ci skip]\" --no-verify && git push --no-verify",
     "test:registry": "mocha ./test/registry/registryCompleteness.test.ts --timeout 50000",
@@ -114,4 +117,4 @@
     "yarn": "1.22.4"
   },
   "config": {}
-}
\ No newline at end of file
+}
35 changes: 26 additions & 9 deletions src/client/metadataApiDeploy.ts
@@ -10,6 +10,7 @@ import { create as createArchive } from 'archiver';
 import * as fs from 'graceful-fs';
 import { Lifecycle, Messages, SfError } from '@salesforce/core';
 import { ensureArray } from '@salesforce/kit';
+import { ReplacementEvent } from '../convert/types';
 import { MetadataConverter } from '../convert';
 import { ComponentLike, SourceComponent } from '../resolve';
 import { ComponentSet } from '../collections';
@@ -31,16 +32,15 @@ Messages.importMessagesDirectory(__dirname);
 const messages = Messages.load('@salesforce/source-deploy-retrieve', 'sdr', ['error_no_job_id']);

 export class DeployResult implements MetadataTransferResult {
-  public readonly response: MetadataApiDeployStatus;
-  public readonly components: ComponentSet;
   private readonly diagnosticUtil = new DiagnosticUtil('metadata');
   private fileResponses: FileResponse[];
   private readonly shouldConvertPaths = sep !== posix.sep;

-  public constructor(response: MetadataApiDeployStatus, components: ComponentSet) {
-    this.response = response;
-    this.components = components;
-  }
+  public constructor(
+    public readonly response: MetadataApiDeployStatus,
+    public readonly components: ComponentSet,
+    public readonly replacements: Map<string, string[]> = new Map<string, string[]>()
+  ) {}

   public getFileResponses(): FileResponse[] {
     // this involves FS operations, so only perform once!
@@ -236,6 +236,7 @@ export class MetadataApiDeploy extends MetadataTransfer<MetadataApiDeployStatus,
     },
   };
   private options: MetadataApiDeployOptions;
+  private replacements: Map<string, string[]> = new Map();
   private orgId: string;
   // Keep track of rest deploys separately since Connection.deploy() removes it
   // from the apiOptions and we need it for telemetry.
@@ -310,6 +311,7 @@ export class MetadataApiDeploy extends MetadataTransfer<MetadataApiDeployStatus,
   }

   protected async pre(): Promise<AsyncResult> {
+    const LifecycleInstance = Lifecycle.getInstance();
     const connection = await this.getConnection();
     // store for use in the scopedPostDeploy event
     this.orgId = connection.getAuthInfoFields().orgId;
@@ -320,11 +322,26 @@ export class MetadataApiDeploy extends MetadataTransfer<MetadataApiDeployStatus,
     }
     // only do event hooks if source, (NOT a metadata format) deploy
     if (this.options.components) {
-      await Lifecycle.getInstance().emit('scopedPreDeploy', {
+      await LifecycleInstance.emit('scopedPreDeploy', {
         componentSet: this.options.components,
         orgId: this.orgId,
       } as ScopedPreDeploy);
     }
+
+    LifecycleInstance.on(
+      'replacement',
+      async (replacement: ReplacementEvent) =>
+        // lifecycle listeners have to be async, so the work is wrapped in a promise
+        new Promise((resolve) => {
+          if (!this.replacements.has(replacement.filename)) {
+            this.replacements.set(replacement.filename, [replacement.replaced]);
+          } else {
+            this.replacements.get(replacement.filename).push(replacement.replaced);
+          }
+          resolve();
+        })
+    );
+
     const [zipBuffer] = await Promise.all([this.getZipBuffer(), this.maybeSaveTempDirectory('metadata')]);
     // SDR modifies what the mdapi expects by adding a rest param
     const { rest, ...optionsWithoutRest } = this.options.apiOptions;
@@ -370,7 +387,7 @@ export class MetadataApiDeploy extends MetadataTransfer<MetadataApiDeployStatus,
         `Error trying to compile/send deploy telemetry data for deploy ID: ${this.id}\nError: ${error.message}`
       );
     }
-    const deployResult = new DeployResult(result, this.components);
+    const deployResult = new DeployResult(result, this.components, this.replacements);
     // only do event hooks if source, (NOT a metadata format) deploy
     if (this.options.components) {
       await lifecycle.emit('scopedPostDeploy', { deployResult, orgId: this.orgId } as ScopedPostDeploy);
@@ -387,7 +404,7 @@ export class MetadataApiDeploy extends MetadataTransfer<MetadataApiDeployStatus,
       const zip = createArchive('zip', { zlib: { level: 9 } });
       // anywhere not at the root level is fine
       zip.directory(this.options.mdapiPath, 'zip');
-      void zip.finalize();
+      await zip.finalize();
       return stream2buffer(zip);
     }
     // read the zip into a buffer
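Taken together, the changes above mean a caller can see which files had string replacements applied once the deploy finishes. A minimal usage sketch — the project path and username are placeholders, and an authenticated org is assumed:

```typescript
import { ComponentSet } from '@salesforce/source-deploy-retrieve';

const deployWithReplacementReport = async (): Promise<void> => {
  // placeholders: a real source directory and authenticated username are assumed
  const deploy = await ComponentSet.fromSource('force-app').deploy({
    usernameOrConnection: 'user@example.com',
  });
  const result = await deploy.pollStatus();

  // DeployResult.replacements is the Map<string, string[]> populated from the
  // 'replacement' lifecycle events collected in pre() above
  for (const [filename, replaced] of result.replacements) {
    console.log(`${filename}: replaced ${replaced.join(', ')}`);
  }
};

void deployWithReplacementReport();
```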
1 change: 0 additions & 1 deletion src/client/types.ts
@@ -64,7 +64,6 @@ interface FileResponseFailure extends FileResponseBase {
 }

 export type FileResponse = FileResponseSuccess | FileResponseFailure;
-
 export interface MetadataTransferResult {
   response: MetadataRequestStatus;
   components: ComponentSet;
13 changes: 4 additions & 9 deletions src/convert/convertContext.ts
@@ -206,15 +206,10 @@ class NonDecompositionFinalizer extends ConvertTransactionFinalizer<NonDecomposi

     // nondecomposed metadata types can exist in multiple locations under the same name
     // so we have to find all components that could potentially match inbound components
-    let allNonDecomposed: SourceComponent[];
-
-    if (pkgPaths.includes(defaultDirectory)) {
-      allNonDecomposed = this.getAllComponentsOfType(pkgPaths, this.transactionState.exampleComponent.type.name);
-    } else {
-      // defaultDirectory isn't a package, assumes it's the target output dir for conversion
-      // so no need to scan this folder
-      allNonDecomposed = [];
-    }
+    const allNonDecomposed = pkgPaths.includes(defaultDirectory)
+      ? this.getAllComponentsOfType(pkgPaths, this.transactionState.exampleComponent.type.name)
+      : // defaultDirectory isn't a package, assume it's the target output dir for conversion so don't scan folder
+        [];

     // prepare 3 maps to simplify component merging
     await this.initMergeMap(allNonDecomposed);
21 changes: 15 additions & 6 deletions src/convert/metadataConverter.ts
@@ -4,6 +4,7 @@
  * Licensed under the BSD 3-Clause license.
  * For full license text, see LICENSE.txt file in the repo root or https://opensource.org/licenses/BSD-3-Clause
  */
+import { Readable, PassThrough } from 'stream';
 import { dirname, join, normalize } from 'path';
 import { Messages, SfError } from '@salesforce/core';
 import { promises } from 'graceful-fs';
@@ -12,8 +13,9 @@ import { ensureDirectoryExists } from '../utils/fileSystemHandler';
 import { SourcePath } from '../common';
 import { ComponentSet, DestructiveChangesType } from '../collections';
 import { RegistryAccess } from '../registry';
-import { ComponentConverter, ComponentReader, pipeline, StandardWriter, ZipWriter } from './streams';
+import { ComponentConverter, pipeline, StandardWriter, ZipWriter } from './streams';
 import { ConvertOutputConfig, ConvertResult, DirectoryConfig, SfdxFileFormat, ZipConfig } from './types';
+import { getReplacementMarkingStream } from './replacements';

 Messages.importMessagesDirectory(__dirname);
 const messages = Messages.load('@salesforce/source-deploy-retrieve', 'sdr', [
@@ -32,6 +34,7 @@ export class MetadataConverter {
   public constructor(registry = new RegistryAccess()) {
     this.registry = registry;
   }
+  // eslint-disable-next-line complexity
   public async convert(
     comps: ComponentSet | Iterable<SourceComponent>,
     targetFormat: SfdxFileFormat,
@@ -43,7 +46,7 @@
       (comps instanceof ComponentSet ? Array.from(comps.getSourceComponents()) : comps) as SourceComponent[]
     ).filter((comp) => comp.type.isAddressable !== false);

-    const isSource = targetFormat === 'source';
+    const targetFormatIsSource = targetFormat === 'source';
     const tasks: Array<Promise<void>> = [];

     let writer: StandardWriter | ZipWriter;
@@ -59,7 +62,7 @@
         packagePath = getPackagePath(output);
         defaultDirectory = packagePath;
         writer = new StandardWriter(packagePath);
-        if (!isSource) {
+        if (!targetFormatIsSource) {
           const manifestPath = join(packagePath, MetadataConverter.PACKAGE_XML_FILE);
           tasks.push(
             promises.writeFile(manifestPath, await cs.getPackageXml()),
@@ -78,13 +81,16 @@
         if (output.packageName) {
           cs.fullName = output.packageName;
         }

         packagePath = getPackagePath(output);
         defaultDirectory = packagePath;
         writer = new ZipWriter(packagePath);
-        if (!isSource) {
+        if (!targetFormatIsSource) {
           writer.addToZip(await cs.getPackageXml(), MetadataConverter.PACKAGE_XML_FILE);
+
+          // for each of the destructive changes in the component set, convert and write the correct metadata
+          // to each manifest

           for (const destructiveChangeType of cs.getTypesOfDestructiveChanges()) {
             writer.addToZip(
               // TODO: can this be safely parallelized?
@@ -96,7 +102,7 @@
         }
         break;
       case 'merge':
-        if (!isSource) {
+        if (!targetFormatIsSource) {
          throw new SfError(messages.getMessage('error_merge_metadata_target_unsupported'));
         }
         defaultDirectory = output.defaultDirectory;
@@ -111,7 +117,10 @@
     }

     const conversionPipeline = pipeline(
-      new ComponentReader(components),
+      Readable.from(components),
+      !targetFormatIsSource && (process.env.SF_APPLY_REPLACEMENTS_ON_CONVERT === 'true' || output.type === 'zip')
+        ? (await getReplacementMarkingStream()) ?? new PassThrough({ objectMode: true })
+        : new PassThrough({ objectMode: true }),
       new ComponentConverter(targetFormat, this.registry, mergeSet, defaultDirectory),
       writer
     );
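The reworked pipeline starts from `Readable.from(components)` (made possible by the `ComponentReader` removal noted in HANDBOOK.md above) and splices in a replacement-marking stream only when producing a metadata-format zip, or when `SF_APPLY_REPLACEMENTS_ON_CONVERT` is `true`; otherwise an object-mode `PassThrough` keeps the stage count constant. A standalone sketch of that gating pattern, with a stand-in for `getReplacementMarkingStream()`:

```typescript
import { PassThrough, Readable, Transform, Writable } from 'stream';
import { pipeline } from 'stream/promises';

// Stand-in for getReplacementMarkingStream(): tags each object that flows
// through; SDR's real stream marks components for string replacement.
const makeMarkerStream = (): Transform =>
  new Transform({
    objectMode: true,
    transform(chunk: { fullName: string }, _encoding, callback): void {
      callback(null, { ...chunk, markedForReplacements: true });
    },
  });

const applyReplacements = process.env.SF_APPLY_REPLACEMENTS_ON_CONVERT === 'true';

const run = async (): Promise<void> => {
  await pipeline(
    Readable.from([{ fullName: 'MyClass' }, { fullName: 'MyLayout' }]),
    // keep the stage count fixed: a no-op PassThrough when replacements are off
    applyReplacements ? makeMarkerStream() : new PassThrough({ objectMode: true }),
    new Writable({
      objectMode: true,
      write(chunk, _encoding, callback): void {
        console.log(chunk); // the real pipeline converts and writes the component here
        callback();
      },
    })
  );
};

void run();
```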