feat: ignore package directories

mcarvin8 committed Jan 14, 2025
1 parent c5a39c7 commit 2425eee

Showing 8 changed files with 141 additions and 61 deletions.
54 changes: 54 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,54 @@
# Contributing

Contributions are welcome! If you would like to contribute, please fork the repository, make your changes, and submit a pull request.

## Requirements

- Node >= 18.0.0
- yarn

## Installation

### 1) Download the repository

```bash
git clone git@github.com:mcarvin8/sf-decomposer.git
```

### 2) Install Dependencies

This will install all the tools needed to contribute.

```bash
yarn
```

### 3) Build application

```bash
yarn build
```

Rebuild every time you make a change to the source and need to test locally.

## Testing

When developing, run the provided tests for new additions.

```bash
# run unit tests
yarn test
```

To run the non-unit tests, rebuild the application and then run:

```bash
# run non-unit tests
yarn test:nuts
```

## Unique ID Elements

Unique ID elements are used to name the decomposed files created for nested elements. The file that contains the leaf elements will always match the original file name.

To add more unique ID elements for a metadata type, you can update the `src/metadata/uniqueIdElements.json` file. The metadata type's suffix should be used as the key.
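
For illustration, a new entry might look like the sketch below. This assumes each suffix key maps to the name of the XML element used as the unique ID; check the existing entries in `src/metadata/uniqueIdElements.json` for the exact value shape, which may differ (for example, it may list several candidate elements). The suffix and element names here are hypothetical.

```json
{
  "labels": "fullName",
  "myCustomSuffix": "apiName"
}
```
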
100 changes: 48 additions & 52 deletions README.md
@@ -19,8 +19,6 @@
- [Ignore Files](#ignore-files)
- [`.forceignore` updates](#.forceignore-updates)
- [`.gitignore` updates](#.gitignore-updates)
- [Contributing](#contributing)
- [Unique ID Elements](#unique-id-elements)
- [Issues](#issues)
- [License](#license)
</details>
@@ -43,11 +41,11 @@ sf plugins install sf-decomposer@x.y.z

## Why Use this Plugin?

Why should you consider using this Salesforce CLI Plugin over Salesforce's decomposition:
Why should you consider using this Salesforce CLI Plugin over Salesforce's decomposition?

- Salesforce's decomposition betas are evaluated for each metadata type before they are considered. My plugin supports the vast majority of Salesforce metadata types available from the Metadata API.
- Salesforce's decomposition is all or nothing for each metadata type: if you want to decompose workflows, all of your workflows must be decomposed to work with Salesforce's approach. My plugin allows you to selectively decompose each metadata type.
- See [Ignore Files when Decomposing](#ignore-files-when-decomposing)
- See [Ignore Files when Decomposing](#ignore-files-when-decomposing)
- Some metadata types, such as permission sets, may only be partially decomposed by Salesforce depending on which designs are picked. My plugin allows for total decomposition, so a user who wants to fully decompose permission sets can use this plugin.

## Commands
@@ -59,36 +57,41 @@ The `sf-decomposer` supports 2 commands:

## `sf decomposer decompose`

Decomposes the original metadata files in all local package directories into smaller files for version control.
Decomposes the original metadata files in all local package directories into smaller files for version control. If unique ID elements are found, the decomposed files will be named using them. Otherwise, the decomposed files will be named with the SHA-256 hash of the element contents.

<img src="https://raw.githubusercontent.com/mcarvin8/sf-decomposer/main/.github/images/decomposed-perm-set.png">
Decomposed Permission Sets named using unique ID elements

<br>

<img src="https://raw.githubusercontent.com/mcarvin8/sf-decomposer/main/.github/images/decomposed-labels.png">
Decomposed Custom Labels named using unique ID elements

<br>

<img src="https://raw.githubusercontent.com/mcarvin8/sf-decomposer/main/.github/images/decomposed-apps-hashes.png">
Decomposed Application named using SHA-256 hashes of elements

<br>

```
USAGE
$ sf decomposer decompose -m <value> -f <value> [--prepurge --postpurge --debug --json]
$ sf decomposer decompose -m <value> -f <value> [-i <value> --prepurge --postpurge --debug --json]
FLAGS
-m, --metadata-type=<value> The metadata suffix to process, such as 'flow', 'labels', etc. You can provide this flag multiple times to process multiple metadata types with a single command.
-f, --format=<value> [default: 'xml'] The file type for the decomposed files. Must match what format you provide for recompose.
--prepurge [default: false] If provided, purge directories of pre-existing decomposed files.
--postpurge [default: false] If provided, purge the original files after decomposing them.
--debug [default: false] If provided, log debugging results to a text file (disassemble.log).
-m, --metadata-type=<value> The metadata suffix to process, such as 'flow', 'labels', etc. You can provide this flag multiple times to process multiple metadata types with a single command.
-f, --format=<value> [default: 'xml'] The file type for the decomposed files. Must match what format you provide for recompose.
-i, --ignore-package-directory=<value> Package directories to ignore. Should be as they appear in the "sfdx-project.json" file.
Can be declared multiple times.
--prepurge [default: false] If provided, purge directories of pre-existing decomposed files.
--postpurge [default: false] If provided, purge the original files after decomposing them.
--debug [default: false] If provided, log debugging results to a text file (disassemble.log).
GLOBAL FLAGS
--json Format output as json.
DESCRIPTION
Decomoose large metadata files into smaller files.
Decompose large metadata files into smaller files.
You should run this after you retrieve metadata from an org.
@@ -101,6 +104,10 @@ EXAMPLES
$ sf decomposer decompose -m "flow" -m "labels" -f "xml" --prepurge --postpurge --debug
Decompose flows except for those in the "force-app" package directory.
$ sf decomposer decompose -m "flow" -i "force-app"
```
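
For example, the ignore flag can be repeated to skip several package directories at once; the directory names below are hypothetical and must match entries in your `sfdx-project.json`.

```bash
# decompose flows in every package directory except "force-app" and "unpackaged"
sf decomposer decompose -m "flow" -f "xml" -i "force-app" -i "unpackaged" --prepurge --postpurge
```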

## `sf decomposer recompose`
@@ -109,19 +116,21 @@ Recompose decomposed files into deployment-compatible files.

```
USAGE
$ sf decomposer recompose -m <value> -f <value> [--postpurge --debug --json]
$ sf decomposer recompose -m <value> -f <value> [-i <value> --postpurge --debug --json]
FLAGS
-m, --metadata-type=<value> The metadata suffix to process, such as 'flow', 'labels', etc. You can provide this flag multiple times to process multiple metadata types with a single command.
-f, --format=<value> [default: 'xml'] The file format for the decomposed files.
--postpurge [default: false] If provided, purge the decomposed files after recomposing them.
--debug [default: false] If provided, log debugging results to a text file (disassemble.log).
-m, --metadata-type=<value> The metadata suffix to process, such as 'flow', 'labels', etc. You can provide this flag multiple times to process multiple metadata types with a single command.
-f, --format=<value> [default: 'xml'] The file format for the decomposed files.
-i, --ignore-package-directory=<value> Package directories to ignore. Should be as they appear in the "sfdx-project.json" file.
Can be declared multiple times.
--postpurge [default: false] If provided, purge the decomposed files after recomposing them.
--debug [default: false] If provided, log debugging results to a text file (disassemble.log).
GLOBAL FLAGS
--json Format output as json.
DESCRIPTION
This command will read the decomposed files and recreate deployment-compatible metadata files in each package directory.
Recompose the decomposed files into deployment-compatible metadata files.
You should run this before you deploy decomposed metadata to an org.
@@ -134,6 +143,10 @@ EXAMPLES
$ sf decomposer recompose -m "flow" -m "labels" -f "xml" --postpurge --debug
Recompose flows except for those in the "force-app" package directory.
$ sf decomposer recompose -m "flow" -i "force-app"
```

## Supported Metadata
@@ -170,29 +183,22 @@ Here are some examples:

### Metadata Exceptions

`botVersion` is blocked from being ran directly. Please use the `bot` meta suffix to decompose and recompose bots and bot versions.

```
Error (1): `botVersion` suffix should not be used. Please use `bot` to decompose/recompose bot and bot version files.
```

Custom Objects are not supported by this plugin.

```
Error (1): Custom Objects are not supported by this plugin.
```

Metadata types such as Apex Classes, Apex Components, Triggers, etc. with certain SDR adapter strategies (`matchingContentFile`, `digitalExperience`, `mixedContent`, `bundle`) are not supported by this plugin.

```
Error (1): Metadata types with [matchingContentFile, digitalExperience, mixedContent, bundle] strategies are not supported by this plugin.
```

Children metadata types (ex: custom fields) are not supported and will result in this general error:

```
Error (1): Metadata type not found for the given suffix: field.
```
- `botVersion` is blocked from being run directly. Please use the `bot` meta suffix to decompose and recompose bots and bot versions.
```
Error (1): `botVersion` suffix should not be used. Please use `bot` to decompose/recompose bot and bot version files.
```
- Custom Objects are not supported by this plugin.
```
Error (1): Custom Objects are not supported by this plugin.
```
- Metadata types such as Apex Classes, Apex Components, Triggers, etc. with certain SDR adapter strategies (`matchingContentFile`, `digitalExperience`, `mixedContent`, `bundle`) are not supported by this plugin.
```
Error (1): Metadata types with [matchingContentFile, digitalExperience, mixedContent, bundle] strategies are not supported by this plugin.
```
- Children metadata types (e.g. custom fields) are not supported and will result in this general error:
```
Error (1): Metadata type not found for the given suffix: field.
```

## Warnings and Logging

Expand Down Expand Up @@ -224,9 +230,7 @@ Recommend adding the `disassemble.log` to your `.gitignore` file if you are usin

## Ignore Files when Decomposing

If you wish, you can create an ignore file in the root of your repository named `.sfdecomposerignore` to ignore specific XMLs when running the decompose command.

The ignore file should follow [.gitignore spec 2.22.1](https://git-scm.com/docs/gitignore).
If you wish, you can create an ignore file in the root of your repository named `.sfdecomposerignore` to ignore specific XMLs when running the decompose command. The ignore file should follow [.gitignore spec 2.22.1](https://git-scm.com/docs/gitignore).
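
For illustration, a `.sfdecomposerignore` with hypothetical patterns might look like this:

```
# ignore one specific original metadata file
force-app/main/default/labels/CustomLabels.labels-meta.xml

# ignore all workflow XMLs
**/*.workflow-meta.xml
```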

When the decompose command is run with the `--debug` flag and it processes a file that matches an entry in `.sfdecomposerignore`, a warning will be printed to the `disassemble.log`:

Expand Down Expand Up @@ -353,14 +357,6 @@ Your VCS should also ignore the log created by the `xml-disassembler` package.
disassemble.log
```

## Contributing

Contributions are welcome! If you would like to contribute, please fork the repository, make your changes, and submit a pull request.

### Unique ID Elements

To add more unique ID elements for a metadata type, you can update the `src/metadata/uniqueIdElements.json` file. The metadata type's suffix should be used as the key.

## Issues

If you encounter any issues, please create an issue in the repository's [issue tracker](https://github.com/mcarvin8/sf-decomposer/issues). Please also create issues for feature enhancements or to request support for newer metadata types added to the [SDR toolkit](https://github.com/forcedotcom/source-deploy-retrieve).
9 changes: 7 additions & 2 deletions messages/decomposer.decompose.md
@@ -4,14 +4,15 @@ Decomposes the metadata files created by retrievals.

# description

This command will read the original metadata files and decompose them into smaller files. The decomposed file format can be XML, YAML, or JSON.
Decompose large metadata files into smaller files.

You should run this after you retrieve metadata from an org and before you commit the metadata to your git repository.
You should run this after you retrieve metadata from an org.

# examples

- `sf decomposer decompose -m "flow" -f "xml" --prepurge --postpurge --debug`
- `sf decomposer decompose -m "flow" -m "labels" -f "xml" --prepurge --postpurge --debug`
- `sf decomposer decompose -m "flow" -f "xml" -i "force-app"`

# flags.metadata-type.summary

@@ -32,3 +33,7 @@ If provided, debug to a log file.
# flags.format.summary

File format for the decomposed files.

# flags.ignore-package-directory.summary

Ignore a package directory.
7 changes: 6 additions & 1 deletion messages/decomposer.recompose.md
@@ -4,14 +4,15 @@ Recomposes the files created by the `decompose` command before deployments.

# description

This command will read the decomposed files and recreate deployment-compatible metadata files in each package directory.
Recompose the decomposed files into deployment-compatible metadata files.

You should run this before you deploy decomposed metadata to an org.

# examples

- `sf decomposer recompose -m "flow" -f "xml" --postpurge --debug`
- `sf decomposer recompose -m "flow" -m "labels" -f "xml" --postpurge --debug`
- `sf decomposer recompose -m "flow" -i "force-app"`

# flags.metadata-type.summary

@@ -28,3 +29,7 @@ If provided, debug to a log file.
# flags.format.summary

File format for the decomposed files.

# flags.ignore-package-directory.summary

Ignore a package directory.
9 changes: 8 additions & 1 deletion src/commands/decomposer/decompose.ts
@@ -48,6 +48,12 @@ export default class DecomposerDecompose extends SfCommand<DecomposerResult> {
default: 'xml',
options: DECOMPOSED_FILE_TYPES,
}),
'ignore-package-directory': Flags.directory({
summary: messages.getMessage('flags.ignore-package-directory.summary'),
char: 'i',
required: false,
multiple: true,
}),
};

public async run(): Promise<DecomposerResult> {
@@ -57,8 +63,9 @@ export default class DecomposerDecompose extends SfCommand<DecomposerResult> {
const postpurge = flags['postpurge'];
const debug = flags['debug'];
const format = flags['format'];
const ignoreDirs = flags['ignore-package-directory'];
for (const metadataType of metadataTypes) {
const { metaAttributes, ignorePath } = await getRegistryValuesBySuffix(metadataType, 'decompose');
const { metaAttributes, ignorePath } = await getRegistryValuesBySuffix(metadataType, 'decompose', ignoreDirs);

const currentLogFile = await readOriginalLogFile(LOG_FILE);
await decomposeFileHandler(metaAttributes, prepurge, postpurge, debug, format, ignorePath);
9 changes: 8 additions & 1 deletion src/commands/decomposer/recompose.ts
@@ -43,6 +43,12 @@ export default class DecomposerRecompose extends SfCommand<DecomposerResult> {
default: 'xml',
options: DECOMPOSED_FILE_TYPES,
}),
'ignore-package-directory': Flags.directory({
summary: messages.getMessage('flags.ignore-package-directory.summary'),
char: 'i',
required: false,
multiple: true,
}),
};

public async run(): Promise<DecomposerResult> {
@@ -51,8 +57,9 @@ export default class DecomposerRecompose extends SfCommand<DecomposerResult> {
const postpurge = flags['postpurge'];
const debug = flags['debug'];
const format = flags['format'];
const ignoreDirs = flags['ignore-package-directory'];
for (const metadataType of metadataTypes) {
const { metaAttributes } = await getRegistryValuesBySuffix(metadataType, 'recompose');
const { metaAttributes } = await getRegistryValuesBySuffix(metadataType, 'recompose', ignoreDirs);

const currentLogFile = await readOriginalLogFile(LOG_FILE);
await recomposeFileHandler(metaAttributes, postpurge, debug, format);
9 changes: 7 additions & 2 deletions src/metadata/getPackageDirectories.ts
@@ -1,15 +1,16 @@
'use strict';
/* eslint-disable no-await-in-loop */

import { resolve, join } from 'node:path';
import { resolve, join, basename } from 'node:path';
import { readFile, readdir, stat } from 'node:fs/promises';

import { getRepoRoot } from '../service/getRepoRoot.js';
import { SfdxProject } from '../helpers/types.js';
import { IGNORE_FILE } from '../helpers/constants.js';

export async function getPackageDirectories(
metaDirectory: string
metaDirectory: string,
ignoreDirs: string[] | undefined
): Promise<{ metadataPaths: string[]; ignorePath: string }> {
const { repoRoot, dxConfigFilePath } = await getRepoRoot();
if (!repoRoot || !dxConfigFilePath) {
@@ -19,9 +20,13 @@ export async function getPackageDirectories(
const ignorePath = resolve(repoRoot, IGNORE_FILE);
const sfdxProjectRaw: string = await readFile(dxConfigFilePath, 'utf-8');
const sfdxProject: SfdxProject = JSON.parse(sfdxProjectRaw) as SfdxProject;
const normalizedIgnoreDirs = (ignoreDirs ?? []).map((dir) => basename(dir));
const packageDirectories = sfdxProject.packageDirectories.map((directory) => resolve(repoRoot, directory.path));
const metadataPaths: string[] = [];
for (const directory of packageDirectories) {
if (normalizedIgnoreDirs.includes(basename(directory))) {
continue;
}
const filePath: string | undefined = await searchRecursively(directory, metaDirectory);
if (filePath !== undefined) {
metadataPaths.push(resolve(filePath));
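
A minimal sketch of the filtering behaviour introduced above, using hypothetical directory names. It mirrors the `basename` normalization in `getPackageDirectories`, so `force-app`, `./force-app`, and `force-app/` are all treated as the same package directory:

```typescript
import { basename } from 'node:path';

// Hypothetical values for illustration only.
const ignoreDirs = ['force-app', './unpackaged/'];
const packageDirectories = ['/repo/force-app', '/repo/unpackaged', '/repo/my-other-package'];

// Compare by the final path segment, as the new code does.
const normalizedIgnoreDirs = ignoreDirs.map((dir) => basename(dir));
const kept = packageDirectories.filter((directory) => !normalizedIgnoreDirs.includes(basename(directory)));

console.log(kept); // ['/repo/my-other-package']
```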
