

# Custom
<a name="custom"></a>

## What is AWS Transform custom?
<a name="custom-what-is"></a>

AWS Transform custom uses agentic AI to perform large-scale modernization of software, code, libraries, and frameworks to reduce technical debt. It handles diverse scenarios including language version upgrades, API and service migrations, framework upgrades and migrations, code refactoring, and organization-specific transformations.

Through continual learning, the agent improves from every execution and developer feedback, delivering high-quality, repeatable transformations without requiring specialized automation expertise.

### Key Capabilities
<a name="custom-key-capabilities"></a>

AWS Transform custom provides the following capabilities:
+ **Natural language-driven transformation definition** - Create custom transformations using natural language, documentation, and code samples
+ **Transformation execution** - Apply transformations consistently and reliably across multiple codebases
+ **Continual learning** - Automatically improve transformation quality from every execution
+ **AWS-managed transformations** - Use ready-to-use, AWS-vetted transformations for common scenarios

### Transformation Patterns
<a name="custom-transformation-patterns"></a>

AWS Transform custom supports diverse transformation patterns to address your modernization needs. Each pattern has different complexity characteristics based on the scope and nature of the changes required.


| Pattern | Description | Complexity | Examples | 
| --- | --- | --- | --- | 
| API and Service Migrations | Migrating between API versions or equivalent services while maintaining functionality | Medium | AWS SDK v1→v2 (Java, Python, JavaScript), Boto2→Boto3, JUnit 4→5, javax→jakarta | 
| Language Version Upgrades | Upgrading to newer versions of the same programming language, adopting new features and replacing deprecated functionality | Low-Medium | Java 8→17, Python 3.9→3.13, Node.js 12→22, TypeScript version upgrades | 
| Framework Upgrades | Upgrading to newer versions of the same framework, addressing breaking changes | Medium | Spring Boot 2.x→3.x, React 17→18, Angular upgrades, Django upgrades | 
| Framework Migrations | Migrating to entirely different frameworks that serve similar purposes | High | Angular→React, Redux→Zustand, Vue.js→React | 
| Library and Dependency Upgrades | Upgrading third-party libraries to newer versions while maintaining the same language and framework | Low-Medium | Pandas 1.x→2.x, NumPy upgrades, Hadoop/HBase/Hive library upgrades, Lodash upgrades | 
| Code Refactoring and Pattern Modernization | Modernizing code patterns and adopting best practices without changing external functionality | Low-Medium | Print→Logging frameworks, string concatenation→f-strings, type hints adoption, observability instrumentation | 
| Script and File-by-File Translations | Translating independent scripts or configuration files where files are mostly self-contained | Low-Medium | AWS CDK→Terraform, Terraform→CloudFormation, Excel→Python notebooks, Bash→PowerShell | 
| Architecture Migrations | Migrating between hardware architectures or runtime environments with minimal code changes | Medium-High | x86→AWS Graviton (ARM), on-premises→Lambda, traditional server→containers | 
| Language-to-Language Migrations | Translating codebases from one programming language to another | Very High | Java→Python, JavaScript→TypeScript, C→Rust, Python→Go | 
| Custom and Organization-Specific Transformations | Unique organizational requirements and specialized modernization needs | Varies | Custom internal library migrations, organization-specific coding standards, proprietary framework migrations | 

**Note**  
For COBOL/mainframe languages, use AWS Transform for Mainframe. For .NET Framework upgrades to .NET Core, consider AWS Transform for Windows. For VMware migrations to AWS, consider AWS Transform for VMware.

### How AWS Transform custom Works
<a name="custom-how-it-works"></a>

AWS Transform custom is usually used in large-scale projects where multiple codebases or modules are transformed. Teams typically follow a four-phase workflow:

**Define Transformation** - Provide natural language prompts, documentation, and code samples to the agent, which generates an initial transformation definition. This definition can be iteratively refined through chat or direct edits. This phase can be skipped when using AWS-managed transformations.

**Pilot or Proof-of-Concept** - Test the transformation on sample codebases and refine based on results. This validation phase helps estimate the cost and effort of the full transformation. Continual learning improves quality during this phase.

**Scaled Execution** - Set up automated bulk execution using the CLI, with developers reviewing and validating results. Monitor progress using the web application and track transformations across multiple repositories.

**Monitor and Review** - Continual learning automatically improves the transformation quality. Review and approve knowledge items that have been extracted from executions to ensure they meet quality standards.

## Understanding Key Concepts
<a name="custom-understanding-key-concepts"></a>

This section explains key concepts for working with AWS Transform custom.

### Transformation Definitions
<a name="custom-transformation-definitions"></a>

A **transformation definition** is a package that contains the instructions and knowledge needed to perform a specific code transformation. It consists of:
+ `transformation_definition.md` (required) - Contains the core transformation logic and instructions
+ `summaries.md` (optional) - Summaries for user-provided reference documentation
+ `document_references/` folder (optional) - User-provided documentation and reference materials

The AWS Transform CLI automatically downloads transformation definitions to the current directory when they are needed for execution, inspection, or modification.

**Important**  
When publishing a transformation, the directory must only contain these files and folders. No other files or subdirectories are allowed.
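
The package layout above can be sketched as a short shell session. The transformation name `my-transformation` and the file contents are illustrative only:

```
# Create a minimal transformation definition package (names and contents are illustrative)
mkdir -p my-transformation/document_references
cat > my-transformation/transformation_definition.md <<'EOF'
# My transformation
Replace deprecated logging calls with the structured logging API.
EOF
# Optional: summaries for the reference documents
printf 'migration-guide.md: summary of breaking changes\n' > my-transformation/summaries.md
ls -A my-transformation
```

Keep the directory limited to exactly these files and folders before publishing, as noted above.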

### Transformation Registry
<a name="custom-transformation-registry"></a>

The **transformation registry** is your AWS account's centralized repository for storing and managing transformation definitions. Transformations in the registry can be:
+ Listed using `atx custom def list`
+ Executed across multiple codebases
+ Shared with other users in your AWS account
+ Version-controlled

**Important**  
Transformation definitions are account-specific. If you want to use a transformation in a different AWS account, you must publish it separately in that account.

### Draft vs Published Transformations
<a name="custom-draft-vs-published"></a>

AWS Transform custom supports two states for transformations in the registry:

**Draft transformations** are in-progress or untested transformation definitions. They are saved as specific versions and can be retrieved, updated, and executed by referencing that version. Drafts are useful for iterative development, testing, and refinement before the transformation is ready to be shared with your team. Drafts are also associated with a specific conversation. If you restart the CLI, you can use `atx --conversation-id {id}` to restore a previous conversation.

**Published transformations** are available in your account's transformation registry for execution by other users with the required IAM permissions. Published transformations can be discovered using `atx custom def list`.

The typical workflow is:

1. Create transformation locally

1. Save as draft for testing (`atx custom def save-draft`)

1. Refine and validate

1. Publish to share with your team (`atx custom def publish`)

You can also publish a transformation directly without saving it as a draft.

### References vs. Knowledge Items
<a name="custom-references-vs-knowledge-items"></a>

AWS Transform custom uses two types of knowledge to improve transformation quality:

**References** are user-provided documentation that you explicitly add to a transformation definition. References support text files only (maximum 10MB total for all files) and usually contain documentation, API specifications, migration guides, and code samples. You add references when creating or updating a transformation definition in interactive mode.

**Knowledge items** are automatically extracted learnings from transformation executions. These are created asynchronously by the continual learning system based on execution trajectories, developer feedback, and code fixes encountered during transformations. Knowledge items start in a "not approved" state and must be explicitly approved by transformation owners before they can be used in future executions. Unlike references which you provide upfront, knowledge items accumulate over time as the transformation is executed across different codebases.

### Build and Validation Commands
<a name="custom-build-validation-commands-concept"></a>

The **build or validation command** is an optional parameter that specifies how to validate your code during the transformation process. This command is executed at various points during the transformation to ensure code integrity.

Providing a command that validates the results and reports issues when validation fails is very important for improving transformation quality through continual learning. If no build or validation is needed, omit this parameter from your input.

For examples and detailed guidance, see Build and Validation Commands in the Workflows section.

### Continual Learning
<a name="custom-continual-learning-concept"></a>

**Continual learning** is the system that automatically captures feedback from every transformation execution and improves transformation quality over time. The system gathers information through:
+ Explicit feedback - Comments and code fixes provided in interactive mode
+ Implicit observations - Issues the agent encounters while transforming and debugging code

This information is processed to create **knowledge items** that are added to the transformation definition to improve future transformations. The learning process occurs automatically after transformations are completed, requiring no additional user input.

**Important**  
Knowledge items are transformation-specific and are not shared across different transformations or different customer accounts.

# Getting Started
<a name="custom-get-started"></a>

This section describes how to set up AWS Transform custom and run your first transformation.

## Prerequisites
<a name="custom-prerequisites"></a>

Before installing AWS Transform custom, ensure you have the following:

### Supported Platforms
<a name="custom-supported-platforms"></a>

AWS Transform custom supports the following operating systems:
+ **Linux** - Full support for all Linux distributions
+ **macOS** - Full support for macOS systems
+ **Windows Subsystem for Linux (WSL)** - Supported when running under WSL

**Important**  
Windows native execution is not supported. AWS Transform custom will detect native Windows environments and exit with an error message. For Windows users, install and use Windows Subsystem for Linux (WSL).

### Required Software
<a name="custom-required-software"></a>
+ **Node.js 20 or later** - Download from https://nodejs.org/en/download
+ **Git** - Git must be installed, and the working directory must be a valid Git repository to execute a transformation. Run `git init; git add .; git commit -m "Initial commit"` to initialize a Git repository in your working directory.

### Network Requirements
<a name="custom-network-requirements"></a>

An internet connection with access to the following endpoints is required:
+ `transform-cli.awsstatic.com`
+ `transform-custom.<region>.api.aws`
+ `*.s3.amazonaws.com`

If you are working in an internet-restricted environment, update firewall rules to allowlist these URLs.

## Installing the AWS Transform CLI
<a name="custom-installation"></a>

The recommended installation method is using the installation script.

**To install the AWS Transform CLI**

1. Run the installation script:

   ```
   curl -fsSL https://transform-cli.awsstatic.com/install.sh | bash
   ```

1. Verify the installation:

   ```
   atx --version
   ```

   The command should display the installed version of the AWS Transform CLI.

## Configuring Authentication
<a name="custom-authentication"></a>

AWS Transform custom requires AWS credentials to authenticate with the service. Configure authentication using one of the following methods.

### Environment Variables
<a name="custom-environment-variables"></a>

Set the following environment variables:

```
export AWS_ACCESS_KEY_ID=your_access_key
export AWS_SECRET_ACCESS_KEY=your_secret_key
export AWS_SESSION_TOKEN=your_session_token
```

You can also specify a profile using the `AWS_PROFILE` environment variable:

```
export AWS_PROFILE=your_profile_name
```

### AWS Credentials File
<a name="custom-credentials-file"></a>

Configure credentials in `~/.aws/credentials`:

```
[default]
aws_access_key_id = your_access_key
aws_secret_access_key = your_secret_key
```

### IAM Permissions
<a name="custom-iam-permissions"></a>

Your AWS credentials must have permissions to call the AWS Transform custom service. We recommend attaching the [`AWSTransformCustomFullAccess`](security-iam-awsmanpol.md#security-iam-awsmanpol-AWSTransformCustomFullAccess) AWS managed policy to your IAM user or role. This policy provides full access to AWS Transform custom, including the permission to create the [service-linked role](using-service-linked-roles.md#using-service-linked-roles-custom) required for CloudWatch metrics emission to your account.

For more granular control, AWS Transform custom also provides the following AWS managed policies:
+ [`AWSTransformCustomExecuteTransformations`](security-iam-awsmanpol.md#security-iam-awsmanpol-AWSTransformCustomExecuteTransformations) – Provides access to execute transformations.
+ [`AWSTransformCustomManageTransformations`](security-iam-awsmanpol.md#security-iam-awsmanpol-AWSTransformCustomManageTransformations) – Provides access to create, update, read, and delete transformation resources, as well as execute transformations.

For more information about these policies, see [AWS managed policies for AWS Transform](security-iam-awsmanpol.md). For custom IAM policies with resource-level permissions, refer to the [AWS Transform Custom IAM Service Authorization Reference Guide](https://docs.aws.amazon.com/service-authorization/latest/reference/list_awstransformcustom.html).

**Note**  
Transformation definitions are AWS resources with ARNs (Amazon Resource Names). You can use IAM policies to control access to specific transformations or groups of transformations by specifying the transformation ARN in your IAM policy resource statements. Tags can be used for grouped access control.
AWS IAM Identity Center is required to access the AWS Transform web application, but is not required to use the CLI.
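
As an illustration of the ARN-based access control described in the note above, a custom policy scoping access to a single transformation might look like the following sketch. The action namespace and the ARN are placeholders, not confirmed values; consult the AWS Transform Custom IAM Service Authorization Reference for the exact action names and ARN format:

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "transform-custom:*",
      "Resource": "<transformation-arn>"
    }
  ]
}
```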

## Configuring AWS Region
<a name="custom-region-configuration"></a>

AWS Transform custom is available in specific AWS regions and automatically detects your region configuration using standard AWS CLI precedence.

### Supported Regions
<a name="custom-supported-regions"></a>

AWS Transform custom is available in the following AWS regions:
+ `us-east-1` (US East - N. Virginia)
+ `eu-central-1` (Europe - Frankfurt)

### How Region is Determined
<a name="custom-region-priority"></a>

The CLI checks these sources in priority order to determine which region to use:

1. `AWS_REGION` environment variable (highest priority)

1. `AWS_DEFAULT_REGION` environment variable

1. Selected profile in `~/.aws/config` (via `AWS_PROFILE` environment variable)

1. Default profile in `~/.aws/config`

1. Default fallback to `us-east-1` if no configuration is found
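
The precedence above can be sketched as a shell function. This is an illustration of the resolution order only, not the CLI's actual implementation:

```
# Illustrative sketch of the region resolution order (not the CLI's actual code)
resolve_region() {
  if [ -n "$AWS_REGION" ]; then
    echo "$AWS_REGION"                         # 1. highest priority
  elif [ -n "$AWS_DEFAULT_REGION" ]; then
    echo "$AWS_DEFAULT_REGION"                 # 2.
  elif r=$(aws configure get region 2>/dev/null) && [ -n "$r" ]; then
    echo "$r"                                  # 3./4. selected or default profile in ~/.aws/config
  else
    echo "us-east-1"                           # 5. default fallback
  fi
}

unset AWS_REGION AWS_DEFAULT_REGION
AWS_REGION=eu-central-1 resolve_region         # prints eu-central-1
```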

**Note**  
If `ATX_CUSTOM_ENDPOINT` is set, the region is extracted from the endpoint URL and overrides all other settings.

### Setting Your Region
<a name="custom-region-examples"></a>

Choose one of these methods to configure your region:

**Option 1: Environment variable (recommended for temporary use):**

```
export AWS_REGION=<your-region>
```

**Option 2: AWS CLI configuration (recommended for permanent setup):**

```
aws configure set region <your-region>
```

**Option 3: Profile-specific configuration:**

```
aws configure set region <your-region> --profile your_profile_name
export AWS_PROFILE=your_profile_name
```

**Option 4: Direct file editing:**

Edit `~/.aws/config` directly:

```
[default]
region = <your-region>

[profile your_profile_name]
region = <your-region>
```

### Verifying Your Region Configuration
<a name="custom-debug-region-resolution"></a>

To check which region AWS Transform custom will use:

**Check current region setting:**

```
aws configure get region
```

**Check environment variables:**

```
echo "AWS_REGION: $AWS_REGION"
echo "AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION"
echo "AWS_PROFILE: $AWS_PROFILE"
```

**View detailed region resolution in debug logs:**

```
tail -f ~/.aws/atx/logs/debug.log
```

Example log output showing region source:

```
2026-01-07 22:34:28 [DEBUG]: Initializing FrontendServiceClient with config:
{
  "endpoint": "https://transform-custom.us-east-1.api.aws",
  "region": "us-east-1",
  "regionSource": "AWS_REGION"
}
```

The `regionSource` field shows where the region was derived from (e.g., "aws-config (profile: default)", "AWS_REGION", "default").

**Important**  
If your region configuration points to an unsupported region, the CLI will display a clear error message with instructions to update your configuration to a supported region.

## Running Your First Transformation
<a name="custom-first-transformation"></a>

The fastest way to get started is to use an AWS-managed transformation. AWS-managed transformations are pre-built, AWS-vetted transformations that are ready to use without any setup.

**To run an AWS-managed transformation interactively:**

```
atx
```

Then ask the agent to execute a transformation:

```
Execute the AWS/java-aws-sdk-v1-to-v2 transformation on the codebase at ./my-java-project
```

**To run a transformation non-interactively:**

```
atx custom def exec -n AWS/java-aws-sdk-v1-to-v2 -p ./my-java-project -c "mvn clean install" -x -t
```

For detailed information about execution modes, configuration options, and command flags, see [Command Reference](custom-command-reference.md).

## Understanding Transformation Results
<a name="custom-understanding-results"></a>

When a transformation completes, AWS Transform custom provides a report describing the results. If the transformation could not be validated (for example, by a successful build for a Java application), the report provides recommendations for further actions.

Most transformations are performed in multiple steps, with Git used to commit intermediate results. You can review the changes made by the transformation using standard Git commands:

```
git status
git log
git diff <original commit-id>
```

## Creating a Custom Transformation
<a name="custom-creating-custom"></a>

When AWS-managed transformations don't meet your needs, you can create custom transformations tailored to your specific requirements.

**To create a custom transformation**

1. Start the AWS Transform CLI in interactive mode:

   ```
   atx
   ```

1. Tell the agent you want to create a new transformation and describe your requirements.

1. Provide reference materials such as documentation and code samples to improve quality.

1. Test and refine the transformation iteratively.

1. Publish it to your organization's transformation registry.

For detailed instructions on creating custom transformations, see [Workflows](custom-workflows.md).

## AWS Transform Web Application (Optional)
<a name="custom-web-application"></a>

The AWS Transform web application is an optional interface for monitoring large-scale transformation campaigns across multiple repositories. Before using the AWS Transform web application, you need a user identity that is enabled to access AWS Transform in your organization. For information about enabling AWS Transform, see [Setting up AWS Transform](https://docs.aws.amazon.com/transform/latest/userguide/transform-setup.html).

**To use the AWS Transform web application:**

1. Visit https://aws.amazon.com/transform/ and sign in using AWS IAM Identity Center credentials.

1. Select the "Custom / code" job type in the AWS Transform web application.

1. Tell the agent the name of the transformation definition you would like to use.

1. The web application provides an AWS Transform CLI command that can be shared with your team. This command invokes the transformation definition and logs execution results to the web application.

1. After transformation results are captured, view statistics, time savings measurements, and ask questions about the progress of your job in the Dashboard.

The web application is designed for enterprise-scale operations where you need centralized monitoring of transformations across multiple codebases.

**Note**  
AWS IAM Identity Center is required to access the AWS Transform web application. The CLI does not require IAM Identity Center - only standard AWS credentials are needed.

## Introduction to custom transformation commands
<a name="custom-cmd-intro"></a>

Here are a few of the commands you can use with custom transformations. The complete list of commands is in the [AWS Transform custom transformations command reference](https://docs.aws.amazon.com/transform/latest/userguide/custom-command-reference.html).
+ `atx custom`
  + **Starts the custom transformation interactive experience, allowing creation, discovery, execution, and refinement of transformations.**
  + This is the default command for `atx`.
  + `--trust-all-tools` (`-t`) is optional and implicitly allows all tool requests by the agent. Use with caution, especially in production environments. You can configure tool trust for specific tools and commands using the [trust settings file](https://docs.aws.amazon.com/transform/latest/userguide/custom-workflows.html#custom-advanced-configuration).
+ `atx custom --help` or `atx custom -h`
  + **Shows help menu.**
  + Each command also includes a help menu, for example, `atx custom def exec --help`.
+ `atx --version` or `atx -v`
  + **Shows version.**
  + The version number changes with each release.
+ `atx custom def list`
  + **Prints list of transformations available in the transformation registry.**
+ `atx custom def exec`
  + **Executes a transformation**
+ `atx mcp`
  + **Used to manage MCP server configurations**

# Workflows
<a name="custom-workflows"></a>

## Executing Transformations
<a name="custom-executing-transformations"></a>

This section describes the different ways to execute transformations and options for controlling execution behavior.

### Execution Modes
<a name="custom-execution-modes"></a>

AWS Transform custom supports three execution modes to accommodate different workflows.

**Interactive Conversational Mode**

Start the CLI with `atx` and ask the agent to execute a transformation through natural language. This mode allows you to have a full conversation with the agent, interrupt execution at any point, and provide feedback during the transformation process.

Use this mode when you want maximum control and the ability to guide the agent through complex scenarios.

**Direct Interactive Execution**

Use `atx custom def exec -n <transformation-name> -p <path>` to start a specific transformation interactively. This mode allows you to review and interact with the agent at the beginning, during, or at the end of execution. The agent will pause at key decision points and ask for your input.

This is ideal for testing and refining transformations before running them autonomously.

**Non-Interactive/Headless Mode**

Use `atx custom def exec -n <transformation-name> -p <path> -x -t` for full automation. The `-x` flag enables non-interactive mode, and the `-t` flag automatically trusts all tools without prompting.

This mode is designed for CI/CD pipeline integration and bulk execution where no human intervention is available or desired.

### Common Command Flags
<a name="custom-common-command-flags"></a>

When executing transformations with `atx custom def exec`, the following flags are commonly used:
+ `-n` or `--transformation-name` - Specifies the name of the transformation to execute
+ `-p` or `--code-repository-path` - Specifies the path to your codebase (use "." for current directory)
+ `-c` or `--build-command` - Specifies the build or validation command to run
+ `-x` or `--non-interactive` - Enables non-interactive mode (no user prompts)
+ `-t` or `--trust-all-tools` - Automatically trusts all tools without prompting
+ `-d` or `--do-not-learn` - Prevents knowledge item extraction from this execution
+ `--tv` or `--transformation-version` - Specifies a specific version of the transformation
+ `-g` or `--configuration` - Provides a configuration file or inline configuration

**Important**  
The `-t` or `--trust-all-tools` flag automatically approves all tool executions without prompting and bypasses all security guardrails. It is required for a fully autonomous experience, but not to execute a transformation. Use it with caution in production environments.

### Using Configuration Files
<a name="custom-using-configuration-files"></a>

AWS Transform custom supports optional configuration files in YAML or JSON format. Configuration files allow you to specify execution parameters and provide additional context to the agent.

**To use a configuration file:**

```
atx custom def exec --configuration file://config.yaml
```

You can also provide configuration as inline key-value pairs:

```
atx custom def exec --configuration "key=value,key2=value2"
```

**Example configuration file (config.yaml):**

```
codeRepositoryPath: ./my-project
transformationName: my-transformation
buildCommand: mvn clean install
additionalPlanContext: |
  The target Java version to upgrade to is Java 17.
  Ensure compatibility with our internal logging framework version 2.3.
validationCommands: |
  mvn test
  mvn verify
```

The `additionalPlanContext` parameter provides extra context for the agent's execution plan. This is especially useful with AWS-managed transformations to customize their behavior for your specific needs.

### Build and Validation Commands
<a name="custom-build-validation-commands"></a>

The build or validation command is an optional parameter that specifies how to validate your code during the transformation process. AWS Transform custom attempts to infer the best build command from the transformation if one is not specified, but specifying the command explicitly is recommended for better quality.

**Examples of build and validation commands:**
+ Java: `mvn clean install` or `gradle build`
+ Python: `pytest` or `python -m py_compile`
+ Node.js: `npm run build` or `npm test`
+ Linters: `eslint .` or `pylint .`

Even for languages or transformations that don't require building, providing a command that validates the results and reports issues when validation fails is very important for improving transformation quality.

If no build or validation is needed, omit this parameter from your input.
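
When validation involves several steps, one pattern is to wrap them in a single command that exits nonzero on the first failure, so the transformation receives one clear pass/fail signal. A minimal sketch, where the step commands are placeholders for your real build and test commands:

```
# Sketch: run validation steps in order; exit nonzero at the first failure
run_validation() {
  for step in "$@"; do
    echo "running: $step"
    sh -c "$step" || { echo "validation failed at: $step" >&2; return 1; }
  done
  echo "all validation steps passed"
}

# Example with placeholder steps; substitute commands like "mvn test" or "pytest"
run_validation "true" "echo lint ok"
```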

### Controlling Learning Behavior
<a name="custom-controlling-learning-behavior"></a>

By default, AWS Transform custom extracts knowledge items from every transformation execution. You can prevent learning for specific executions.

**To prevent learning from an execution:**

```
atx custom def exec -n my-transformation -p ./my-project -d
```

The `-d` or `--do-not-learn` flag opts out of knowledge item extraction for the current execution.

### Resuming Conversations
<a name="custom-resuming-conversations"></a>

AWS Transform custom allows you to resume previous conversations within 30 days of creation.

**To resume the most recent conversation:**

```
atx --resume
```

**To resume a specific conversation:**

```
atx --conversation-id <conversation-id>
```

**Important**  
Conversations can only be resumed within 30 days of creation. After 30 days, the conversation can no longer be resumed.

### Tracking Agent Minutes
<a name="custom-tracking-agent-minutes"></a>

AWS Transform custom tracks the [agent minutes](https://aws.amazon.com/transform/pricing/) consumed during a transformation session. Agent minutes accumulate throughout the conversation lifecycle and are displayed when the conversation ends:

```
Agent minutes used: 12.50
```

Agent minutes persist across interruptions. If you interrupt a session with Ctrl+C and resume it later, the previously accumulated minutes carry over and continue accumulating in the resumed session.

**To check Agent Minutes during an interactive session:**

Type `/usage` at the input prompt to display the current accumulated Agent Minutes without ending the conversation.

**To set an Agent Minutes budget limit:**

```
atx custom def exec -n my-transformation -p ./my-project --limit 30
```

The `--limit` option sets a maximum [Agent Minutes](https://aws.amazon.com/transform/pricing/) budget for the session. Agent Minutes reflect active agent work time, not wall clock time. When the limit is reached, the CLI displays a message and exits with instructions to resume:

```
⚠️ Budget limit reached: 30.00 / 30.00 Agent Minutes. Exiting.
```

You can resume the conversation later with an increased limit:

```
atx --conversation-id <conversation_id> -t --limit <increased_limit>
```

## Continual Learning
<a name="custom-continual-learning"></a>

This section describes how to manage knowledge items created by continual learning.

### Understanding Knowledge Items
<a name="custom-understanding-knowledge-items"></a>

Knowledge items are automatically extracted learnings from transformation executions. These are created asynchronously by the continual learning system based on:
+ Developer feedback provided in interactive mode
+ Code issues encountered during transformations

Knowledge items start in a "not approved" state and must be explicitly approved by transformation owners before they can be used in future executions. Unlike references which you provide upfront, knowledge items accumulate over time as the transformation is executed across different codebases.

### Listing Knowledge Items
<a name="custom-listing-knowledge-items"></a>

View all knowledge items for a transformation definition.

**To list knowledge items:**

```
atx custom def list-ki -n my-transformation
```

This displays all knowledge items that have been extracted from executions of the specified transformation definition.

### Viewing Knowledge Item Details
<a name="custom-viewing-knowledge-item-details"></a>

View detailed information about a specific knowledge item.

**To view knowledge item details:**

```
atx custom def get-ki -n my-transformation --id <knowledge-item-id>
```

### Enabling and Disabling Knowledge Items
<a name="custom-enabling-disabling-knowledge-items"></a>

Control which knowledge items are applied to future transformations.

**To enable a knowledge item:**

```
atx custom def update-ki-status -n my-transformation --id <knowledge-item-id> --status ENABLED
```

**To disable a knowledge item:**

```
atx custom def update-ki-status -n my-transformation --id <knowledge-item-id> --status DISABLED
```

### Configuring Auto-Approval
<a name="custom-configuring-auto-approval"></a>

You can configure whether knowledge items are automatically enabled or require manual approval.

**To enable auto-approval for all knowledge items:**

```
atx custom def update-ki-config -n my-transformation --auto-enabled TRUE
```

**To disable auto-approval:**

```
atx custom def update-ki-config -n my-transformation --auto-enabled FALSE
```

### Deleting Knowledge Items
<a name="custom-deleting-knowledge-items"></a>

Permanently remove knowledge items that are not useful.

**To delete a knowledge item:**

```
atx custom def delete-ki -n my-transformation --id <knowledge-item-id>
```

### Exporting Knowledge Items
<a name="custom-exporting-knowledge-items"></a>

Export all knowledge items for a transformation to markdown format for review or documentation.

**To export knowledge items:**

```
atx custom def export-ki-markdown -n my-transformation
```

## Advanced Configuration
<a name="custom-advanced-configuration"></a>

This section describes advanced features and configuration options for AWS Transform custom.

### Environment Variables
<a name="custom-environment-variables-config"></a>

You can customize CLI behavior using environment variables.

**ATX\_SHELL\_TIMEOUT**

Override the default timeout for shell commands (900 seconds/15 minutes).

```
export ATX_SHELL_TIMEOUT=1800  # 30 minutes
```

This is useful for large codebases or long-running build processes.

**ATX\_DISABLE\_UPDATE\_CHECK**

Disable automatic version checks and update notifications during command execution.

```
export ATX_DISABLE_UPDATE_CHECK=true
```

### Trust Settings
<a name="custom-trust-settings"></a>

Trust settings allow you to pre-approve specific tools and commands to execute without prompts. These settings are configured in the `~/.aws/atx/trust-settings.yaml` file.

The file contains two lists:
+ `trustedTools` - Tools that can execute without prompting
+ `trustedShellCommands` - Shell commands that can execute without prompting

**Default trusted tools:**
+ `file_read`
+ `get_transformation_from_registry`
+ `list_available_transformations_from_registry`
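For illustration, a trust-settings.yaml containing the default tools plus two custom entries might look like the following sketch. The exact key syntax is an assumption based on the list names above, and the shell command entries are placeholders:

```
trustedTools:
  - file_read
  - get_transformation_from_registry
  - list_available_transformations_from_registry
trustedShellCommands:
  - "git status"
  - "mvn clean install*"
```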

**Editing trust settings:**

You can manually edit the trust-settings.yaml file to add or remove trusted tools and commands. The `trustedShellCommands` list supports wildcard patterns using `*` for flexible command matching.

Examples:
+ `cd *` - Matches commands that start with `cd`, including compound commands
+ `*&&*` - Trusts any compound command containing the `&&` operator
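As an illustration of these wildcard semantics, the following Python sketch uses glob-style matching. It approximates, but is not guaranteed to be, the CLI's exact matching behavior:

```python
from fnmatch import fnmatch

# Example wildcard patterns like those shown above
patterns = ["cd *", "*&&*"]

def is_trusted(command: str) -> bool:
    """Return True if the command matches any trusted pattern."""
    return any(fnmatch(command, p) for p in patterns)

print(is_trusted("cd /workspace && ls"))  # True: matches both patterns
print(is_trusted("rm -rf /tmp/build"))    # False: matches neither
```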

**Session-level trust:**

During interactive prompts, you can choose:
+ `(y)es` - Execute once
+ `(n)o` - Deny
+ `(t)rust` - Trust for the current session only

Session-level trust resets when the CLI restarts, letting you approve a tool for the current session without permanently modifying trust-settings.yaml.

### Model Context Protocol (MCP) Servers
<a name="custom-mcp-servers"></a>

The AWS Transform CLI supports Model Context Protocol (MCP) servers, which extend its functionality with additional tools.

**Configuration:**

Configure MCP servers in the `~/.aws/atx/mcp.json` file.

**Managing MCP servers:**

View the list of configured MCP servers:

```
atx mcp tools
```

List available tools offered by a specific MCP server:

```
atx mcp tools --server <server-name>
```

**Usage tracking:**

The CLI automatically tracks MCP tool usage during transformation executions. Usage statistics are persisted as `mcp_usage.json` in the conversation directory alongside `metadata.json`. The file records per-tool metrics for each execution, including:
+ Number of invocations per tool
+ Number of errors per tool
+ Total execution time per tool
+ Last error details (if any)
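For example, you could summarize these metrics with a short script. The JSON below is a hypothetical illustration of the schema; inspect your own `mcp_usage.json` for the actual field names:

```python
import json

# Hypothetical mcp_usage.json content -- field names are assumptions
# based on the per-tool metrics listed above.
sample = """
{
  "my-server": {
    "search_docs": {"invocations": 12, "errors": 1,
                    "total_time_ms": 3400, "last_error": "timeout"},
    "fetch_page": {"invocations": 5, "errors": 0,
                   "total_time_ms": 900, "last_error": null}
  }
}
"""

def summarize(raw: str) -> list:
    """Return one summary line per tool, most-invoked first."""
    data = json.loads(raw)
    lines = []
    for server, tools in data.items():
        for tool, m in sorted(tools.items(),
                              key=lambda kv: kv[1]["invocations"],
                              reverse=True):
            lines.append(f"{server}/{tool}: {m['invocations']} calls, "
                         f"{m['errors']} errors, {m['total_time_ms']} ms")
    return lines

print("\n".join(summarize(sample)))
```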

### Tags and Organization
<a name="custom-tags-organization"></a>

You can organize transformations with tags for access control and categorization.

**Note**  
Some of these commands require specifying the Amazon Resource Name (ARN) for a Transformation Definition. The ARN structure is: `arn:aws:transform-custom:<region>:<account-id>:package/<td-name>`

**To list tags for a transformation:**

```
atx custom def list-tags --arn <transformation-arn>
```

**To add tags to a transformation:**

```
atx custom def tag --arn <transformation-arn> --tags '{"env":"prod","team":"backend"}'
```

**To remove tags from a transformation:**

```
atx custom def untag --arn <transformation-arn> --tag-keys "env,team"
```

Tags can be used for grouped access control in IAM policies. You can create policies that grant permissions to all transformations with specific tags (e.g., all transformations tagged with `team:frontend` or `environment:production`).
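For example, a policy scoped to a tag might look like the following sketch, assuming the service honors the standard `aws:ResourceTag` condition key:

```
{
   "Version": "2012-10-17",
   "Statement": [
      {
         "Effect": "Allow",
         "Action": "transform-custom:*",
         "Resource": "*",
         "Condition": {
            "StringEquals": {
               "aws:ResourceTag/team": "frontend"
            }
         }
      }
   ]
}
```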

### Logs
<a name="custom-logs-config"></a>

AWS Transform CLI maintains two types of logs for troubleshooting and debugging.

**Conversation logs:**

Location: `~/.aws/atx/custom/<conversation_id>/logs/<timestamp>-conversation.log`

These logs contain the full conversation history for a specific session.

**Developer debug logs:**

Locations:
+ `~/.aws/atx/logs/debug*.log`
+ `~/.aws/atx/logs/error.log `

These logs provide advanced troubleshooting information for the CLI itself.

**Note**  
There may be multiple debug log files in the logs directory (for example, debug1.log and debug2.log). When opening support tickets, include all relevant logs, such as those under `~/.aws/atx/custom/<conversation-id>/` and `~/.aws/atx/logs/`, for faster resolution.

### CLI Updates
<a name="custom-cli-updates"></a>

Keep your CLI up to date to access new features and improvements.

**To check for updates:**

```
atx update --check
```

**To update to the latest version:**

```
atx update
```

**To update to a specific version:**

```
atx update --target-version <version>
```

## Create Custom Transformations
<a name="custom-create-custom-transformations"></a>

This section describes how to create, modify, and manage custom transformation definitions.

### Creating a New Transformation
<a name="custom-creating-new-transformation"></a>

Use the interactive CLI to create a new transformation definition.

**To create a transformation definition**

1. Start the AWS Transform CLI:

   ```
   atx
   ```

1. Tell the agent you want to create a new transformation.

1. Provide a clear, detailed description of the transformation objective. Include:
   + The source and target state (e.g., "upgrade from version X to version Y")
   + Specific changes required (e.g., "update import statements, replace deprecated methods")
   + Any special considerations or constraints

1. When the agent requests clarification or additional information, provide specific examples and reference materials.

1. Review the initial transformation definition created by the agent.

1. Test the transformation on a sample codebase.

1. Iterate by providing feedback, code fixes, or additional examples.

1. Save the transformation locally or publish it to the registry.

**Best practices for creating transformations:**
+ Start with simple, well-defined transformations before attempting complex ones
+ Provide comprehensive reference materials including migration guides and code samples
+ Test on multiple sample codebases before publishing
+ Use deterministic build or validation commands to enable continual learning
+ Consider breaking complex transformations into multiple smaller steps
+ Mark crucial information with "CRITICAL:" or "IMPORTANT:" in your transformation definitions to ensure the agent prioritizes these requirements
+ When you need exact requirements followed (like using a specific command or string value), explicitly specify the complete string in your transformation definitions. You can wrap these in bash quotes to clearly indicate they're terminal commands or literal strings, which reduces variability and ensures consistent execution
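For example, a transformation definition might include hypothetical instructions such as:

```
CRITICAL: Replace all javax.* imports with their jakarta.* equivalents.
IMPORTANT: Do not change the project's Java version.
To validate, run exactly: `mvn -q clean verify`
```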

### Providing Reference Materials
<a name="custom-providing-reference-materials"></a>

You can provide reference files to AWS Transform custom by specifying file paths during the conversation.

Recommended types of reference files:
+ Before/after example code
+ Documentation for APIs, libraries, or features involved
+ Human-readable migration guides

**To provide a reference file:**

```
Take a look at the documentation here: /path/to/migration-guide.md
```

You can also provide a directory containing multiple reference files:

```
Take a look at the docs we have here: /path/to/docs/
```

**Note**  
Only text-based files (.md, .html, .txt, code files) are supported. Binary files, images, and rich text files (e.g., .pdf, .png, .docx) are not currently supported; it is often possible to extract their text content and use that as a reference. If you have many small text files, consider concatenating them into a few descriptively named files. There is a limit of 10MB total for all files.

### Modifying an Existing Transformation
<a name="custom-modifying-existing-transformation"></a>

You can modify custom transformations both before and after saving them as drafts or publishing them. You cannot modify AWS-managed transformations. If you need to customize them, you can provide additional context using the config file.

**To modify an existing transformation**

1. Start the AWS Transform CLI:

   ```
   atx
   ```

1. Tell the agent you want to modify an existing transformation.

1. Choose whether to:
   + Provide a file path to a locally stored transformation (that is, one not saved as a draft or published)
   + Request the list of transformations from the registry

1. If choosing from the registry, select the transformation you want to modify.

1. Work with the agent to describe the changes you want to make.

1. Test the updated transformation on a sample codebase.

1. Publish your updates to the registry if desired.

### Publishing and Managing Transformations
<a name="custom-publishing-managing-transformations"></a>

You can publish and manage your transformations using the interactive experience or with the following commands.

**To save a transformation as a draft:**

```
atx custom def save-draft -n my-transformation --description "Description of the transformation" --sd ./transformation-directory
```

**To publish a transformation:**

```
atx custom def publish -n my-transformation --description "Description of the transformation" --sd ./transformation-directory
```

**To list available transformations:**

```
atx custom def list
```

**To download a transformation definition:**

```
atx custom def get -n my-transformation
```

This downloads the transformation definition to your current working directory. You can specify a target directory with the `--td` flag and a version with the `--tv` flag.

**To delete a transformation definition:**

```
atx custom def delete -n my-transformation
```

**Important**  
This permanently deletes the specified transformation definition from your account.

### Managing Transformation Versions
<a name="custom-managing-transformation-versions"></a>

AWS Transform custom maintains versions of your transformation definitions. You can specify a version when executing or downloading a transformation.

**To execute a specific version:**

```
atx custom def exec -n my-transformation --tv v1 -p ./my-project
```

**To download a specific version:**

```
atx custom def get -n my-transformation --tv v1
```

If no version is specified, the latest version is used.

# Command Reference
<a name="custom-command-reference"></a>

This section provides a comprehensive reference of all AWS Transform custom CLI commands.

## Interactive Commands
<a name="custom-interactive-commands"></a>

**atx**

Start an interactive conversation with the AWS Transform CLI.

```
atx                           # Start new conversation
atx --resume                  # Resume most recent conversation
atx --conversation-id <id>    # Resume specific conversation
atx -t                        # Start with all tools trusted
```

**Interactive session commands**

The following commands can be typed at the input prompt during an interactive session:
+ `/usage` - Display the accumulated [agent minutes](https://aws.amazon.com/transform/pricing/) for the current session

## Transformation Definition Commands
<a name="custom-transformation-definition-commands"></a>

**atx custom def exec**

Execute a transformation definition on a code repository.


| Option | Long Form | Parameter | Description | 
| --- | --- | --- | --- | 
| -p | --code-repository-path | <path> | Path to the code repository to transform. For current directory, use "." | 
| -c | --build-command | <command> | Command to run when building repository | 
| -n | --transformation-name | <name> | Name of the transformation definition in the registry | 
| -x | --non-interactive | - | Runs the transformation with no user assistance | 
| -t | --trust-all-tools | - | Trusts all tools (no tool prompts) | 
| -d | --do-not-learn | - | Opt out of allowing knowledge item extraction from this execution | 
| -g | --configuration | <config> | Path to config file (JSON or YAML) or key=value pairs | 
| --tv | --transformation-version | <version> | Version of the transformation definition to use | 
| --limit | --limit | <limit> | Set [Agent Minutes](https://aws.amazon.com/transform/pricing/) budget limit. Transformation exits when limit is reached and can be resumed with an increased limit | 
| -h | --help | - | Display help for command | 

**atx custom def list**

List available transformation definitions.

```
atx custom def list
atx custom def list --json    # Output in JSON format
```

**atx custom def get**

Get details of a specific transformation definition.


| Option | Long Form | Parameter | Description | 
| --- | --- | --- | --- | 
| -n | --transformation-name | <name> | The name of the transformation definition to retrieve | 
| --tv | --transformation-version | <version> | The version of the transformation definition to retrieve | 
| --td | --target-directory | <directory> | The target directory at which to save the transformation definition (default: ".") | 
| --json | --json | - | Output response in JSON format | 
| -h | --help | - | Display help for command | 

**atx custom def delete**

Delete a transformation definition permanently.

```
atx custom def delete -n <transformation-name>
```

**atx custom def publish**

Publish a transformation definition to the registry.


| Option | Long Form | Parameter | Description | 
| --- | --- | --- | --- | 
| -n | --transformation-name | <name> | The name of the transformation definition to publish | 
| --tv | --transformation-version | <version> | The version of the transformation definition to use (not applicable with --description or --source-directory) | 
| --sd | --source-directory | <directory> | The source directory from which to read the local transformation definition files (not applicable with --transformation-version) | 
| --description | --description | <description> | A description for the transformation definition (not applicable with --transformation-version) | 
| --json | --json | - | Output response in JSON format | 
| -h | --help | - | Display help for command | 

**atx custom def save-draft**

Save a transformation definition as a draft in the registry. Drafts expire after 30 days and do not show up in the registry. This command returns the version number of the draft. You must execute and retrieve drafts explicitly using the transformation name and version number.

```
atx custom def save-draft -n <transformation-name> --description "Description" --sd <directory>
```

## Knowledge Item Commands
<a name="custom-knowledge-item-commands"></a>

**atx custom def list-ki**

List knowledge items for a transformation definition.

```
atx custom def list-ki -n <transformation-name>
atx custom def list-ki -n <transformation-name> --json
```

**atx custom def get-ki**

Retrieve a knowledge item from a transformation definition.

```
atx custom def get-ki -n <transformation-name> --id <id>
```

**atx custom def delete-ki**

Delete a knowledge item from a transformation definition.

```
atx custom def delete-ki -n <transformation-name> --id <id>
```

**atx custom def update-ki-status**

Update knowledge item status (ENABLED or DISABLED).

```
atx custom def update-ki-status -n <transformation-name> --id <id> --status ENABLED
```

**atx custom def update-ki-config**

Update knowledge item configuration for auto-approval.

```
atx custom def update-ki-config -n <transformation-name> --auto-enabled TRUE
```

**atx custom def export-ki-markdown**

Export all knowledge items for a transformation definition to markdown.

```
atx custom def export-ki-markdown -n <transformation-name>
```

## Tag Commands
<a name="custom-tag-commands"></a>

**atx custom def list-tags**

List tags for a transformation definition.

```
atx custom def list-tags --arn <transformation-arn>
```

**atx custom def tag**

Add tags to a transformation definition.

```
atx custom def tag --arn <arn> --tags '{"env":"prod","team":"backend"}'
```

**atx custom def untag**

Remove tags from a transformation definition.

```
atx custom def untag --arn <arn> --tag-keys "env,team"
```

## Update Commands
<a name="custom-update-commands"></a>

**atx update**

Update the AWS Transform CLI.

```
atx update                    # Update to latest version
atx update --check            # Check for updates only
atx update --target-version <version>  # Update to specific version
```

## MCP Commands
<a name="custom-mcp-commands"></a>

**atx mcp tools**

View and confirm your MCP tool configurations. Updates to the configurations must be made directly in the `~/.aws/atx/mcp.json` file.

```
atx mcp tools                   # View list of MCP servers
atx mcp tools -s <server-name>  # List tools for the specified server
```

# AWS-Managed Transformations
<a name="transform-aws-customs"></a>

AWS-managed transformations are pre-built, AWS-vetted transformations for common use cases that are ready to use without any additional setup.

## Overview
<a name="transform-aws-customs-overview"></a>

AWS-managed transformations have the following characteristics:
+ **Validated by AWS** - These transformations are vetted by AWS to be high quality
+ **Ready to use** - No additional setup required
+ **Continuously growing** - Additional transformations are continually being added
+ **Customizable** - Pre-built transformations can be customized by providing additional guidance or requirements specific to your organization's needs using the `additionalPlanContext` configuration parameter
+ **Early access support** - Some transformations may be marked as early access as they undergo further testing and refinement

## Available AWS-Managed Transformations
<a name="transform-aws-customs-available"></a>

The following AWS-managed transformations are currently available:

**Language Version Upgrades:**
+ `AWS/java-version-upgrade` - Upgrade Java applications using any build system from any source JDK version to any target JDK version with comprehensive dependency modernization including Jakarta EE migration, database drivers, ORM frameworks, and Spring ecosystem updates. You can specify your desired target JDK version either through interactive chat with the agent, or by passing an additionalPlanContext configuration parameter.
+ `AWS/python-version-upgrade` - Migrate Python projects from Python 3.8/3.9 to Python 3.11/3.12/3.13, ensuring compatibility with the latest Python features, security updates, and runtime while maintaining functionality and performance. You can specify your desired target Python version either through interactive chat with the agent, or by passing an additionalPlanContext configuration parameter.
+ `AWS/nodejs-version-upgrade` - Upgrade NodeJS applications from any source NodeJS version to any target NodeJS version. You can specify your desired target NodeJS version either through interactive chat with the agent, or by passing an additionalPlanContext configuration parameter.

**AWS SDK Migrations:**
+ `AWS/java-aws-sdk-v1-to-v2` - Upgrade the AWS SDK from V1 to V2 for Java projects using Maven or Gradle.
+ `AWS/python-boto2-to-boto3` - Migrate Python applications from boto2 to boto3, based on the official AWS migration documentation.
+ `AWS/nodejs-aws-sdk-v2-to-v3` - Upgrade Node.js applications from AWS SDK for JavaScript v2 to v3 to leverage modular architecture, first-class TypeScript support, middleware stack, and improved performance while ensuring all AWS service interactions continue to function correctly, without modifying the underlying Node.js version.

**Analysis:**
+ `AWS/comprehensive-codebase-analysis` - This transformation performs deep static analysis of codebases to generate hierarchical, cross-referenced documentation covering all aspects of the system. It combines behavioral analysis, architectural documentation, and business intelligence extraction to create a comprehensive knowledge base organized for maximum usability and navigation. The transformation places special emphasis on technical debt analysis, providing prominent, actionable insights on outdated components and maintenance concerns at the root level.
+ `AWS/java-performance-optimization` - Optimize Java application performance by analyzing JFR profiling data to detect CPU/memory hotspots and anti-patterns, then applying targeted code fixes to reduce resource usage and improve efficiency. For instructions on collecting JFR data, see [https://docs.oracle.com/javacomponents/jmc-5-4/jfr-runtime-guide/run.htm](https://docs.oracle.com/javacomponents/jmc-5-4/jfr-runtime-guide/run.htm).

**Early Access Transformations**

**Note**  
Early access transformations are functional but might be frequently updated based on customer feedback.
+ `AWS/early-access-java-x86-to-graviton` - [Early Access] Validates Java application compatibility with Arm64 architecture for running on AWS Graviton Processors. Identifies and resolves Arm64 incompatibilities by updating dependencies, detecting architecture-specific code patterns, and recompiling native libraries when source code is available. Makes targeted code modifications necessary for Arm64 support, such as architecture detection and native library loading, but does not perform general code refactoring. Maintains the current Java version and JDK distribution and validates compatibility through build and test execution. For optimal results, run in an Arm64-based environment.
**Note**  
Many modern Java applications are already Arm64-compatible.
+ `AWS/early-access-angular-to-react-migration` - [Early Access] Transform an Angular application to React.
+ `AWS/vue.js-version-upgrade` - [Early Access] Performs major version upgrades from Vue.js 2 to Vue.js 3, modernizing components, state management, routing, and global APIs to Vue.js 3 patterns. Minor and patch updates are out of scope.
+ `AWS/angular-version-upgrade` - [Early Access] Upgrades an older Angular application to a target Angular version by modernizing components, services, templates, and routing to current Angular patterns.
+ `AWS/early-access-log4j-to-slf4j-migration` - [Early Access] This transformation migrates Java applications from Log4j (1.x/2.x) to SLF4J with Logback backend. Handles source code, dependency management (Maven/Gradle), and logging configuration files. Validates via compile, test, and residual import scan.

## Customizing AWS-Managed Transformations
<a name="transform-aws-customs-customizing"></a>

You can customize AWS-managed transformations to meet your organization's specific needs by providing additional context through the `additionalPlanContext` configuration parameter.

**Example: Customizing Java version upgrade**

```
codeRepositoryPath: ./my-project
transformationName: AWS/java-version-upgrade
buildCommand: mvn clean install
additionalPlanContext: |
  The target Java version to upgrade to is Java 17.
  Update all internal library dependencies to versions compatible with Java 17.
  Ensure compatibility with our custom authentication framework.
```

**Example: Customizing AWS SDK migration**

```
codeRepositoryPath: ./my-project
transformationName: AWS/java-aws-sdk-v1-to-v2
buildCommand: gradle build
additionalPlanContext: |
  Maintain our existing error handling patterns.
  Use our organization's standard credential provider chain.
  Update logging to use our internal logging framework.
```

# Common Use Cases
<a name="custom-common-use-cases"></a>

This section provides step-by-step guidance for common transformation scenarios using the AWS Transform CLI.

## Lambda Runtime Upgrades
<a name="custom-use-case-lambda-runtime-upgrades"></a>

Lambda periodically deprecates older language runtimes, requiring teams to upgrade their Lambda functions to supported versions. AWS Transform custom can automate these upgrades using the AWS-managed language version upgrade transformations. For example, to upgrade a Lambda function written in Python 3.9 to Python 3.13, you would use the `AWS/python-version-upgrade` transformation. Similar transformations are available for Java (`AWS/java-version-upgrade`) and Node.js (`AWS/nodejs-version-upgrade`), covering the most common Lambda runtime languages.

To upgrade a single Lambda function interactively, navigate to the root of your function's source repository and start the AWS Transform CLI with `atx`. In the interactive session, describe the upgrade you need—for example, "Upgrade this Python 3.9 Lambda function to Python 3.13." The agent will select the appropriate AWS-managed transformation, analyze your code, update language syntax, adjust dependencies, and modernize deprecated API calls. You can provide a build or validation command such as `pytest` or `python -m py_compile *.py` so the agent can verify the transformed code compiles and passes tests. Throughout the interactive session, you can review the agent's proposed changes, provide feedback, and request adjustments before accepting the final result.

When you need to upgrade Lambda runtimes across many repositories, use non-interactive mode to run transformations at scale. For each repository, invoke the CLI with the appropriate flags to run without user input. For example, to upgrade a Node.js 16 Lambda function to Node.js 22:

```
atx custom def exec \
  -n AWS/nodejs-version-upgrade \
  -p ./my-lambda-repo \
  -c "npm test" \
  -g "additionalPlanContext='Upgrade from Node.js 16 to Node.js 22'" \
  -x -t
```

The `-x` flag runs the transformation non-interactively, and `-t` trusts all tool executions so no prompts are required. You can script this command across dozens or hundreds of repositories using a simple shell loop. After each transformation completes, review the Git diff to verify the changes and run your test suite to confirm the upgraded function works correctly. You can also integrate AWS Transform custom into your CI/CD pipeline to identify deprecated Lambda functions on an ongoing basis. This approach not only addresses the immediate problem of deprecated runtimes, but also establishes a continuous process to detect and upgrade functions before they reach end-of-support status.
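As a sketch of such scripting, the following Python driver assembles the same `atx custom def exec` invocation for a list of repositories. The repository list and build command are placeholders, and it only launches `atx` if the CLI is on your PATH:

```python
import shutil
import subprocess

REPOS = ["./repo-a", "./repo-b"]  # placeholder repository checkouts
CONTEXT = "Upgrade from Node.js 16 to Node.js 22"

def build_cmd(repo: str) -> list:
    """Assemble one non-interactive atx invocation for a repository,
    using the exec flags documented above (-n, -p, -c, -g, -x, -t)."""
    return [
        "atx", "custom", "def", "exec",
        "-n", "AWS/nodejs-version-upgrade",
        "-p", repo,
        "-c", "npm test",
        "-g", f"additionalPlanContext='{CONTEXT}'",
        "-x", "-t",
    ]

# Only launch if the CLI is actually installed
if shutil.which("atx"):
    for repo in REPOS:
        # check=False so one failed repository does not stop the rest
        subprocess.run(build_cmd(repo), check=False)
```

After the loop completes, review each repository's Git diff and test results as described above.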

Central platform teams can create campaigns through the [AWS Transform web application](https://docs.aws.amazon.com/transform/latest/userguide/custom-get-started.html#custom-web-application) to define the transformation, specify target Lambda repositories, and track progress across the organization. For a detailed walkthrough of building a production-grade scaled deployment solution using AWS Batch, see [Building a scalable code modernization solution with AWS Transform custom](https://aws.amazon.com/blogs/devops/building-a-scalable-code-modernization-solution-with-aws-transform-custom/).

# AWS Transform custom and interface endpoints (AWS PrivateLink)
<a name="vpc-interface-endpoints-transform-custom"></a>

You can establish a private connection between your VPC and AWS Transform custom by creating an *interface VPC endpoint*. Interface endpoints are powered by [AWS PrivateLink](https://aws.amazon.com/privatelink), a technology that enables you to privately access AWS Transform custom services without an internet gateway, NAT device, VPN connection, or Direct Connect connection. Traffic between your VPC and AWS Transform custom does not leave the Amazon network. 

Each interface endpoint is represented by one or more [Elastic Network Interfaces](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-eni.html) in your subnets. 

For more information, see [Interface VPC endpoints (AWS PrivateLink)](https://docs.aws.amazon.com/vpc/latest/userguide/vpce-interface.html) in the *Amazon VPC User Guide*. 

**Note**  
AWS PrivateLink integration with AWS Transform custom is available in US East (N. Virginia) (us-east-1) and Europe (Frankfurt) (eu-central-1) regions.
You must enable AWS PrivateLink integration for Amazon S3 since AWS Transform custom makes S3 API calls. For detailed instructions, see the [AWS PrivateLink for Amazon S3](https://docs.aws.amazon.com/AmazonS3/latest/userguide/privatelink-interface-endpoints.html) documentation. If you encounter S3 access issues while using AWS Transform custom, refer to our [troubleshooting guide](custom-troubleshooting.md#custom-s3-access-issues).
If you are not using the AWS PrivateLink private DNS feature (see [Private DNS](https://docs.aws.amazon.com/vpc/latest/privatelink/privatelink-access-aws-services.html#interface-endpoint-private-dns)), you must:  
+ Configure routing to VPC interface endpoints (see the [Routing to VPC interface endpoints](https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/routing-to-vpc-interface-endpoint.html) documentation)
+ Set the `ATX_CUSTOM_ENDPOINT` environment variable to specify your custom domain, for example:  

    ```
    ATX_CUSTOM_ENDPOINT=https://transform-custom.<region>.api.aws atx
    ```

## Considerations for AWS Transform custom VPC endpoints
<a name="vpc-interface-endpoints-transform-custom-considerations"></a>

Before you set up an interface VPC endpoint for AWS Transform custom, ensure that you review [Interface endpoint properties and limitations](https://docs.aws.amazon.com/vpc/latest/userguide/vpce-interface.html#vpce-interface-limitations) in the *Amazon VPC User Guide*. 

AWS Transform custom supports making calls to all of its API actions through the interface endpoint.

## Prerequisites
<a name="vpc-interface-endpoints-transform-custom-prereq"></a>

Before you begin any of the procedures below, ensure that you have the following:
+ An AWS account with appropriate permissions to create and configure resources.
+ A VPC already created in your AWS account.
+ Familiarity with AWS services, especially Amazon VPC and AWS Transform custom.

## Creating an interface VPC endpoint for AWS Transform custom
<a name="vpc-interface-endpoints-transform-custom-create"></a>

You can create a VPC endpoint for the AWS Transform custom service using either the Amazon VPC console or the AWS Command Line Interface (AWS CLI). For more information, see [Creating an interface endpoint](https://docs.aws.amazon.com/vpc/latest/userguide/vpce-interface.html#create-interface-endpoint) in the *Amazon VPC User Guide*.

Create a VPC endpoint for AWS Transform custom using the following service name: 
+ com.amazonaws.*region*.transform-custom

Replace *region* with the AWS Region where you want to use the AWS Transform custom CLI, for example, *com.amazonaws.us-east-1.transform-custom*.

For more information, see [Accessing a service through an interface endpoint](https://docs.aws.amazon.com/vpc/latest/userguide/vpce-interface.html#access-service-though-endpoint) in the *Amazon VPC User Guide*.

## Creating a VPC endpoint policy for AWS Transform custom
<a name="vpc-interface-endpoints-transform-custom-policy"></a>

You can attach an endpoint policy to your VPC endpoint that controls access to AWS Transform custom. The policy specifies the following information:
+ The principal that can perform actions.
+ The actions that can be performed.
+ The resources on which actions can be performed.

For more information, see [Controlling access to services with VPC endpoints](https://docs.aws.amazon.com/vpc/latest/userguide/vpc-endpoints-access.html) in the *Amazon VPC User Guide*. 

**Example: VPC endpoint policy for AWS Transform custom actions**  
The following is an example of an endpoint policy for AWS Transform custom. When attached to an endpoint, this policy grants access to the listed AWS Transform custom actions for all principals on all resources.

```
{
   "Statement":[
      {
         "Principal":"*",
         "Effect":"Allow",
         "Action":[
            "transform-custom:*"
         ],
         "Resource":"*"
      }
   ]
}
```
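As a sketch, you can attach (or update) an endpoint policy from the AWS CLI with `aws ec2 modify-vpc-endpoint`. The endpoint ID below is a placeholder, and `policy.json` is assumed to be a local file containing your policy document.

```shell
# vpce-0abc123example is a placeholder endpoint ID;
# policy.json is a local file containing the endpoint policy document.
aws ec2 modify-vpc-endpoint \
    --vpc-endpoint-id vpce-0abc123example \
    --policy-document file://policy.json
```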

## Using an on-premises computer to connect to an AWS Transform custom endpoint
<a name="vpc-interface-endpoints-transform-custom-on-prem"></a>

This section describes how to use an on-premises computer to connect to AWS Transform custom through an AWS PrivateLink endpoint in your VPC.

1. [Create a VPN connection between your on-premises device and your VPC.](https://docs.aws.amazon.com/vpn/latest/clientvpn-user/client-vpn-user-what-is.html)

1. [Create an interface VPC endpoint for AWS Transform custom.](#vpc-interface-endpoints-transform-custom-create)

1. [Set up an inbound Amazon Route 53 endpoint.](https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/routing-to-vpc-interface-endpoint.html) This enables you to use the DNS name of your AWS Transform custom endpoint from your on-premises device.
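Once these steps are complete, you can verify from the on-premises device that the endpoint's DNS name resolves through the inbound Route 53 endpoint. The Region below is an example; substitute your own.

```shell
# Replace us-east-1 with your Region. A successful setup returns private IP
# addresses from your VPC's CIDR range rather than public addresses.
nslookup transform-custom.us-east-1.api.aws
```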

# Troubleshooting
<a name="custom-troubleshooting"></a>

This section provides guidance for troubleshooting common issues with AWS Transform custom.

## Log Locations
<a name="custom-log-locations"></a>

AWS Transform CLI maintains logs in the following locations:

**Conversation logs:**

```
~/.aws/atx/custom/<conversation_id>/logs/<timestamp>-conversation.log
```

These logs contain the full conversation history for debugging specific transformation executions.

**Developer debug logs:**

```
~/.aws/atx/logs/debug*.log
~/.aws/atx/logs/error.log
```

These logs provide detailed information about CLI operations and errors. Each log file rolls over after reaching a 5 MB limit, so this directory may contain multiple debug logs, such as debug1.log and debug2.log.
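A quick way to inspect these logs is to list the newest debug files and tail the error log; the paths are those documented above, and the commands are guarded so they exit cleanly even when no logs exist yet.

```shell
# List the newest debug logs and show recent errors, if any.
LOG_DIR="$HOME/.aws/atx/logs"
ls -lt "$LOG_DIR"/debug*.log 2>/dev/null | head -n 3   # newest debug logs first
tail -n 50 "$LOG_DIR/error.log" 2>/dev/null || true    # last 50 error lines
```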

## Common Issues
<a name="custom-common-issues"></a>

**Installation issues:**

If installation fails, ensure you have Node.js 20 or later installed:

```
node --version
```

Download Node.js from https://nodejs.org/en/download if needed.

**Authentication issues:**

Verify your AWS credentials are configured correctly:

```
aws sts get-caller-identity
```

Ensure your IAM user or role has the required `transform-custom:*` permissions.

**Network connectivity issues:**

If you encounter connection errors, verify network access to required endpoints:
+ `transform-cli.awsstatic.com`
+ `transform-custom.<region>.api.aws`
+ `*.s3.amazonaws.com`

If you are working in an internet-restricted environment, update your firewall rules to allowlist these domains.
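As a rough reachability check (assuming `curl` is available), you can probe each endpoint; any HTTP status code back means the host was reachable, while a timeout or connection error suggests a firewall or proxy block. The Region below is an example.

```shell
# Replace us-east-1 with your Region. Any HTTP status code means the host
# was reachable; a timeout or connection error suggests a network block.
for host in transform-cli.awsstatic.com transform-custom.us-east-1.api.aws; do
  curl -sS --connect-timeout 5 -o /dev/null -w "%{http_code} ${host}\n" "https://${host}" \
    || echo "unreachable: ${host}"
done
```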

**Region configuration issues:**

If you encounter region-related errors:
+ Verify your region is supported
+ Check for environment variables that may override your configuration: `echo $AWS_REGION $AWS_DEFAULT_REGION`
+ Check your region configuration: `aws configure get region`
+ Update your region if needed: `aws configure set region <your-region>`
+ Check debug logs for region resolution details

**Git issues:**

Ensure Git is installed and your repository is under git source control:

```
git --version
git status
```

AWS Transform custom requires repositories to be under git source control.

**Transformation execution issues:**

If a transformation fails:

1. Review the conversation logs at `~/.aws/atx/custom/<conversation_id>/logs/`

1. Check for build or test failures in the transformation output

1. Verify the build command is correct for your project

1. Try running the transformation in interactive mode to provide feedback

**Conversation resumption issues:**

If you cannot resume a conversation:
+ Verify the conversation is less than 30 days old
+ Check the conversation ID is correct
+ Ensure you have network connectivity

## Getting Support
<a name="custom-getting-support"></a>

For additional assistance, visit AWS Support through the [AWS Console](https://support.console.aws.amazon.com/support/home#/).

When opening a support ticket, include:
+ Conversation logs from `~/.aws/atx/custom/<conversation_id>/logs/`
+ Debug logs from `~/.aws/atx/logs/`
+ Steps to reproduce the issue
+ AWS Transform CLI version (`atx --version`)

## S3 access issues
<a name="custom-s3-access-issues"></a>

The AWS Transform custom service vends Amazon S3 pre-signed URLs to upload and download potentially large transformation files to and from client machines. As a result, the CLI interacts not only with the AWS Transform service endpoints but also with the Amazon S3 service endpoints. Because these interactions use pre-signed URLs, your client's IAM credentials **do not** require any S3 permissions; however, your local machine's proxy server configuration or your network's S3 VPC endpoint policies can restrict this traffic.

Examine the [Log Locations](#custom-log-locations) for more detailed error messages to help root-cause tool failures or S3 access-denied errors related to downloading and uploading transformation files.

If you are using a VPN or proxy server that honors the `https_proxy` and `no_proxy` environment variables, consider adding the following values to your `no_proxy` environment variable to bypass the proxy for the S3 and AWS Transform custom service endpoints, for example:

`export no_proxy=.s3.amazonaws.com,.transform-custom.<region>.api.aws`

If you are using an [AWS PrivateLink for Amazon S3](https://docs.aws.amazon.com/AmazonS3/latest/userguide/privatelink-interface-endpoints.html) interface endpoint in your VPC, it may have a policy defined that restricts S3 traffic. Ensure your S3 VPC endpoint policy allows `GetObject` and `PutObject` operations for the AWS Transform custom service-owned buckets. You can accomplish this by adding the following statement to your VPC endpoint policy:

```
{
    "Effect": "Allow",
    "Principal": "*",
    "Action": ["s3:PutObject","s3:GetObject"],
    "Resource": "arn:aws:s3:::aws-transform-custom-*/*"
}
```

Explicit `Deny` statements in policies take precedence over `Allow` statements. Review S3 VPC Endpoint policies for `Deny` statements that may be restricting access.