

# AWS AppSync JavaScript runtime features for resolvers and functions
<a name="resolver-util-reference-js"></a>

The `APPSYNC_JS` runtime environment provides functionality similar to [ECMAScript (ES) version 6.0](https://262.ecma-international.org/6.0/). It supports a subset of its features and provides some additional methods (utilities) that are not part of the ES specifications. The following topics list all the supported language features:
+  [ Supported runtime features ](https://docs.aws.amazon.com/appsync/latest/devguide/supported-features.html) - Learn more about supported core features, primitive objects, built-in objects and functions, etc.
+  [ Built-in utilities ](https://docs.aws.amazon.com/appsync/latest/devguide/built-in-util-js.html) - The util variable contains general utility methods to help you work with data. Unless otherwise specified, all utilities use the UTF-8 character set.
+  [ Built-in modules ](https://docs.aws.amazon.com/appsync/latest/devguide/built-in-modules-js.html) - Learn more about how built-in modules can help write JavaScript resolvers and functions.
+  [ Runtime utilities ](https://docs.aws.amazon.com/appsync/latest/devguide/runtime-utils-js.html) - The runtime library provides utilities to control or modify the runtime properties of your resolvers and functions.
+  [ Time helpers in util.time ](https://docs.aws.amazon.com/appsync/latest/devguide/time-helpers-in-util-time-js.html) - The util.time variable contains datetime methods to help generate timestamps, convert between datetime formats, and parse datetime strings. The syntax for datetime formats is based on [DateTimeFormatter](https://docs.oracle.com/javase/8/docs/api/java/time/format/DateTimeFormatter.html), which you can reference for further documentation.
+  [ DynamoDB helpers in util.dynamodb ](https://docs.aws.amazon.com/appsync/latest/devguide/dynamodb-helpers-in-util-dynamodb-js.html) - util.dynamodb contains helper methods that make it easier to write and read data to Amazon DynamoDB, such as automatic type mapping and formatting.
+  [ HTTP helpers in util.http ](https://docs.aws.amazon.com/appsync/latest/devguide/http-helpers-in-utils-http-js.html) - The util.http utility provides helper methods that you can use to manage HTTP request parameters and to add response headers.
+  [ Transformation helpers in util.transform ](https://docs.aws.amazon.com/appsync/latest/devguide/transformation-helpers-in-utils-transform-js.html) - util.transform contains helper methods that make it easier to perform complex operations against data sources.
+  [ String helpers in util.str ](https://docs.aws.amazon.com/appsync/latest/devguide/str-helpers-in-util-str-js.html) - util.str contains methods to help with common String operations.
+  [ Extensions ](https://docs.aws.amazon.com/appsync/latest/devguide/extensions-js.html) - extensions contains a set of methods to make additional actions within your resolvers.
+  [ XML helpers in util.xml ](https://docs.aws.amazon.com/appsync/latest/devguide/xml-helpers-in-util-xml-js.html) - util.xml contains methods to help with XML string conversion.

**Note**  
Currently, this reference only applies to runtime version **1.0.0**.

# Supported runtime features
<a name="supported-features"></a>

The sections below describe the supported feature set of the `APPSYNC_JS` runtime.

## Core features
<a name="core-features"></a>

The following core features are supported.

------
#### [ Types ]

The following types are supported:
+ numbers
+ strings
+ booleans
+ objects
+ arrays
+ functions

------
#### [ Operators ]

Operators are supported, including:
+ standard math operators (`+`, `-`, `/`, `%`, `*`, etc.)
+ nullish coalescing operator (`??`)
+ Optional chaining (`?.`)
+ bitwise operators
+ `void` and `typeof` operators
+ spread operators (`...`)

The following operators are not supported:
+ unary operators (`++`, `--`, and `~`)
+ `in` operator
**Note**  
Use the `Object.hasOwn` method to check whether a specified property exists in a given object.
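
Since the `in` operator is unavailable, membership checks can be written with `Object.hasOwn` instead. The sketch below is plain JavaScript and behaves the same outside the runtime:

```javascript
const item = { id: '1', title: 'Hello' };

// Instead of: 'title' in item
const hasTitle = Object.hasOwn(item, 'title'); // true
const hasOwner = Object.hasOwn(item, 'owner'); // false
```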

------
#### [ Statements ]

The following statements are supported:
+ `const`
+ `let`
+ `var`
+ `break`
+ `else`
+ `for-in`
+ `for-of` 
+ `if`
+ `return`
+ `switch`
+ spread syntax

The following are not supported:
+ `catch`
+ `continue`
+ `do-while`
+ `finally`
+ `for(initialization; condition; afterthought)`
**Note**  
The exceptions are `for-in` and `for-of` statements, which are supported.
+ `throw`
+ `try`
+ `while`
+ labeled statements

------
#### [ Literals ]

The following ES 6 [template literal](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Template_literals) features are supported:
+ Multi-line strings
+ Expression interpolation
+ Nesting templates
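
For example, all three template literal features can be combined in ordinary JavaScript:

```javascript
const name = 'AppSync';
const version = '1.0.0';

// Multi-line string with expression interpolation
const message = `Runtime: ${name}
Version: ${version}`;

// Nesting templates
const label = `runtime ${`${name}-${version}`}`; // 'runtime AppSync-1.0.0'
```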

------
#### [ Functions ]

The following function syntax is supported:
+ Function declarations
+ ES 6 arrow functions
+ ES 6 rest parameter syntax
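
A short sketch of each supported form:

```javascript
// Function declaration
function add(a, b) {
  return a + b;
}

// ES 6 arrow function
const double = (n) => n * 2;

// ES 6 rest parameters collect trailing arguments into an array
const sum = (...nums) => nums.reduce((acc, n) => acc + n, 0);
```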

------
#### [ Strict mode ]

Functions operate in strict mode by default, so you don't need to add a `'use strict'` statement to your function code. This cannot be changed.

------

## Primitive objects
<a name="primitive-objects"></a>

The following ES primitive objects and their methods are supported.

------
#### [ Object ]

The following Object methods and operators are supported:
+ `Object.assign()`
+ `Object.entries()` 
+ `Object.hasOwn()`
+ `Object.keys()` 
+ `Object.values()`
+ `delete` 

------
#### [ String ]

The following String properties and methods are supported:
+  `String.prototype.length` 
+  `String.prototype.charAt()` 
+  `String.prototype.concat()` 
+  `String.prototype.endsWith()` 
+  `String.prototype.indexOf()` 
+  `String.prototype.lastIndexOf()` 
+  `String.raw()` 
+  `String.prototype.replace()`
**Note**  
Regular expressions are not supported.   
However, Java-styled regular expression constructs are supported in the provided parameter. For more information see [Pattern](https://docs.oracle.com/javase/8/docs/api/java/util/regex/Pattern.html).
+ `String.prototype.replaceAll()`
**Note**  
Regular expressions are not supported.  
However, Java-styled regular expression constructs are supported in the provided parameter. For more information see [Pattern](https://docs.oracle.com/javase/8/docs/api/java/util/regex/Pattern.html).
+  `String.prototype.slice()` 
+  `String.prototype.split()` 
+  `String.prototype.startsWith()` 
+  `String.prototype.toLowerCase()` 
+  `String.prototype.toUpperCase()` 
+  `String.prototype.trim()` 
+  `String.prototype.trimEnd()` 
+  `String.prototype.trimStart()` 
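
These methods behave as in standard JavaScript; for example:

```javascript
const raw = '  Hello, AppSync  ';

const trimmed = raw.trim();                 // 'Hello, AppSync'
const upper = trimmed.toUpperCase();        // 'HELLO, APPSYNC'
const parts = trimmed.split(', ');          // ['Hello', 'AppSync']
const starts = trimmed.startsWith('Hello'); // true
```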

------
#### [ Number ]

The following Number methods are supported:
+  `Number.isFinite` 
+  `Number.isNaN` 

------

## Built-in objects and functions
<a name="built-in-objects-functions"></a>

The following functions and objects are supported.

------
#### [ Math ]

The following math functions are supported:
+  `Math.random()` 
+  `Math.min()` 
+  `Math.max()` 
+  `Math.round()` 
+  `Math.floor()` 
+  `Math.ceil()` 

------
#### [ Array ]

The following Array properties and methods are supported:
+ `Array.prototype.length` 
+ `Array.prototype.concat()` 
+ `Array.prototype.fill()` 
+ `Array.prototype.flat()` 
+ `Array.prototype.indexOf()` 
+ `Array.prototype.join()` 
+ `Array.prototype.lastIndexOf()` 
+ `Array.prototype.pop()` 
+ `Array.prototype.push()` 
+ `Array.prototype.reverse()` 
+ `Array.prototype.shift()` 
+ `Array.prototype.slice()` 
+ `Array.prototype.sort()`
**Note**  
`Array.prototype.sort()` doesn't support arguments.
+ `Array.prototype.splice()` 
+ `Array.prototype.unshift()`
+ `Array.prototype.forEach()`
+ `Array.prototype.map()`
+ `Array.prototype.flatMap()`
+ `Array.prototype.filter()`
+ `Array.prototype.reduce()`
+ `Array.prototype.reduceRight()`
+ `Array.prototype.find()`
+ `Array.prototype.some()`
+ `Array.prototype.every()`
+ `Array.prototype.findIndex()`
+ `Array.prototype.findLast()`
+ `Array.prototype.findLastIndex()`
+ `delete` 
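
For example (note that `sort()` takes no comparator argument in this runtime, so the default string ordering applies):

```javascript
const ids = [3, 1, 2];

// Default sort (no comparator argument)
const sorted = [...ids].sort();                   // [1, 2, 3]

const doubled = ids.map((n) => n * 2);            // [6, 2, 4]
const evens = ids.filter((n) => n % 2 === 0);     // [2]
const total = ids.reduce((acc, n) => acc + n, 0); // 6
```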

------
#### [ Console ]

The console object is available for debugging. During live query execution, console log/error statements are sent to Amazon CloudWatch Logs (if logging is enabled). During code evaluation with `evaluateCode`, log statements are returned in the command response.
+ `console.error()`
+ `console.log()`

------
#### [ Function ]
+ The `apply`, `bind`, and `call` methods are not supported.
+ Function constructors are not supported.
+ Passing a function as an argument is not supported.
+ Recursive function calls are not supported.

------
#### [ JSON ]

The following JSON methods are supported:
+ `JSON.parse()`
**Note**  
Returns a blank string if the parsed string is not valid JSON.
+ `JSON.stringify()`

------
#### [ Promises ]

Asynchronous processes and promises are not supported.

**Note**  
Network and file system access is not supported within the `APPSYNC_JS` runtime in AWS AppSync. AWS AppSync handles all I/O operations based on the requests made by the AWS AppSync resolver or AWS AppSync function.

------

## Globals
<a name="globals"></a>

The following global constants are supported:
+  `NaN` 
+  `Infinity` 
+  `undefined`
+ [`util`](https://docs.aws.amazon.com/appsync/latest/devguide/built-in-util-js.html)
+ [`extensions`](https://docs.aws.amazon.com/appsync/latest/devguide/extensions-js.html)
+ `runtime`

## Error types
<a name="error-types"></a>

Throwing errors with `throw` is not supported. You can return an error by using the `util.error()` function, and you can include an error in your GraphQL response by using the `util.appendError()` function.

For more information, see [Error utils](https://docs.aws.amazon.com/appsync/latest/devguide/built-in-util-js.html#utility-helpers-in-error-js).
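
As a minimal sketch, a response handler can surface a data source error with `util.error` (which halts evaluation) rather than `throw`:

```
import { util } from '@aws-appsync/utils';

export function response(ctx) {
	if (ctx.error) {
		// Halts evaluation and returns the error in the GraphQL response
		util.error(ctx.error.message, ctx.error.type);
	}
	return ctx.result;
}
```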

# Built-in utilities
<a name="built-in-util-js"></a>

The `util` variable contains general utility methods to help you work with data. Unless otherwise specified, all utilities use the UTF-8 character set.

## Encoding utils
<a name="utility-helpers-in-encoding"></a>

### Encoding utils list
<a name="utility-helpers-in-encoding-list-js"></a>

 **`util.urlEncode(String)`**  
Returns the input string as an `application/x-www-form-urlencoded` encoded string.

 **`util.urlDecode(String)`**  
Decodes an `application/x-www-form-urlencoded` encoded string back to its non-encoded form.

**`util.base64Encode(string) : string`**  
Encodes the input into a base64-encoded string.

**`util.base64Decode(string) : string`**  
Decodes the data from a base64-encoded string.

## ID generation utils
<a name="utility-helpers-in-id-gen-js"></a>

### ID generation utils list
<a name="utility-helpers-in-id-gen-list-js"></a>

 **`util.autoId()`**  
Returns a 128-bit randomly generated UUID.

**`util.autoUlid()`**  
Returns a 128-bit randomly generated ULID (Universally Unique Lexicographically Sortable Identifier).

**`util.autoKsuid()`**  
Returns a 128-bit randomly generated KSUID (K-Sortable Unique Identifier) base62 encoded as a String with a length of 27.

## Error utils
<a name="utility-helpers-in-error-js"></a>

### Error utils list
<a name="utility-helpers-in-error-list-js"></a>

 **`util.error(String, String?, Object?, Object?)`**  
Throws a custom error. This can be used in request or response mapping templates if the template detects an error with the request or with the invocation result. Additionally, an `errorType` field, a `data` field, and an `errorInfo` field can be specified. The `data` value will be added to the corresponding `error` block inside `errors` in the GraphQL response.  
`data` will be filtered based on the query selection set. The `errorInfo` value will be added to the corresponding `error` block inside `errors` in the GraphQL response.  
`errorInfo` will **not** be filtered based on the query selection set.

 **`util.appendError(String, String?, Object?, Object?)`**  
Appends a custom error. This can be used in request or response mapping templates if the template detects an error with the request or with the invocation result. Additionally, an `errorType` field, a `data` field, and an `errorInfo` field can be specified. Unlike `util.error(String, String?, Object?, Object?)`, the template evaluation will not be interrupted, so that data can be returned to the caller. The `data` value will be added to the corresponding `error` block inside `errors` in the GraphQL response.  
`data` will be filtered based on the query selection set. The `errorInfo` value will be added to the corresponding `error` block inside `errors` in the GraphQL response.  
`errorInfo` will **not** be filtered based on the query selection set.

## Type and pattern matching utils
<a name="utility-helpers-in-patterns-js"></a>

### Type and pattern matching utils list
<a name="utility-helpers-in-patterns-js-list"></a>

**`util.matches(String, String) : Boolean`**  
Returns true if the specified pattern in the first argument matches the supplied data in the second argument. The pattern must be a Java-style regular expression, as in `util.matches("a*b", "aaaaab")`. The functionality is based on [Pattern](https://docs.oracle.com/javase/8/docs/api/java/util/regex/Pattern.html), which you can reference for further documentation.

**`util.authType()`**  
Returns a String describing the multi-auth type being used by a request, returning either "IAM Authorization", "User Pool Authorization", "Open ID Connect Authorization", or "API Key Authorization".

## Return value behavior utils
<a name="utility-helpers-in-cloudwatch-logs-list-js"></a>

### Return value behavior utils list
<a name="utility-helpers-in-behavior-list-js"></a>

 **`util.escapeJavaScript(String)`**  
Returns the input string as a JavaScript escaped string.

## Resolver authorization utils
<a name="utility-helpers-in-resolver-auth-js"></a>

### Resolver authorization utils list
<a name="utility-helpers-in-resolver-auth-list-js"></a>

 **`util.unauthorized()`**  
Throws `Unauthorized` for the field being resolved. Use this in request or response mapping templates to determine whether to allow the caller to resolve the field.
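
As a sketch, a request handler might reject callers that don't own the item being resolved (the `owner` field here is hypothetical):

```
import { util } from '@aws-appsync/utils';

export function request(ctx) {
	// Hypothetical ownership check against the parent object
	if (ctx.source?.owner !== ctx.identity?.username) {
		util.unauthorized();
	}
	return {};
}
```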

# Built-in modules
<a name="built-in-modules-js"></a>

Modules are a part of the `APPSYNC_JS` runtime and provide utilities to help write JavaScript resolvers and functions. For samples and examples, see the [aws-appsync-resolver-samples](https://github.com/aws-samples/aws-appsync-resolver-samples) GitHub repository.

## DynamoDB module functions
<a name="built-in-ddb-modules"></a>

DynamoDB module functions provide an enhanced experience when interacting with DynamoDB data sources. You can make requests to your DynamoDB data sources using these functions, without adding type mapping. 

Modules are imported using `@aws-appsync/utils/dynamodb`:

```
// Modules are imported using @aws-appsync/utils/dynamodb
import * as ddb from '@aws-appsync/utils/dynamodb';
```

### Functions
<a name="built-in-ddb-modules-functions"></a>

#### Functions list
<a name="built-in-ddb-modules-functions-list"></a>

**`get<T>(payload: GetInput): DynamoDBGetItemRequest`**  
Generates a `DynamoDBGetItemRequest` object to make a [GetItem](https://docs.aws.amazon.com/appsync/latest/devguide/js-resolver-reference-dynamodb.html#js-aws-appsync-resolver-reference-dynamodb-getitem) request to DynamoDB. See [Inputs](#built-in-ddb-modules-inputs) for information about `GetInput`.  

```
import { get } from '@aws-appsync/utils/dynamodb';

export function request(ctx) {
	return get({ key: { id: ctx.args.id } });
}
```

 **`put<T>(payload): DynamoDBPutItemRequest`**  
Generates a `DynamoDBPutItemRequest` object to make a [PutItem](https://docs.aws.amazon.com/appsync/latest/devguide/js-resolver-reference-dynamodb.html#js-aws-appsync-resolver-reference-dynamodb-putitem) request to DynamoDB.  

```
import * as ddb from '@aws-appsync/utils/dynamodb';
import { util } from '@aws-appsync/utils';

export function request(ctx) {
	return ddb.put({ key: { id: util.autoId() }, item: ctx.args });
}
```

**`remove<T>(payload): DynamoDBDeleteItemRequest`**  
Generates a `DynamoDBDeleteItemRequest` object to make a [DeleteItem](https://docs.aws.amazon.com/appsync/latest/devguide/js-resolver-reference-dynamodb.html#js-aws-appsync-resolver-reference-dynamodb-deleteitem) request to DynamoDB.  

```
import * as ddb from '@aws-appsync/utils/dynamodb';

export function request(ctx) {
	return ddb.remove({ key: { id: ctx.args.id } });
}
```

**`scan<T>(payload): DynamoDBScanRequest`**  
Generates a `DynamoDBScanRequest` to make a [Scan](https://docs.aws.amazon.com/appsync/latest/devguide/js-resolver-reference-dynamodb.html#js-aws-appsync-resolver-reference-dynamodb-scan) request to DynamoDB.  

```
import * as ddb from '@aws-appsync/utils/dynamodb';

export function request(ctx) {
	const { limit = 10, nextToken } = ctx.args;
	return ddb.scan({ limit, nextToken });
}
```

**`sync<T>(payload): DynamoDBSyncRequest`**  
Generates a `DynamoDBSyncRequest` object to make a [Sync](https://docs.aws.amazon.com/appsync/latest/devguide/js-resolver-reference-dynamodb.html#js-aws-appsync-resolver-reference-dynamodb-sync) request. The request only receives the data altered since the last query (delta updates). Requests can only be made to versioned DynamoDB data sources.  

```
import * as ddb from '@aws-appsync/utils/dynamodb';

export function request(ctx) {
	const { limit = 10, nextToken, lastSync } = ctx.args;
	return ddb.sync({ limit, nextToken, lastSync });
}
```

**`update<T>(payload): DynamoDBUpdateItemRequest`**  
Generates a `DynamoDBUpdateItemRequest` object to make an [UpdateItem](https://docs.aws.amazon.com/appsync/latest/devguide/js-resolver-reference-dynamodb.html#js-aws-appsync-resolver-reference-dynamodb-updateitem) request to DynamoDB.
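
As a sketch mirroring the other module functions (assuming `ctx.args` carries the key along with the changed fields):

```
import * as ddb from '@aws-appsync/utils/dynamodb';

export function request(ctx) {
	const { id, ...item } = ctx.args;
	return ddb.update({ key: { id }, update: item });
}
```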

### Operations
<a name="built-in-ddb-modules-operations"></a>

Operation helpers allow you to take specific actions on parts of your data during updates. To get started, import `operations` from `@aws-appsync/utils/dynamodb`:

```
// Import the operations helper from the DynamoDB module
import { operations } from '@aws-appsync/utils/dynamodb';
```

#### Operations list
<a name="built-in-ddb-modules-operations-list"></a>

 **`add<T>(payload)`**  
A helper function that adds a new attribute to an item when updating DynamoDB.  
**Example**  
To add an address (street, city, and zip code) to an existing DynamoDB item using the ID value:  

```
import { update, operations } from '@aws-appsync/utils/dynamodb';

export function request(ctx) {
	const updateObj = {
		address: operations.add({
			street1: '123 Main St',
			city: 'New York',
			zip: '10001',
		}),
	};
	return update({ key: { id: 1 }, update: updateObj });
}
```

**`append<T>(payload)`**  
A helper function that appends a payload to the existing list in DynamoDB.  
**Example**  
To append newly added friend IDs (`newFriendIds`) to an existing friends list (`friendsIds`) during an update:  

```
import { update, operations } from '@aws-appsync/utils/dynamodb';

export function request(ctx) {
	const newFriendIds = [101, 104, 111];
	const updateObj = {
		friendsIds: operations.append(newFriendIds),
	};
	return update({ key: { id: 1 }, update: updateObj });
}
```

**`decrement(by?)`**  
A helper function that decrements the existing attribute value in the item when updating DynamoDB.  
**Example**  
To decrement a friends counter (`friendsCount`) by 10:  

```
import { update, operations } from '@aws-appsync/utils/dynamodb';

export function request(ctx) {
	const updateObj = {
		friendsCount: operations.decrement(10),
	};
	return update({ key: { id: 1 }, update: updateObj });
}
```

**`increment(by?)`**  
A helper function that increments the existing attribute value in the item when updating DynamoDB.  
**Example**  
To increment a friends counter (`friendsCount`) by 10:  

```
import { update, operations } from '@aws-appsync/utils/dynamodb';

export function request(ctx) {
	const updateObj = {
		friendsCount: operations.increment(10),
	};
	return update({ key: { id: 1 }, update: updateObj });
}
```

**`prepend<T>(payload)`**  
A helper function that prepends to the existing list in DynamoDB.  
**Example**  
To prepend newly added friend IDs (`newFriendIds`) to an existing friends list (`friendsIds`) during an update:  

```
import { update, operations } from '@aws-appsync/utils/dynamodb';

export function request(ctx) {
	const newFriendIds = [101, 104, 111];
	const updateObj = {
		friendsIds: operations.prepend(newFriendIds),
	};
	return update({ key: { id: 1 }, update: updateObj });
}
```

**`replace<T>(payload)`**  
A helper function that replaces an existing attribute when updating an item in DynamoDB. This is useful when you want to update the entire object or subobject in the attribute, not just the keys in the payload.  
**Example**  
To replace an address (street, city, and zip code) in an `info` object:  

```
import { update, operations } from '@aws-appsync/utils/dynamodb';

export function request(ctx) {
	const updateObj = {
		info: {
			address: operations.replace({
				street1: '123 Main St',
				city: 'New York',
				zip: '10001',
			}),
		},
	};
	return update({ key: { id: 1 }, update: updateObj });
}
```

**`updateListItem<T>(payload, index)`**  
A helper function that replaces an item in a list.  
**Example**  
In the scope of the update (`newFriendIds`), this example uses `updateListItem` to update the ID values of the second item (index: `1`, new ID: `102`) and third item (index: `2`, new ID: `112`) in a list (`friendsIds`).  

```
import { update, operations as ops } from '@aws-appsync/utils/dynamodb';

export function request(ctx) {
	const newFriendIds = [
		ops.updateListItem('102', 1), ops.updateListItem('112', 2)
	];
	const updateObj = { friendsIds: newFriendIds };
	return update({ key: { id: 1 }, update: updateObj });
}
```

### Inputs
<a name="built-in-ddb-modules-inputs"></a>

#### Inputs list
<a name="built-in-ddb-modules-inputs-list"></a>

 **`Type GetInput<T>`**  

```
GetInput<T>: { 
    consistentRead?: boolean; 
    key: DynamoDBKey<T>; 
}
```
**Type Declaration**  
+ `consistentRead?: boolean` (optional)

  An optional boolean to specify whether you want to perform a strongly consistent read with DynamoDB.
+ `key: DynamoDBKey<T>` (required)

  A required parameter that specifies the key of the item in DynamoDB. DynamoDB items may have a single hash key or hash and sort keys.

**`Type PutInput<T>`**  

```
PutInput<T>: { 
    _version?: number; 
    condition?: DynamoDBFilterObject<T> | null; 
    customPartitionKey?: string; 
    item: Partial<T>; 
    key: DynamoDBKey<T>; 
    populateIndexFields?: boolean; 
}
```
**Type Declaration**  
+ `_version?: number` (optional)
+ `condition?: DynamoDBFilterObject<T> | null` (optional)

  When you put an object in a DynamoDB table, you can optionally specify a conditional expression that controls whether the request should succeed or not based on the state of the object already in DynamoDB before the operation is performed.
+ `customPartitionKey?: string` (optional)

  When enabled, this string value modifies the format of the `ds_sk` and `ds_pk` records used by the delta sync table when versioning has been enabled. When enabled, the processing of the `populateIndexFields` entry is also enabled. 
+ `item: Partial<T>` (required)

  The rest of the attributes of the item to be placed into DynamoDB.
+ `key: DynamoDBKey<T>` (required)

  A required parameter that specifies the key of the item in DynamoDB on which the put will be performed. DynamoDB items may have a single hash key or hash and sort keys.
+ `populateIndexFields?: boolean` (optional)

  A boolean value that, when enabled along with the `customPartitionKey`, creates new entries for each record in the delta sync table, specifically in the `gsi_ds_pk` and `gsi_ds_sk` columns. For more information, see [Conflict detection and sync](https://docs.aws.amazon.com/appsync/latest/devguide/conflict-detection-and-sync.html) in the *AWS AppSync Developer Guide*.

**`Type QueryInput<T>`**  

```
QueryInput<T>: ScanInput<T> & { 
    query: DynamoDBKeyCondition<Required<T>>; 
}
```
**Type Declaration**  
+ `query: DynamoDBKeyCondition<Required<T>>` (required)

  Specifies a key condition that describes items to query. For a given index, the condition for a partition key should be an equality and the sort key a comparison or a `beginsWith` (when it's a string). Only number and string types are supported for partition and sort keys.

  **Example**

  Take the `User` type below:

  ```
  type User = {
    id: string;
    name: string;
    age: number;
    isVerified: boolean;
    friendsIds: string[] 
  }
  ```

  The query can only include the following fields: `id`, `name`, and `age`:

  ```
  const query: QueryInput<User> = {
      name: { eq: 'John' },
      age: { gt: 20 },
  }
  ```

**`Type RemoveInput<T>`**  

```
RemoveInput<T>: { 
    _version?: number; 
    condition?: DynamoDBFilterObject<T>; 
    customPartitionKey?: string; 
    key: DynamoDBKey<T>; 
    populateIndexFields?: boolean; 
}
```
**Type Declaration**  
+ `_version?: number` (optional)
+ `condition?: DynamoDBFilterObject<T>` (optional)

  When you remove an object in DynamoDB, you can optionally specify a conditional expression that controls whether the request should succeed or not based on the state of the object already in DynamoDB before the operation is performed.

  **Example**

  The following example is a `DeleteItem` expression containing a condition that allows the operation to succeed only if the owner of the document matches the user making the request.

  ```
  type Task = {
    id: string;
    title: string;
    description: string;
    owner: string;
    isComplete: boolean;
  }
  const condition: DynamoDBFilterObject<Task> = {
    owner: { eq: 'XXXXXXXXXXXXXXXX' },
  }
  
  remove<Task>({
     key: {
       id: 'XXXXXXXXXXXXXXXX',
    },
    condition,
  });
  ```
+ `customPartitionKey?: string` (optional)

  When enabled, the `customPartitionKey` value modifies the format of the `ds_sk` and `ds_pk` records used by the delta sync table when versioning has been enabled. When enabled, the processing of the `populateIndexFields` entry is also enabled. 
+ `key: DynamoDBKey<T>` (required)

  A required parameter that specifies the key of the item in DynamoDB that is being removed. DynamoDB items may have a single hash key or hash and sort keys.

  **Example**

  If a `User` only has the hash key with a user `id`, then the key would look like this:

  ```
  type User = {
  	id: number
  	name: string
  	age: number
  	isVerified: boolean
  }
  const key: DynamoDBKey<User> = {
  	id: 1,
  }
  ```

  If the table user has a hash key (`id`) and sort key (`name`), then the key would look like this:

  ```
  type User = {
  	id: number
  	name: string
  	age: number
  	isVerified: boolean
  	friendsIds: string[]
  }
  
  const key: DynamoDBKey<User> = {
  	id: 1,
  	name: 'XXXXXXXXXX',
  }
  ```
+ `populateIndexFields?: boolean` (optional)

  A boolean value that, when enabled along with the `customPartitionKey`, creates new entries for each record in the delta sync table, specifically in the `gsi_ds_pk` and `gsi_ds_sk` columns.

**`Type ScanInput<T>`**  

```
ScanInput<T>: { 
    consistentRead?: boolean | null; 
    filter?: DynamoDBFilterObject<T> | null; 
    index?: string | null; 
    limit?: number | null; 
    nextToken?: string | null; 
    scanIndexForward?: boolean | null; 
    segment?: number; 
    select?: DynamoDBSelectAttributes; 
    totalSegments?: number; 
}
```
**Type Declaration**  
+ `consistentRead?: boolean | null` (optional)

  An optional boolean to indicate consistent reads when querying DynamoDB. The default value is `false`.
+ `filter?: DynamoDBFilterObject<T> | null` (optional)

  An optional filter to apply to the results after retrieving them from the table.
+ `index?: string | null` (optional)

  An optional name of the index to scan.
+ `limit?: number | null` (optional)

  An optional max number of results to return.
+ `nextToken?: string | null` (optional)

  An optional pagination token to continue a previous query. This would have been obtained from a previous query.
+ `scanIndexForward?: boolean | null` (optional)

  An optional boolean to indicate whether the query is performed in ascending or descending order. By default, this value is set to `true`.
+ `segment?: number` (optional)
+ `select?: DynamoDBSelectAttributes` (optional)

  Attributes to return from DynamoDB. By default, the AWS AppSync DynamoDB resolver only returns attributes that are projected into the index. The supported values are:
  + `ALL_ATTRIBUTES`

    Returns all the item attributes from the specified table or index. If you query a local secondary index, DynamoDB fetches the entire item from the parent table for each matching item in the index. If the index is configured to project all item attributes, all of the data can be obtained from the local secondary index and no fetching is required.
  + `ALL_PROJECTED_ATTRIBUTES`

    Returns all attributes that have been projected into the index. If the index is configured to project all attributes, this return value is equivalent to specifying `ALL_ATTRIBUTES`.
  + `SPECIFIC_ATTRIBUTES`

    Returns only the attributes listed in `ProjectionExpression`. This return value is equivalent to specifying `ProjectionExpression` without specifying any value for `AttributesToGet`.
+ `totalSegments?: number` (optional)

**`Type DynamoDBSyncInput<T>`**  

```
DynamoDBSyncInput<T>: { 
    basePartitionKey?: string; 
    deltaIndexName?: string; 
    filter?: DynamoDBFilterObject<T> | null; 
    lastSync?: number; 
    limit?: number | null; 
    nextToken?: string | null; 
}
```
**Type Declaration**  
+ `basePartitionKey?: string` (optional)

  The partition key of the base table to be used when performing a Sync operation. This field allows a Sync operation to be performed when the table utilizes a custom partition key.
+ `deltaIndexName?: string` (optional)

  The index used for the Sync operation. This index is required to enable a Sync operation on the whole delta store table when the table uses a custom partition key. The Sync operation will be performed on the GSI (created on `gsi_ds_pk` and `gsi_ds_sk`).
+ `filter?: DynamoDBFilterObject<T> | null` (optional)

  An optional filter to apply to the results after retrieving them from the table.
+ `lastSync?: number` (optional)

  The moment, in epoch milliseconds, at which the last successful Sync operation started. If specified, only items that have changed after `lastSync` are returned. This field should only be populated after retrieving all pages from an initial Sync operation. If omitted, results from the base table will be returned. Otherwise, results from the delta table will be returned.
+ `limit?: number | null` (optional)

  An optional maximum number of items to evaluate at a single time. If omitted, the default limit will be set to `100` items. The maximum value for this field is `1000` items.
+ `nextToken?: string | null` (optional)

**`Type DynamoDBUpdateInput<T>`**  

```
DynamoDBUpdateInput<T>: { 
    _version?: number; 
    condition?: DynamoDBFilterObject<T>; 
    customPartitionKey?: string; 
    key: DynamoDBKey<T>; 
    populateIndexFields?: boolean; 
    update: DynamoDBUpdateObject<T>; 
}
```
**Type Declaration**  
+ `_version?: number` (optional)
+ `condition?: DynamoDBFilterObject<T>` (optional)

  When you update an object in DynamoDB, you can optionally specify a conditional expression that controls whether the request should succeed or not based on the state of the object already in DynamoDB before the operation is performed.
+ `customPartitionKey?: string` (optional)

  When enabled, the `customPartitionKey` value modifies the format of the `ds_sk` and `ds_pk` records used by the delta sync table when versioning has been enabled. When enabled, the processing of the `populateIndexFields` entry is also enabled. 
+ `key: DynamoDBKey<T>` (required)

  A required parameter that specifies the key of the item in DynamoDB that is being updated. DynamoDB items may have a single hash key or hash and sort keys.
+ `populateIndexFields?: boolean` (optional)

  A boolean value that, when enabled along with the `customPartitionKey`, creates new entries for each record in the delta sync table, specifically in the `gsi_ds_pk` and `gsi_ds_sk` columns. 
+ `update: DynamoDBUpdateObject<T>` (required)

  An object that specifies the attributes to be updated along with the new values for them. The update object can be used with `add`, `remove`, `replace`, `increment`, `decrement`, `append`, `prepend`, `updateListItem`.

## Amazon RDS module functions
<a name="built-in-rds-modules"></a>

Amazon RDS module functions provide an enhanced experience when interacting with databases configured with the Amazon RDS Data API. The module is imported using `@aws-appsync/utils/rds`: 

```
import * as rds from '@aws-appsync/utils/rds';
```

Functions can also be imported individually. For instance, the import below uses `sql`:

```
import { sql } from '@aws-appsync/utils/rds';
```

### Functions
<a name="built-in-rds-modules-functions"></a>

You can use the AWS AppSync RDS module's utility helpers to interact with your database.

#### Select
<a name="built-in-rds-modules-functions-select"></a>

The `select` utility creates a `SELECT` statement to query your relational database. 

**Basic use**

In its basic form, you can specify the table you want to query:

```
import { select, createPgStatement } from '@aws-appsync/utils/rds';

export function request(ctx) {

    // Generates statement:
    // SELECT * FROM "persons"
    return createPgStatement(select({table: 'persons'}));
}
```

Note that you can also specify the schema in your table identifier:

```
import { select, createPgStatement } from '@aws-appsync/utils/rds';

export function request(ctx) {

    // Generates statement:
    // SELECT * FROM "private"."persons"
    return createPgStatement(select({table: 'private.persons'}));
}
```

**Specifying columns**

You can specify columns with the `columns` property. If this isn't set to a value, it defaults to `*`:

```
export function request(ctx) {

    // Generates statement:
    // SELECT "id", "name"
    // FROM "persons"
    return createPgStatement(select({
        table: 'persons',
        columns: ['id', 'name']
    }));
}
```

You can specify a column's table as well:

```
export function request(ctx) {

    // Generates statement: 
    // SELECT "id", "persons"."name"
    // FROM "persons"
    return createPgStatement(select({
        table: 'persons',
        columns: ['id', 'persons.name']
    }));
}
```

**Limits and offsets**

You can apply `limit` and `offset` to the query:

```
export function request(ctx) {

    // Generates statement: 
    // SELECT "id", "name"
    // FROM "persons"
    // LIMIT :limit
    // OFFSET :offset
    return createPgStatement(select({
        table: 'persons',
        columns: ['id', 'name'],
        limit: 10,
        offset: 40
    }));
}
```

**Order By**

You can sort your results with the `orderBy` property. Provide an array of objects specifying the column and an optional `dir` property:

```
export function request(ctx) {

    // Generates statement: 
    // SELECT "id", "name" FROM "persons"
    // ORDER BY "name", "id" DESC
    return createPgStatement(select({
        table: 'persons',
        columns: ['id', 'name'],
        orderBy: [{column: 'name'}, {column: 'id', dir: 'DESC'}]
    }));
}
```

**Filters**

You can build filters by using the special condition object:

```
export function request(ctx) {

    // Generates statement:
    // SELECT "id", "name"
    // FROM "persons"
    // WHERE "name" = :NAME
    return createPgStatement(select({
        table: 'persons',
        columns: ['id', 'name'],
        where: {name: {eq: 'Stephane'}}
    }));
}
```

You can also combine filters:

```
export function request(ctx) {

    // Generates statement:
    // SELECT "id", "name"
    // FROM "persons"
    // WHERE "name" = :NAME and "id" > :ID
    return createPgStatement(select({
        table: 'persons',
        columns: ['id', 'name'],
        where: {name: {eq: 'Stephane'}, id: {gt: 10}}
    }));
}
```

You can also create `OR` statements:

```
export function request(ctx) {

    // Generates statement:
    // SELECT "id", "name"
    // FROM "persons"
    // WHERE "name" = :NAME OR "id" > :ID
    return createPgStatement(select({
        table: 'persons',
        columns: ['id', 'name'],
        where: { or: [
            { name: { eq: 'Stephane'} },
            { id: { gt: 10 } }
        ]}
    }));
}
```

You can also negate a condition with `not`:

```
export function request(ctx) {

    // Generates statement:
    // SELECT "id", "name"
    // FROM "persons"
    // WHERE NOT ("name" = :NAME AND "id" > :ID)
    return createPgStatement(select({
        table: 'persons',
        columns: ['id', 'name'],
        where: { not: [
            { name: { eq: 'Stephane'} },
            { id: { gt: 10 } }
        ]}
    }));
}
```

You can also use the following operators to compare values:


| Operator | Description | Possible value types | 
| --- |--- |--- |
| eq | Equal | number, string, boolean | 
| ne | Not equal | number, string, boolean | 
| le | Less than or equal | number, string | 
| lt | Less than | number, string | 
| ge | Greater than or equal | number, string | 
| gt | Greater than | number, string | 
| contains | Like | string | 
| notContains | Not like | string | 
| beginsWith | Starts with prefix | string | 
| between | Between two values | number, string | 
| attributeExists | The attribute is not null | number, string, boolean | 
| size | Checks the length of the element | string | 
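
To illustrate how a `where` object maps onto a SQL predicate, here is a minimal, hypothetical serializer for a flat condition object. It covers only the single-operator comparisons from the table above (the real module also handles `and`/`or`/`not`, identifier escaping, and binds values through the variable map), so treat it as a sketch rather than the module's implementation:

```javascript
// Hypothetical sketch: render a flat condition object as a WHERE clause.
// The real RDS module binds values through placeholders (e.g. :NAME) and
// a variable map; this mock only emits the placeholder names.
const SQL_OPS = { eq: '=', ne: '<>', le: '<=', lt: '<', ge: '>=', gt: '>' };

function whereClause(where) {
  const parts = [];
  for (const [column, conditions] of Object.entries(where)) {
    for (const op of Object.keys(conditions)) {
      parts.push(`"${column}" ${SQL_OPS[op]} :${column.toUpperCase()}`);
    }
  }
  return 'WHERE ' + parts.join(' AND ');
}

// whereClause({ name: { eq: 'Stephane' }, id: { gt: 10 } })
// produces: WHERE "name" = :NAME AND "id" > :ID
```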

#### Insert
<a name="built-in-rds-modules-functions-insert"></a>

The `insert` utility provides a straightforward way of inserting single rows into your database with the `INSERT` operation.

**Single item insertions**

To insert an item, specify the table and then pass in your object of values. The object keys are mapped to your table columns. Column names are automatically escaped, and values are sent to the database using the variable map:

```
import { insert, createMySQLStatement } from '@aws-appsync/utils/rds';

export function request(ctx) {
    const { input: values } = ctx.args;
    const insertStatement = insert({ table: 'persons', values });
    
    // Generates statement:
    // INSERT INTO `persons`(`name`)
    // VALUES(:NAME)
    return createMySQLStatement(insertStatement)
}
```

**MySQL use case**

You can combine an `insert` followed by a `select` to retrieve your inserted row:

```
import { insert, select, createMySQLStatement } from '@aws-appsync/utils/rds';

export function request(ctx) {
    const { input: values } = ctx.args;
    const insertStatement = insert({  table: 'persons', values });
    const selectStatement = select({
        table: 'persons',
        columns: '*',
        where: { id: { eq: values.id } },
        limit: 1,
    });
    
    // Generates statement:
    // INSERT INTO `persons`(`name`)
    // VALUES(:NAME)
    // and
    // SELECT *
    // FROM `persons`
    // WHERE `id` = :ID
    return createMySQLStatement(insertStatement, selectStatement)
}
```

**Postgres use case**

With Postgres, you can use the [`RETURNING` clause](https://www.postgresql.org/docs/current/dml-returning.html) to obtain data from the row that you inserted. It accepts `*` or an array of column names:

```
import { insert, createPgStatement } from '@aws-appsync/utils/rds';

export function request(ctx) {
    const { input: values } = ctx.args;
    const insertStatement = insert({
        table: 'persons',
        values,
        returning: '*'
    });

    // Generates statement:
    // INSERT INTO "persons"("name")
    // VALUES(:NAME)
    // RETURNING *
    return createPgStatement(insertStatement)
}
```

#### Update
<a name="built-in-rds-modules-functions-update"></a>

The `update` utility allows you to update existing rows. You can use the condition object to apply changes to the specified columns in all the rows that satisfy the condition. For example, let's say we have a schema that allows us to make this mutation. We want to update the `name` of `Person` with the `id` value of `3` but only if we've known them (`known_since`) since the year `2000`:

```
mutation Update {
    updatePerson(
        input: {id: 3, name: "Jon"},
        condition: {known_since: {ge: "2000"}}
    ) {
    id
    name
  }
}
```

Our update resolver looks like this:

```
import { update, createPgStatement } from '@aws-appsync/utils/rds';

export function request(ctx) {
    const { input: { id, ...values }, condition } = ctx.args;
    const where = {
        ...condition,
        id: { eq: id },
    };
    const updateStatement = update({
        table: 'persons',
        values,
        where,
        returning: ['id', 'name'],
    });

    // Generates statement:
    // UPDATE "persons"
    // SET "name" = :NAME, "birthday" = :BDAY, "country" = :COUNTRY
    // WHERE "id" = :ID
    // RETURNING "id", "name"
    return createPgStatement(updateStatement)
}
```

We can add a check to our condition to make sure that only the row with the primary key `id` equal to `3` is updated. As with Postgres `insert` statements, you can use `returning` to retrieve the modified data. 

#### Remove
<a name="built-in-rds-modules-functions-remove"></a>

The `remove` utility allows you to delete existing rows. You can use the condition object to apply the deletion to all rows that satisfy the condition. Note that `delete` is a reserved keyword in JavaScript, so the utility is named `remove` instead:

```
import { remove, createPgStatement } from '@aws-appsync/utils/rds';

export function request(ctx) {
    const { input: { id }, condition } = ctx.args;
    const where = { ...condition, id: { eq: id } };
    const deleteStatement = remove({
        table: 'persons',
        where,
        returning: ['id', 'name'],
    });

    // Generates statement:
    // DELETE FROM "persons"
    // WHERE "id" = :ID
    // RETURNING "id", "name"
    return createPgStatement(deleteStatement)
}
```

### Casting
<a name="built-in-rds-modules-casting"></a>

In some cases, you may want more specificity about the correct object type to use in your statement. You can use the provided type hints to specify the type of your parameters. AWS AppSync supports the [same type hints](https://docs.aws.amazon.com//rdsdataservice/latest/APIReference/API_SqlParameter.html#rdsdtataservice-Type-SqlParameter-typeHint) as the Data API. You can cast your parameters by using the `typeHint` functions from the AWS AppSync `rds` module. 

The following example allows you to send an array as a value that is cast as a JSON object. We use the `->` operator to retrieve the element at index `2` in the JSON array:

```
import { sql, createPgStatement, toJsonObject, typeHint } from '@aws-appsync/utils/rds';

export function request(ctx) {
    const arr = ctx.args.list_of_ids
    const statement = sql`select ${typeHint.JSON(arr)}->2 as value`
    return createPgStatement(statement)
}

export function response(ctx) {
    return toJsonObject(ctx.result)[0][0].value
}
```

Casting is also useful when handling and comparing `DATE`, `TIME`, and `TIMESTAMP`:

```
import { select, createPgStatement, typeHint } from '@aws-appsync/utils/rds';

export function request(ctx) {
    const when = ctx.args.when
    const statement = select({
        table: 'persons',
        where: { createdAt : { gt: typeHint.DATETIME(when) } }
    })
    return createPgStatement(statement)
}
```

Here's another example showing how you can send the current date and time:

```
import { util } from '@aws-appsync/utils';
import { sql, createPgStatement, typeHint } from '@aws-appsync/utils/rds';

export function request(ctx) {
    const now = util.time.nowFormatted('YYYY-MM-dd HH:mm:ss')
    return createPgStatement(sql`select ${typeHint.TIMESTAMP(now)}`)
}
```

**Available type hints**
+ `typeHint.DATE` - The corresponding parameter is sent as an object of the `DATE` type to the database. The accepted format is `YYYY-MM-DD`.
+ `typeHint.DECIMAL` - The corresponding parameter is sent as an object of the `DECIMAL` type to the database.
+ `typeHint.JSON` - The corresponding parameter is sent as an object of the `JSON` type to the database.
+ `typeHint.TIME` - The corresponding string parameter value is sent as an object of the `TIME` type to the database. The accepted format is `HH:MM:SS[.FFF]`. 
+ `typeHint.TIMESTAMP` - The corresponding string parameter value is sent as an object of the `TIMESTAMP` type to the database. The accepted format is `YYYY-MM-DD HH:MM:SS[.FFF]`.
+ `typeHint.UUID` - The corresponding string parameter value is sent as an object of the `UUID` type to the database.

# Runtime utilities
<a name="runtime-utils-js"></a>

The `runtime` library provides utilities to control or modify the runtime properties of your resolvers and functions.

## Runtime utils list
<a name="runtime-utils-list-js"></a>

 **`runtime.earlyReturn(obj?: unknown, returnOptions?: {skipTo: 'END' | 'NEXT'}): never`**  
Invoking this function halts the execution of the current handler, whether that is an AWS AppSync function or a resolver (unit or pipeline), depending on the current context. The specified object is returned as the result.  
+ When called in an AWS AppSync function request handler, the data source and response handler are skipped, and the next function request handler (or the pipeline resolver response handler if this was the last AWS AppSync function) is called.
+ When called in an AWS AppSync pipeline resolver request handler, the pipeline execution is skipped, and the pipeline resolver response handler is called immediately.
+ When `returnOptions` is given with `skipTo` set to "END", the pipeline execution is skipped, and the pipeline resolver response handler is called immediately.
+ When `returnOptions` is given with `skipTo` set to "NEXT", the function execution is skipped, and the next pipeline handler is called.
**Example**  

```
import { runtime } from '@aws-appsync/utils'

export function request(ctx) {
  runtime.earlyReturn({ hello: 'world' })
  // code below is not executed
  return ctx.args
}

// never called because request returned early
export function response(ctx) {
  return ctx.result
}
```

# Time helpers in util.time
<a name="time-helpers-in-util-time-js"></a>

The `util.time` variable contains datetime methods to help generate timestamps, convert between datetime formats, and parse datetime strings. The syntax for datetime formats is based on [DateTimeFormatter](https://docs.oracle.com/javase/8/docs/api/java/time/format/DateTimeFormatter.html), which you can reference for further documentation.

## Time utils list
<a name="utility-helpers-in-time-list-js"></a>

 **`util.time.nowISO8601()`**  
Returns a String representation of UTC in [ISO8601 format](https://en.wikipedia.org/wiki/ISO_8601).

 **`util.time.nowEpochSeconds()`**  
Returns the number of seconds from the epoch of 1970-01-01T00:00:00Z to now.

 **`util.time.nowEpochMilliSeconds()`**  
Returns the number of milliseconds from the epoch of 1970-01-01T00:00:00Z to now.

 **`util.time.nowFormatted(String)`**  
Returns a string of the current timestamp in UTC using the specified format from a String input type.

 **`util.time.nowFormatted(String, String)`**  
Returns a string of the current timestamp for a timezone using the specified format and timezone from String input types.

 **`util.time.parseFormattedToEpochMilliSeconds(String, String)`**  
Parses a timestamp passed as a String along with a format containing a time zone, then returns the timestamp as milliseconds since epoch.

 **`util.time.parseFormattedToEpochMilliSeconds(String, String, String)`**  
Parses a timestamp passed as a String along with a format and time zone, then returns the timestamp as milliseconds since epoch.

 **`util.time.parseISO8601ToEpochMilliSeconds(String)`**  
Parses an ISO8601 timestamp passed as a String, then returns the timestamp as milliseconds since epoch.

 **`util.time.epochMilliSecondsToSeconds(long)`**  
Converts an epoch milliseconds timestamp to an epoch seconds timestamp.

 **`util.time.epochMilliSecondsToISO8601(long)`**  
Converts an epoch milliseconds timestamp to an ISO8601 timestamp.

 **`util.time.epochMilliSecondsToFormatted(long, String)`**  
Converts an epoch milliseconds timestamp, passed as long, to a timestamp formatted according to the supplied format in UTC.

 **`util.time.epochMilliSecondsToFormatted(long, String, String)`**  
Converts an epoch milliseconds timestamp, passed as a long, to a timestamp formatted according to the supplied format in the supplied timezone.
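
Outside of the `APPSYNC_JS` runtime, the epoch conversion helpers above reduce to simple arithmetic and `Date` formatting. The following plain JavaScript equivalents are illustrative only (the actual implementations are provided by the runtime):

```javascript
// Plain-JS equivalents of the epoch helpers above (illustrative only;
// the actual util.time implementations are provided by the runtime).
function epochMilliSecondsToSeconds(ms) {
  return Math.floor(ms / 1000);
}

function epochMilliSecondsToISO8601(ms) {
  // toISOString always renders in UTC, matching the helper's behavior.
  return new Date(ms).toISOString();
}

// epochMilliSecondsToSeconds(1672531200123) → 1672531200
// epochMilliSecondsToISO8601(0) → "1970-01-01T00:00:00.000Z"
```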

# DynamoDB helpers in util.dynamodb
<a name="dynamodb-helpers-in-util-dynamodb-js"></a>

`util.dynamodb` contains helper methods that make it easier to write and read data to Amazon DynamoDB, such as automatic type mapping and formatting. 

## toDynamoDB
<a name="utility-helpers-in-toDynamoDB-js"></a>

### toDynamoDB utils list
<a name="utility-helpers-in-toDynamoDB-list-js"></a>

 **`util.dynamodb.toDynamoDB(Object)`**   
General object conversion tool for DynamoDB that converts input objects to the appropriate DynamoDB representation. It's opinionated about how it represents some types: e.g., it will use lists ("L") rather than sets ("SS", "NS", "BS"). This returns an object that describes the DynamoDB attribute value.  
**String example**  

```
Input:      util.dynamodb.toDynamoDB("foo")
Output:     { "S" : "foo" }
```
**Number example**  

```
Input:      util.dynamodb.toDynamoDB(12345)
Output:     { "N" : 12345 }
```
**Boolean example**  

```
Input:      util.dynamodb.toDynamoDB(true)
Output:     { "BOOL" : true }
```
**List example**  

```
Input:      util.dynamodb.toDynamoDB([ "foo", 123, { "bar" : "baz" } ])
Output:     {
               "L" : [
                   { "S" : "foo" },
                   { "N" : 123 },
                   {
                       "M" : {
                           "bar" : { "S" : "baz" }
                       }
                   }
               ]
           }
```
**Map example**  

```
Input:      util.dynamodb.toDynamoDB({ "foo": "bar", "baz" : 1234, "beep": [ "boop"] })
Output:     {
               "M" : {
                   "foo"  : { "S" : "bar" },
                   "baz"  : { "N" : 1234 },
                   "beep" : {
                       "L" : [
                           { "S" : "boop" }
                       ]
                   }
               }
           }
```
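
The mapping illustrated by the examples above can be sketched as a small recursive function. This is an illustrative mock of the type-mapping rules, not the runtime implementation (notably, it does not cover sets or binary values):

```javascript
// Illustrative sketch of the toDynamoDB type mapping: strings → S,
// numbers → N, booleans → BOOL, null → NULL, arrays → L, objects → M.
function toDynamoDB(value) {
  if (value === null) return { NULL: null };
  switch (typeof value) {
    case 'string':
      return { S: value };
    case 'number':
      return { N: value };
    case 'boolean':
      return { BOOL: value };
  }
  if (Array.isArray(value)) {
    return { L: value.map(toDynamoDB) };
  }
  const M = {};
  for (const [k, v] of Object.entries(value)) {
    M[k] = toDynamoDB(v);
  }
  return { M };
}

// toDynamoDB(["foo", 123]) → { L: [ { S: "foo" }, { N: 123 } ] }
```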

## toString utils
<a name="utility-helpers-in-toString-js"></a>

### toString utils list
<a name="utility-helpers-in-toString-list-js"></a>

**`util.dynamodb.toString(String)`**  
Converts an input string to the DynamoDB string format. This returns an object that describes the DynamoDB attribute value.  

```
Input:      util.dynamodb.toString("foo")
Output:     { "S" : "foo" }
```

 **`util.dynamodb.toStringSet(List<String>)`**  
Converts a list with Strings to the DynamoDB string set format. This returns an object that describes the DynamoDB attribute value.  

```
Input:      util.dynamodb.toStringSet([ "foo", "bar", "baz" ])
Output:     { "SS" : [ "foo", "bar", "baz" ] }
```

## toNumber utils
<a name="utility-helpers-in-toNumber-js"></a>

### toNumber utils list
<a name="utility-helpers-in-toNumber-list-js"></a>

 **`util.dynamodb.toNumber(Number)`**  
Converts a number to the DynamoDB number format. This returns an object that describes the DynamoDB attribute value.  

```
Input:      util.dynamodb.toNumber(12345)
Output:     { "N" : 12345 }
```

 **`util.dynamodb.toNumberSet(List<Number>)`**  
Converts a list of numbers to the DynamoDB number set format. This returns an object that describes the DynamoDB attribute value.  

```
Input:      util.dynamodb.toNumberSet([ 1, 23, 4.56 ])
Output:     { "NS" : [ 1, 23, 4.56 ] }
```

## toBinary utils
<a name="utility-helpers-in-toBinary-js"></a>

### toBinary utils list
<a name="utility-helpers-in-toBinary-list-js"></a>

 **`util.dynamodb.toBinary(String)`**  
Converts binary data encoded as a base64 string to DynamoDB binary format. This returns an object that describes the DynamoDB attribute value.  

```
Input:      util.dynamodb.toBinary("foo")
Output:     { "B" : "foo" }
```

 **`util.dynamodb.toBinarySet(List<String>)`**  
Converts a list of binary data encoded as base64 strings to DynamoDB binary set format. This returns an object that describes the DynamoDB attribute value.  

```
Input:      util.dynamodb.toBinarySet([ "foo", "bar", "baz" ])
Output:     { "BS" : [ "foo", "bar", "baz" ] }
```

## toBoolean utils
<a name="utility-helpers-in-toBoolean-js"></a>

### toBoolean utils list
<a name="utility-helpers-in-toBoolean-list-js"></a>

 **`util.dynamodb.toBoolean(Boolean)`**  
Converts a Boolean to the appropriate DynamoDB Boolean format. This returns an object that describes the DynamoDB attribute value.  

```
Input:      util.dynamodb.toBoolean(true)
Output:     { "BOOL" : true }
```

## toNull utils
<a name="utility-helpers-in-toNull-js"></a>

### toNull utils list
<a name="utility-helpers-in-toNull-list-js"></a>

 **`util.dynamodb.toNull()`**  
Returns a null in DynamoDB null format. This returns an object that describes the DynamoDB attribute value.  

```
Input:      util.dynamodb.toNull()
Output:     { "NULL" : null }
```

## toList utils
<a name="utility-helpers-in-toList-js"></a>

### toList utils list
<a name="utility-helpers-in-toList-list-js"></a>

**`util.dynamodb.toList(List)`**  
Converts a list of objects to the DynamoDB list format. Each item in the list is also converted to its appropriate DynamoDB format. It's opinionated about how it represents some of the nested objects: e.g., it will use lists ("L") rather than sets ("SS", "NS", "BS"). This returns an object that describes the DynamoDB attribute value.  

```
Input:      util.dynamodb.toList([ "foo", 123, { "bar" : "baz" } ])
Output:     {
               "L" : [
                   { "S" : "foo" },
                   { "N" : 123 },
                   {
                       "M" : {
                           "bar" : { "S" : "baz" }
                       }
                   }
               ]
           }
```

## toMap utils
<a name="utility-helpers-in-toMap-js"></a>

### toMap utils list
<a name="utility-helpers-in-toMap-list-js"></a>

 **`util.dynamodb.toMap(Map)`**  
Converts a map to the DynamoDB map format. Each value in the map is also converted to its appropriate DynamoDB format. It's opinionated about how it represents some of the nested objects: e.g., it will use lists ("L") rather than sets ("SS", "NS", "BS"). This returns an object that describes the DynamoDB attribute value.  

```
Input:      util.dynamodb.toMap({ "foo": "bar", "baz" : 1234, "beep": [ "boop"] })
Output:     {
               "M" : {
                   "foo"  : { "S" : "bar" },
                   "baz"  : { "N" : 1234 },
                   "beep" : {
                       "L" : [
                           { "S" : "boop" }
                       ]
                   }
               }
           }
```

 **`util.dynamodb.toMapValues(Map)`**  
Creates a copy of the map where each value has been converted to its appropriate DynamoDB format. It's opinionated about how it represents some of the nested objects: e.g., it will use lists ("L") rather than sets ("SS", "NS", "BS").  

```
Input:      util.dynamodb.toMapValues({ "foo": "bar", "baz" : 1234, "beep": [ "boop"] })
Output:     {
               "foo"  : { "S" : "bar" },
               "baz"  : { "N" : 1234 },
               "beep" : {
                   "L" : [
                       { "S" : "boop" }
                   ]
               }
           }
```
This is slightly different from `util.dynamodb.toMap(Map)` as it returns only the contents of the DynamoDB attribute value, but not the whole attribute value itself. For example, the following statements are exactly the same:  

```
util.dynamodb.toMapValues(<map>)
util.dynamodb.toMap(<map>)["M"]
```

## S3Object utils
<a name="utility-helpers-in-S3Object-js"></a>

### S3Object utils list
<a name="utility-helpers-in-S3Object-list-js"></a>

**`util.dynamodb.toS3Object(String key, String bucket, String region)`**  
Converts the key, bucket and region into the DynamoDB S3 Object representation. This returns an object that describes the DynamoDB attribute value.  

```
Input:      util.dynamodb.toS3Object("foo", "bar", "baz")
Output:     { "S" : "{ \"s3\" : { \"key\" : \"foo\", \"bucket\" : \"bar\", \"region\" : \"baz\" } }" }
```

**`util.dynamodb.toS3Object(String key, String bucket, String region, String version)`**  
Converts the key, bucket, region and optional version into the DynamoDB S3 Object representation. This returns an object that describes the DynamoDB attribute value.  

```
Input:      util.dynamodb.toS3Object("foo", "bar", "baz", "beep")
Output:     { "S" : "{ \"s3\" : { \"key\" : \"foo\", \"bucket\" : \"bar\", \"region\" : \"baz\", \"version\" : \"beep\" } }" }
```

 **`util.dynamodb.fromS3ObjectJson(String)`**  
Accepts the string value of a DynamoDB S3 Object and returns a map that contains the key, bucket, region and optional version.  

```
Input:      util.dynamodb.fromS3ObjectJson({ "S" : "{ \"s3\" : { \"key\" : \"foo\", \"bucket\" : \"bar\", \"region\" : \"baz\", \"version\" : \"beep\" } }" })
Output:     { "key" : "foo", "bucket" : "bar", "region" : "baz", "version" : "beep" }
```
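
Conceptually, `fromS3ObjectJson` unwraps the `S` attribute value and parses the embedded JSON document. The following is a hypothetical sketch of that behavior (the `stored` value is an illustrative example, and version handling is omitted):

```javascript
// Illustrative sketch: unwrap a DynamoDB S3 Object attribute value and
// return its key, bucket, and region fields. Not the runtime implementation.
function fromS3ObjectJson(attributeValue) {
  return JSON.parse(attributeValue.S).s3;
}

// A stored S3 Object attribute value is a JSON document inside an "S" string:
const stored = { S: JSON.stringify({ s3: { key: 'foo', bucket: 'bar', region: 'baz' } }) };
// fromS3ObjectJson(stored) → { key: 'foo', bucket: 'bar', region: 'baz' }
```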

# HTTP helpers in util.http
<a name="http-helpers-in-utils-http-js"></a>

The `util.http` utility provides helper methods that you can use to manage HTTP request parameters and to add response headers.

## util.http utils list
<a name="http-helpers-in-utils-http-list-js"></a>

 **`util.http.copyHeaders(headers)`**  
Copies the headers from the map, excluding the following restricted HTTP headers:  
+ transfer-encoding
+ connection
+ host
+ expect
+ keep-alive
+ upgrade
+ proxy-authenticate
+ proxy-authorization
+ te
+ content-length

**`util.http.addResponseHeader(String, Object)`**  
Adds a single custom header with the name (`String`) and value (`Object`) of the response. The following limitations apply:  
+ In addition to the list of restricted headers for `copyHeaders(headers)`, header names cannot match any of the following:
  + Access-Control-Allow-Credentials
  + Access-Control-Allow-Origin
  + Access-Control-Expose-Headers
  + Access-Control-Max-Age
  + Access-Control-Allow-Methods
  + Access-Control-Allow-Headers
  + Vary
  + Content-Type
+ Header names can't start with the restricted prefixes `x-amzn-` or `x-amz-`.
+ The size of custom response headers can't exceed 4 KB. This includes header names and values.
+ You should define each response header once per GraphQL operation. However, if you define a custom header with the same name multiple times, the most recent definition appears in the response. All headers count towards the header size limit regardless of naming.
+ Headers with an empty or restricted name `(String)` or a null value `(Object)` will be ignored and yield a `ResponseHeaderError` error that is added to the operation's `errors` output.

```
export function request(ctx) {
  util.http.addResponseHeader('itemsCount', 7)
  util.http.addResponseHeader('render', ctx.args.render)
  return {}
}
```

**`util.http.addResponseHeaders(Map)`**  
Adds multiple response headers to the response from the specified map of names `(String)` and values `(Object)`. The same limitations listed for the `addResponseHeader(String, Object)` method also apply to this method.  

```
export function request(ctx) {
  const headers = {
    headerInt: 12,
    headerString: 'stringValue',
    headerObject: {
      field1: 7,
      field2: 'string'
    }
  }
  util.http.addResponseHeaders(headers)
  return {}
}
```

# Transformation helpers in util.transform
<a name="transformation-helpers-in-utils-transform-js"></a>

`util.transform` contains helper methods that make it easier to perform complex operations against data sources.

## Transformation helpers utils list
<a name="transformation-helpers-in-utils-transform-js-list"></a>

**`util.transform.toDynamoDBFilterExpression(filterObject: DynamoDBFilterObject) : string`**  
Converts an input `DynamoDBFilterObject` to a filter expression for use with DynamoDB. We recommend using `toDynamoDBFilterExpression` with [built-in module functions](https://docs.aws.amazon.com/appsync/latest/devguide/built-in-modules-js.html).

**`util.transform.toElasticsearchQueryDSL(object: OpenSearchQueryObject) : string`**  
Converts the given input into its equivalent OpenSearch Query DSL expression, returning it as a JSON string.  
**Example input:**  

```
util.transform.toElasticsearchQueryDSL({
    "upvotes":{
        "ne":15,
        "range":[
            10,
            20
        ]
    },
    "title":{
        "eq":"hihihi",
        "wildcard":"h*i"
    }
  })
```
**Example output:**  

```
{
    "bool":{
      "must":[
          {
            "bool":{
              "must":[
                  {
                    "bool":{
                      "must_not":{
                        "term":{
                          "upvotes":15
                        }
                      }
                    }
                  },
                  {
                    "range":{
                      "upvotes":{
                        "gte":10,
                        "lte":20
                      }
                    }
                  }
              ]
            }
          },
          {
            "bool":{
              "must":[
                  {
                    "term":{
                      "title":"hihihi"
                    }
                  },
                  {
                  "wildcard":{
                      "title":"h*i"
                    }
                  }
              ]
            }
          }
      ]
    }
}
```
The default operator is assumed to be AND.

**`util.transform.toSubscriptionFilter(objFilter, ignoredFields?, rules?): SubscriptionFilter`**  
Converts a `Map` input object to a `SubscriptionFilter` expression object. The `util.transform.toSubscriptionFilter` method is used as an input to the `extensions.setSubscriptionFilter()` extension. For more information, see [Extensions](https://docs.aws.amazon.com/appsync/latest/devguide/extensions-js.html).  
The parameters and return value are listed below:  
*Parameters*  
+ `objFilter`: `SubscriptionFilterObject`

  A `Map` input object that's converted to the `SubscriptionFilter` expression object.
+ `ignoredFields`: `SubscriptionFilterExcludeKeysType` (optional)

  A `List` of field names in the first object that will be ignored.
+ `rules`: `SubscriptionFilterRuleObject` (optional)

  A `Map` input object with strict rules that's included when you're constructing the `SubscriptionFilter` expression object. These strict rules will be included in the `SubscriptionFilter` expression object so that at least one of the rules will be satisfied to pass the subscription filter.
*Response*  
Returns a [SubscriptionFilter](https://docs.aws.amazon.com/appsync/latest/devguide/extensions-js.html).

**`util.transform.toSubscriptionFilter(Map, List)`**  
Converts a `Map` input object to a `SubscriptionFilter` expression object. The `util.transform.toSubscriptionFilter` method is used as an input to the `extensions.setSubscriptionFilter()` extension. For more information, see [Extensions](https://docs.aws.amazon.com/appsync/latest/devguide/extensions-js.html).  
The first argument is the `Map` input object that's converted to the `SubscriptionFilter` expression object. The second argument is a `List` of field names that are ignored in the first `Map` input object while constructing the `SubscriptionFilter` expression object.

**`util.transform.toSubscriptionFilter(Map, List, Map)`**  
Converts a `Map` input object to a `SubscriptionFilter` expression object. The `util.transform.toSubscriptionFilter` method is used as an input to the `extensions.setSubscriptionFilter()` extension. For more information, see [Extensions](https://docs.aws.amazon.com/appsync/latest/devguide/extensions-js.html).  
The first argument is the `Map` input object that's converted to the `SubscriptionFilter` expression object, the second argument is a `List` of field names to ignore in the first `Map` input object, and the third argument is a `Map` of strict rules to include in the `SubscriptionFilter` expression object.

**`util.transform.toDynamoDBConditionExpression(conditionObject)`**  
Creates a DynamoDB condition expression.

## Subscription filter arguments
<a name="subscription-filter-arguments-js"></a>

The following sections explain how the arguments of the following utility are defined:
+ `util.transform.toSubscriptionFilter(objFilter, ignoredFields?, rules?): SubscriptionFilter`

------
#### [ Argument 1: Map ]

Argument 1 is a `Map` object with the following key values:
+ field names
+ "and"
+ "or"

For field names as keys, the conditions on these fields' entries are in the form of `"operator" : "value"`. 

The following example shows how entries can be added to the `Map`:

```
"field_name" : {
    "operator1" : value
}

// Multiple conditions are allowed for the same field_name:

"field_name" : {
    "operator1" : value,
    "operator2" : value,
    ...
}
```

When a field has two or more conditions on it, these conditions are combined using the OR operation.

The input `Map` can also have `"and"` and `"or"` as keys, implying that the entries within them are joined using AND or OR logic, depending on the key. The `"and"` and `"or"` keys expect an array of conditions.

```
"and" : [
    {
        "field_name1" : {
            "operator1" : value
        }
    },
    {
        "field_name2" : {
            "operator1" : value
        }
    },
    ...
]
```

Note that you can nest `"and"` and `"or"`. That is, you can have a nested `"and"`/`"or"` block within another `"and"`/`"or"` block. However, nesting is not allowed under a simple field-name key.

```
"and" : [
    {
        "field_name1" : {
            "operator" : value
        }
    },
    {
        "or" : [
            {
                "field_name2" : {
                    "operator" : value
                }
            },
            {
                "field_name3" : {
                    "operator" : value
                }
            }
        ]
    }
]
```

The following example shows an input of *argument 1* using `util.transform.toSubscriptionFilter(Map) : Map`.

**Input(s)**

Argument 1: Map:

```
{
  "percentageUp": {
    "lte": 50,
    "gte": 20
  },
  "and": [
    {
      "title": {
        "ne": "Book1"
      }
    },
    {
      "downvotes": {
        "gt": 2000
      }
    }
  ],
  "or": [
    {
      "author": {
        "eq": "Admin"
      }
    },
    {
      "isPublished": {
        "eq": false
      }
    }
  ]
}
```

**Output**

The result is a `Map` object:

```
{
  "filterGroup": [
    {
      "filters": [
        {
          "fieldName": "percentageUp",
          "operator": "lte",
          "value": 50
        },
        {
          "fieldName": "title",
          "operator": "ne",
          "value": "Book1"
        },
        {
          "fieldName": "downvotes",
          "operator": "gt",
          "value": 2000
        },
        {
          "fieldName": "author",
          "operator": "eq",
          "value": "Admin"
        }
      ]
    },
    {
      "filters": [
        {
          "fieldName": "percentageUp",
          "operator": "lte",
          "value": 50
        },
        {
          "fieldName": "title",
          "operator": "ne",
          "value": "Book1"
        },
        {
          "fieldName": "downvotes",
          "operator": "gt",
          "value": 2000
        },
        {
          "fieldName": "isPublished",
          "operator": "eq",
          "value": false
        }
      ]
    },
    {
      "filters": [
        {
          "fieldName": "percentageUp",
          "operator": "gte",
          "value": 20
        },
        {
          "fieldName": "title",
          "operator": "ne",
          "value": "Book1"
        },
        {
          "fieldName": "downvotes",
          "operator": "gt",
          "value": 2000
        },
        {
          "fieldName": "author",
          "operator": "eq",
          "value": "Admin"
        }
      ]
    },
    {
      "filters": [
        {
          "fieldName": "percentageUp",
          "operator": "gte",
          "value": 20
        },
        {
          "fieldName": "title",
          "operator": "ne",
          "value": "Book1"
        },
        {
          "fieldName": "downvotes",
          "operator": "gt",
          "value": 2000
        },
        {
          "fieldName": "isPublished",
          "operator": "eq",
          "value": false
        }
      ]
    }
  ]
}
```
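Why four groups? Each set of OR alternatives multiplies the number of groups: `percentageUp` contributes two alternatives (`lte`, `gte`) and the top-level `"or"` contributes two more, while the `"and"` filters appear in every group. As a rough, hypothetical sketch (not AppSync's actual implementation, and the order of filters within a group may differ), the expansion can be modeled as a Cartesian product:

```javascript
// Cross-product expansion sketch: fixed "and" filters plus one pick from
// each set of OR alternatives per group.
function cartesian(groups) {
  return groups.reduce(
    (acc, options) => acc.flatMap((combo) => options.map((opt) => [...combo, opt])),
    [[]]
  );
}

// Filters that appear in every group (from the "and" clause).
const andFilters = [
  { fieldName: 'title', operator: 'ne', value: 'Book1' },
  { fieldName: 'downvotes', operator: 'gt', value: 2000 },
];

// Each inner array is one set of OR alternatives: the two operators on
// "percentageUp", and the two entries of the top-level "or".
const alternatives = [
  [
    { fieldName: 'percentageUp', operator: 'lte', value: 50 },
    { fieldName: 'percentageUp', operator: 'gte', value: 20 },
  ],
  [
    { fieldName: 'author', operator: 'eq', value: 'Admin' },
    { fieldName: 'isPublished', operator: 'eq', value: false },
  ],
];

const filterGroup = cartesian(alternatives).map((picked) => ({
  filters: [...andFilters, ...picked],
}));
console.log(filterGroup.length); // 4, matching the four groups above
```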

------
#### [ Argument 2: List ]

Argument 2 contains a `List` of field names that shouldn't be considered in the input `Map` (argument 1) while constructing the `SubscriptionFilter` expression object. The `List` can also be empty.

The following example shows the inputs of argument 1 and argument 2 using `util.transform.toSubscriptionFilter(Map, List) : Map`.

**Input(s)**

Argument 1: Map:

```
{
  "percentageUp": {
    "lte": 50,
    "gte": 20
  },
  "and": [
    {
      "title": {
        "ne": "Book1"
      }
    },
    {
      "downvotes": {
        "gt": 20
      }
    }
  ],
  "or": [
    {
      "author": {
        "eq": "Admin"
      }
    },
    {
      "isPublished": {
        "eq": false
      }
    }
  ]
}
```

Argument 2: List:

```
["percentageUp", "author"]
```

**Output**

The result is a `Map` object:

```
{
  "filterGroup": [
    {
      "filters": [
        {
          "fieldName": "title",
          "operator": "ne",
          "value": "Book1"
        },
        {
          "fieldName": "downvotes",
          "operator": "gt",
          "value": 20
        },
        {
          "fieldName": "isPublished",
          "operator": "eq",
          "value": false
        }
      ]
    }
  ]
}
```
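Ignoring `percentageUp` and `author` removes both of the alternatives they contributed, so the expansion collapses to a single group. As a hypothetical sketch (not the actual implementation), the ignored fields can be thought of as being filtered out before the OR expansion:

```javascript
// Ignored fields are dropped before the OR expansion, so fewer alternatives
// survive and fewer groups are produced.
const ignored = ['percentageUp', 'author'];
const notIgnored = (f) => !ignored.includes(f.fieldName);

const andFilters = [
  { fieldName: 'title', operator: 'ne', value: 'Book1' },
  { fieldName: 'downvotes', operator: 'gt', value: 20 },
].filter(notIgnored);

// Of the top-level "or" alternatives, only "isPublished" survives;
// the "percentageUp" alternatives are removed entirely.
const orAlternatives = [
  { fieldName: 'author', operator: 'eq', value: 'Admin' },
  { fieldName: 'isPublished', operator: 'eq', value: false },
].filter(notIgnored);

const filterGroup = orAlternatives.map((alt) => ({ filters: [...andFilters, alt] }));
console.log(filterGroup.length); // 1 group: title, downvotes, isPublished
```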

------
#### [ Argument 3: Map ]

Argument 3 is a `Map` object whose keys are field names (it cannot contain `"and"` or `"or"`). For each field-name key, the condition is an entry in the form of `"operator" : "value"`. Unlike argument 1, argument 3 cannot have multiple conditions under the same key. In addition, because argument 3 has no `"and"` or `"or"` clause, no nesting is involved.

Argument 3 represents a list of strict rules, which are added to the `SubscriptionFilter` expression object so that **at least one** of these conditions is met to pass the filter.

```
{
  "fieldname1": {
    "operator": value
  },
  "fieldname2": {
    "operator": value
  },
  ...
}
```

The following example shows the inputs of *argument 1*, *argument 2*, and *argument 3* using `util.transform.toSubscriptionFilter(Map, List, Map) : Map`.

**Input(s)**

Argument 1: Map:

```
{
  "percentageUp": {
    "lte": 50,
    "gte": 20
  },
  "and": [
    {
      "title": {
        "ne": "Book1"
      }
    },
    {
      "downvotes": {
        "gt": 20
      }
    }
  ],
  "or": [
    {
      "author": {
        "eq": "Admin"
      }
    },
    {
      "isPublished": {
        "eq": false
      }
    }
  ]
}
```

Argument 2: List:

```
["percentageUp", "author"]
```

Argument 3: Map:

```
{
  "upvotes": {
    "gte": 250
  },
  "author": {
    "eq": "Person1"
  }
}
```

**Output**

The result is a `Map` object:

```
{
  "filterGroup": [
    {
      "filters": [
        {
          "fieldName": "title",
          "operator": "ne",
          "value": "Book1"
        },
        {
          "fieldName": "downvotes",
          "operator": "gt",
          "value": 20
        },
        {
          "fieldName": "isPublished",
          "operator": "eq",
          "value": false
        },
        {
          "fieldName": "upvotes",
          "operator": "gte",
          "value": 250
        }
      ]
    },
    {
      "filters": [
        {
          "fieldName": "title",
          "operator": "ne",
          "value": "Book1"
        },
        {
          "fieldName": "downvotes",
          "operator": "gt",
          "value": 20
        },
        {
          "fieldName": "isPublished",
          "operator": "eq",
          "value": false
        },
        {
          "fieldName": "author",
          "operator": "eq",
          "value": "Person1"
        }
      ]
    }
  ]
}
```
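Note how the two strict rules map onto the two groups: each group carries all of the base filters plus exactly one rule, so at least one rule must hold for a notification to pass. As a hypothetical sketch (not the actual implementation):

```javascript
// Each strict rule produces its own group, with the rule appended to the
// base filters, so the rules are OR'd across groups.
const baseFilters = [
  { fieldName: 'title', operator: 'ne', value: 'Book1' },
  { fieldName: 'downvotes', operator: 'gt', value: 20 },
  { fieldName: 'isPublished', operator: 'eq', value: false },
];
const rules = [
  { fieldName: 'upvotes', operator: 'gte', value: 250 },
  { fieldName: 'author', operator: 'eq', value: 'Person1' },
];
const filterGroup = rules.map((rule) => ({ filters: [...baseFilters, rule] }));
console.log(filterGroup.length); // 2 groups, each with 4 filters
```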

------

# String helpers in util.str
<a name="str-helpers-in-util-str-js"></a>

 `util.str` contains methods to help with common String operations. 

## util.str utils list
<a name="str-helpers-in-util-str-list-js"></a>

 **`util.str.normalize(String, String)`**  
Normalizes a string using one of the four Unicode normalization forms: NFC, NFD, NFKC, or NFKD. The first argument is the string to normalize. The second argument is one of `"nfc"`, `"nfd"`, `"nfkc"`, or `"nfkd"`, specifying the normalization form to use.
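`util.str.normalize` isn't available outside of AppSync, but standard JavaScript's `String.prototype.normalize` accepts the same four forms (upper-cased), which makes the behavior easy to explore locally:

```javascript
// Standard JavaScript analog of the four normalization forms.
const decomposed = 'e\u0301';                          // 'e' + combining acute accent
console.log(decomposed.length);                        // 2 code points
console.log(decomposed.normalize('NFC') === '\u00e9'); // true: one precomposed character
console.log('\ufb01'.normalize('NFKC'));               // the 'fi' ligature becomes plain 'fi'
```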

# Extensions
<a name="extensions-js"></a>

`extensions` contains a set of methods for performing additional actions within your resolvers.

## Caching extensions
<a name="caching-extensions-js-list"></a>

**`extensions.evictFromApiCache(typeName: string, fieldName: string, keyValuePair: Record<string, any>) : Object`**  
Evicts an item from the AWS AppSync server-side cache. The first argument is the type name. The second argument is the field name. The third argument is an object containing key-value pair items that specify the caching key value. You must put the items in the object in the same order as the caching keys in the cached resolver's `cachingKey`. For more information about caching, see [Caching behavior](https://docs.aws.amazon.com/appsync/latest/devguide/enabling-caching.html#caching-behavior).  
**Example 1:**  
This example evicts the items that were cached for a resolver called `Query.allClasses` on which a caching key called `context.arguments.semester` was used. When the mutation is called and the resolver runs, if an entry is successfully cleared, then the response contains an `apiCacheEntriesDeleted` value in the extensions object that shows how many entries were deleted.  

```
import { util, extensions } from '@aws-appsync/utils';

export const request = (ctx) => ({ payload: null });

export function response(ctx) {
	extensions.evictFromApiCache('Query', 'allClasses', {
		'context.arguments.semester': ctx.args.semester,
	});
	return null;
}
```
This function **only** works for mutations, not queries.

## Subscription extensions
<a name="subscription-extensions-js-list"></a>

**`extensions.setSubscriptionFilter(filterJsonObject)`**  
Defines enhanced subscription filters. Each subscription notification event is evaluated against the provided subscription filters, and a notification is delivered to clients only if all filters evaluate to `true`. The argument is a `filterJsonObject`, described in the *Argument: filterJsonObject* section below. For more information, see [Enhanced subscription filtering](https://docs.aws.amazon.com/appsync/latest/devguide/aws-appsync-real-time-enhanced-filtering.html).  
You can use this extension function only in the response handler of a subscription resolver. Also, we recommend using `util.transform.toSubscriptionFilter` to create your filter.

**`extensions.setSubscriptionInvalidationFilter(filterJsonObject)`**  
Defines subscription invalidation filters. Subscription filters are evaluated against the invalidation payload, and a given subscription is invalidated if the filters evaluate to `true`. The argument is a `filterJsonObject`, described in the *Argument: filterJsonObject* section below. For more information, see [Enhanced subscription filtering](https://docs.aws.amazon.com/appsync/latest/devguide/aws-appsync-real-time-enhanced-filtering.html).  
You can use this extension function only in the response handler of a subscription resolver. Also, we recommend using `util.transform.toSubscriptionFilter` to create your filter.

**`extensions.invalidateSubscriptions(invalidationJsonObject)`**  
Initiates a subscription invalidation from a mutation. The argument is an `invalidationJsonObject`, described in the *Argument: invalidationJsonObject* section below.  
This extension can be used only in the response handlers of mutation resolvers.  
You can use at most five unique `extensions.invalidateSubscriptions()` method calls in any single request. If you exceed this limit, you receive a GraphQL error.

## Argument: filterJsonObject
<a name="extensions-filterJsonObject-js"></a>

The JSON object defines either subscription or invalidation filters. It contains a `filterGroup`, which is an array of filter objects; each filter object holds a `filters` list of individual filters.

```
{
    "filterGroup": [
        {
           "filters" : [
                 {
                    "fieldName" : "userId",
                    "operator" : "eq",
                    "value" : 1
                }
           ]
           
        },
        {
           "filters" : [
                {
                    "fieldName" : "group",
                    "operator" : "in",
                    "value" : ["Admin", "Developer"]
                }
           ]
           
        }
    ]
}
```

Each filter has three attributes: 
+ `fieldName` – The GraphQL schema field.
+ `operator` – The operator type.
+ `value` – The values to compare to the subscription notification `fieldName` value.

The following is an example assignment of these attributes:

```
{
 "fieldName" : "severity",
 "operator" : "le",
 "value" : context.result.severity
}
```
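To make the semantics concrete: a notification passes when at least one group's filters all match (OR across groups, AND within a group). The following is a minimal, hypothetical evaluator sketch (AppSync supports more operators than the three shown here):

```javascript
// Minimal operator table; "ops" and "passes" are illustrative names, not
// part of the AppSync API.
const ops = {
  eq: (a, b) => a === b,
  in: (a, b) => b.includes(a),
  le: (a, b) => a <= b,
};

// OR across groups, AND within a group's filters.
function passes(filterGroup, event) {
  return filterGroup.some((group) =>
    group.filters.every(({ fieldName, operator, value }) =>
      ops[operator](event[fieldName], value)
    )
  );
}

const filterGroup = [
  { filters: [{ fieldName: 'userId', operator: 'eq', value: 1 }] },
  { filters: [{ fieldName: 'group', operator: 'in', value: ['Admin', 'Developer'] }] },
];
console.log(passes(filterGroup, { userId: 7, group: 'Admin' })); // true
console.log(passes(filterGroup, { userId: 7, group: 'Guest' })); // false
```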

## Argument: invalidationJsonObject
<a name="extensions-invalidationJsonObject-js"></a>

The `invalidationJsonObject` defines the following:
+ `subscriptionField` – The GraphQL schema subscription to invalidate. A single subscription, defined as a string in the `subscriptionField`, is considered for invalidation.
+ `payload` – A key-value pair list that's used as the input for invalidating subscriptions if the invalidation filter evaluates to `true` against their values.

  The following example invalidates subscribed and connected clients using the `onUserDelete` subscription when the invalidation filter defined in the subscription resolver evaluates to `true` against the `payload` value.

  ```
  import { extensions } from '@aws-appsync/utils';
  
  export const request = (ctx) => ({ payload: null });
  
  export function response(ctx) {
  	extensions.invalidateSubscriptions({
  		subscriptionField: 'onUserDelete',
  		payload: { group: 'Developer', type: 'Full-Time' },
  	});
  	return ctx.result;
  }
  ```

# XML helpers in util.xml
<a name="xml-helpers-in-util-xml-js"></a>

 `util.xml` contains methods to help with XML string conversion. 

## util.xml utils list
<a name="xml-helpers-in-util-xml-list-js"></a>

 **`util.xml.toMap(String) : Object`**  
Converts an XML string to a dictionary.  
**Example 1:**  

```
Input:

<?xml version="1.0" encoding="UTF-8"?>
<posts>
<post>
    <id>1</id>
    <title>Getting started with GraphQL</title>
</post>
</posts>

Output (JavaScript object):

{
    "posts":{
      "post":{
        "id":1,
        "title":"Getting started with GraphQL"
      }
    }
}
```
**Example 2:**  

```
Input:

<?xml version="1.0" encoding="UTF-8"?>
<posts>
<post>
  <id>1</id>
  <title>Getting started with GraphQL</title>
</post>
<post>
  <id>2</id>
  <title>Getting started with AppSync</title>
</post>
</posts>

Output (JavaScript object):

{
    "posts":{
    "post":[
        {
            "id":1,
            "title":"Getting started with GraphQL"
        },
        {
            "id":2,
            "title":"Getting started with AppSync"
        }
    ]
    }
}
```

**`util.xml.toJsonString(String, Boolean?) : String`**  
Converts an XML string to a JSON string. This is similar to `toMap`, except that the output is a string. This is useful if you want to directly convert the XML response from an HTTP object to JSON and return it. You can set an optional boolean parameter to determine whether to string-encode the resulting JSON.
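The effect of the string-encode flag can be approximated outside AppSync (this is an assumption about the flag's behavior, not a confirmed equivalence) by applying `JSON.stringify` a second time, which escapes the JSON document into a single JSON string literal:

```javascript
const obj = { posts: { post: { id: 1, title: 'Getting started with GraphQL' } } };
const jsonString = JSON.stringify(obj);           // a plain JSON document
const stringEncoded = JSON.stringify(jsonString); // that document escaped as one JSON string
console.log(jsonString[0]);    // '{'
console.log(stringEncoded[0]); // '"'
```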