Amazon Personalize endpoints and quotas
The following sections contain information about Amazon Personalize guidelines, quotas, and endpoints. For adjustable quotas, you can request a quota increase using the Service Quotas console.
Amazon Personalize endpoints and regions
For a list of Amazon Personalize endpoints by region, see AWS regions and endpoints in the Amazon Web Services General Reference.
Compliance
For information about Amazon Personalize compliance programs, see AWS compliance.
Service quotas
Your AWS account has the following quotas for Amazon Personalize.
Resource | Quota |
---|---|
Item interactions | |
Minimum number of unique item interactions required to create a solution version or recommender. For a custom solution, you must have this many records after any filtering by event type or event value before training. | 1000 |
For User-Personalization-v2 and Personalized-Ranking-v2 recipes, the maximum number of item interactions that are considered by a model during training. | 3 billion |
For all domain use cases and custom recipes other than User-Personalization-v2 or Personalized-Ranking-v2, the maximum number of item interactions that are considered by a model during training. | 500 million (adjustable) |
Maximum number of distinct event types combined with total number of optional metadata columns in an Item interactions dataset. | 10 |
Maximum number of metadata columns, excluding reserved fields, in an Item interactions dataset. | 5 |
Maximum number of characters for categorical data and impression values. | 1000 |
Maximum amount of bulk item interactions data per dataset import job with FULL import mode. | 100 GB (increases to 1 TB with any increase to the maximum number of item interactions considered by a model during training)
Maximum amount of bulk item interactions data per dataset import job with INCREMENTAL import mode. | 1 GB |
Minimum number of item interactions records per dataset import job with FULL or INCREMENTAL import mode. | 1000 |
Users | |
Minimum number of unique users in item interactions data, with at minimum 2 item interactions each, required to create a domain recommender or custom solution version. | 25 |
Minimum percentage of total users that must have at minimum 2 item interactions or more before you can create a domain recommender or custom solution version. | 1 percent |
Maximum number of metadata fields for a Users dataset. | 25 |
Maximum number of characters for USER_ID data values. | 256 |
Maximum number of characters for categorical data values. | 1000 characters |
Maximum amount of bulk user data per dataset import job with FULL import mode. | 100 GB |
Maximum amount of bulk user data per dataset import job with INCREMENTAL import mode. | 1 GB |
Items | |
For User-Personalization-v2 or Personalized-Ranking-v2, the maximum number of items that are considered by a model during training. These items are from both the Items and Item interactions dataset. | 5 million |
For all domain use cases and custom recipes other than User-Personalization-v2 and Personalized-Ranking-v2, the maximum number of items that are considered by a model during training and generating recommendations. | 750,000 |
Maximum number of metadata fields for an Items dataset. | 100 |
Maximum number of characters for ITEM_ID data values. | 256 |
Maximum number of characters for categorical and non-categorical string data values. | 1000 characters |
Maximum number of textual fields for an Items dataset. | 1 |
Maximum number of characters for textual data values for Chinese and Japanese languages. | 7,000 characters |
Maximum number of characters for textual data values for all other languages. | 20,000 characters |
Maximum amount of bulk item data per dataset import job with FULL import mode. | 100 GB
Maximum amount of bulk item data per dataset import job with INCREMENTAL import mode. | 1 GB |
Actions | |
Maximum number of actions that are considered by a model during training and generating recommendations. | 1000 |
Maximum number of metadata fields for an Actions dataset. | 10 |
Maximum number of characters for ACTION_ID data values. | 256 |
Maximum number of characters for categorical data values. | 1000 characters |
Maximum amount of bulk actions data per dataset import job with FULL import mode. | 100 GB
Maximum amount of bulk actions data per dataset import job with INCREMENTAL import mode. | 1 GB |
Action interactions | |
Maximum number of action interactions that are considered by a model during training. | 500 million |
Maximum number of metadata columns, excluding reserved fields, in an Action interactions dataset. | 5
Maximum amount of bulk action interactions data per dataset import job with FULL import mode. | 100 GB (increases to 1 TB with any increase to the maximum number of action interactions considered by a model during training)
Maximum amount of bulk action interactions data per dataset import job with INCREMENTAL import mode. | 1 GB
Individual record import APIs | |
Maximum rate of PutEvents requests per dataset group. | 1000/second
Maximum number of events in a PutEvents call (see the batching sketch after this table). | 10
Maximum size of an event. | 10 KB
Maximum rate of PutActionInteractions requests per dataset group. | 1000/second
Maximum number of action interaction events in a PutActionInteractions call. | 10
Maximum size of an action interaction event. | 10 KB
Maximum rate of PutItems requests per dataset group. | 10/second
Maximum number of items in a PutItems call. | 10
Maximum rate of PutUsers requests per dataset group. | 10/second
Maximum number of users in a PutUsers call. | 10
Maximum rate of PutActions requests per dataset group. | 10/second
Maximum number of actions in a PutActions call. | 10
Legacy recipes | |
Maximum amount of combined data for Users and Items datasets for HRNN-metadata and HRNN-Coldstart recipes. | 5 GB |
Maximum number of cold start items the HRNN-Coldstart recipe supports to train a model (create a solution version). | 80,000
Minimum number of cold start items the HRNN-Coldstart recipe requires to train a model (create a solution version). | 100 |
Filters | |
Total number of filters per dataset group. | 30 (adjustable) |
Maximum number of distinct dataset fields for a filter. | 10 |
Total number of distinct dataset fields across all filters in a dataset group. | 20 |
Maximum number of item interactions per user per event type considered by a filter. | 100 interactions (adjustable) |
Maximum number of action interactions per user per event type considered by a filter. | 300 action interactions (adjustable) |
GetRecommendations / GetPersonalizedRanking / GetActionRecommendations requests | |
Maximum transaction rate for GetRecommendations, GetActionRecommendations, and GetPersonalizedRanking requests. | 2500/sec
Maximum number of GetRecommendations requests per second per campaign. | 500/sec
Maximum number of GetActionRecommendations requests per second per campaign. | 500/sec
Maximum number of GetPersonalizedRanking requests per second per campaign. | 500/sec
Maximum number of metadata columns per GetRecommendations or GetPersonalizedRanking request. | 10
Maximum number of recommendation results for a GetRecommendations request without metadata. | 500
Maximum number of recommendation results for a GetRecommendations request with metadata. | 50
Maximum number of items for ranking in a GetPersonalizedRanking request without metadata. | 500
Maximum number of items for ranking in a GetPersonalizedRanking request with metadata. | 50
Metric attribution quotas | |
Maximum number of metrics for a metric attribution. | 10
Maximum number of unique event attribution sources. | 100
Batch inference jobs | |
Maximum number of input files for a batch inference job. | 1000 |
Maximum size of batch inference job input. | 1 GB |
Maximum number of records per input file for a batch inference job without themes. | 50 million |
Maximum number of records per input file for a batch inference job with themes. | 100 |
Batch segment jobs | |
Maximum number of input files for a batch segment job. | 1000 |
Maximum size of batch segment job input. | 1 GB |
Maximum number of queries per input file for Item-Affinity recipe. | 500 |
Maximum number of queries per input file for Item-Attribute-Affinity recipe. | 10 |
Maximum number of users per segment | 5 million |
Data deletion jobs | |
Maximum number of data deletion jobs for a dataset group with a status of PENDING. | 5 (adjustable) |
Maximum total size of your data deletion input file or files | 100 MB |
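
The per-call limits in the individual record import rows above are easy to hit when streaming interaction data. The following sketch, using the AWS SDK for Python (boto3), batches item interaction events into groups of at most 10 per PutEvents call; the tracking ID, event type, user, session, and item IDs are placeholders, not values defined by this guide.

```python
import datetime

import boto3

personalize_events = boto3.client("personalize-events")

TRACKING_ID = "your-event-tracker-tracking-id"  # placeholder tracking ID
MAX_EVENTS_PER_CALL = 10  # PutEvents per-call quota from the table above


def send_interactions(user_id, session_id, item_ids):
    """Send item interaction events in chunks of at most 10 per PutEvents call."""
    events = [
        {
            "eventType": "Watch",  # example event type; use one from your schema
            "itemId": item_id,
            "sentAt": datetime.datetime.now(datetime.timezone.utc),
        }
        for item_id in item_ids
    ]
    for start in range(0, len(events), MAX_EVENTS_PER_CALL):
        personalize_events.put_events(
            trackingId=TRACKING_ID,
            userId=user_id,
            sessionId=session_id,
            eventList=events[start : start + MAX_EVENTS_PER_CALL],
        )


send_interactions("user-1", "session-1", ["item-1", "item-2", "item-3"])
```

Each individual event must also stay under the 10 KB event size quota; chunking only addresses the per-call event count.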
Your AWS account has the following quotas for each region.
Resource | Quota |
---|---|
Total number of active schemas. | 500 |
Total number of active dataset groups. | 5 (adjustable) |
Total number of pending or in progress dataset import jobs. | 5 |
Total number of pending or in progress batch inference jobs. | 5 (adjustable) |
Total number of pending or in progress batch segment jobs. | 5 |
Total number of pending or in progress solution versions. | 20 (adjustable) |
Each dataset group has the following quotas.
Resource | Quota |
---|---|
Total number of active solutions. | 10 (adjustable) |
Total number of active campaigns. | 5 (adjustable) |
Total number of recommenders. | 5 |
Total number of filters. | 30 (adjustable) |
Total number of distinct dataset fields across all filters. | 20 |
Total number of data deletion jobs for a dataset group with a status of PENDING. | 5 |
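
If you create filters programmatically, one way to avoid a failed CreateFilter call is to check your current count against the per-dataset-group filter quota above. The following is a minimal sketch with boto3, assuming a placeholder dataset group ARN and the default (adjustable) quota of 30.

```python
import boto3

personalize = boto3.client("personalize")

# Placeholder ARN and the default (adjustable) per-dataset-group filter quota.
DATASET_GROUP_ARN = "arn:aws:personalize:us-east-1:111122223333:dataset-group/example"
FILTER_QUOTA = 30


def count_filters(dataset_group_arn):
    """Count filters in a dataset group, following pagination."""
    total = 0
    kwargs = {"datasetGroupArn": dataset_group_arn}
    while True:
        page = personalize.list_filters(**kwargs)
        total += len(page.get("Filters", []))
        token = page.get("nextToken")
        if not token:
            return total
        kwargs["nextToken"] = token


if count_filters(DATASET_GROUP_ARN) >= FILTER_QUOTA:
    print("Filter quota reached; delete an unused filter or request a quota increase.")
```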
Requesting a quota increase
For adjustable quotas, you can request a quota increase using the Service Quotas console. The following quotas are adjustable:

- Maximum number of item interactions that are considered by a model during training
- Active campaigns per dataset group
- Active dataset groups
- Active filters per dataset group
- Active solutions per dataset group
- Amount of data per incremental import
- Maximum number of item interactions per user per event type considered by a filter
- Total number of pending or in progress batch inference jobs
- Total number of pending or in progress solution versions
- Maximum rate of PutEvents or PutActionInteractions requests

To request a quota increase, use the Service Quotas console.
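
If you prefer to script the request rather than use the console, the same adjustable quotas can be raised through the Service Quotas API. The sketch below uses boto3 and assumes the Amazon Personalize service code is personalize; the quota code is a placeholder that you must look up first, and the desired value is only an example.

```python
import boto3

service_quotas = boto3.client("service-quotas")

# List the Amazon Personalize quotas to find the code of the quota you want to
# raise. If a quota does not appear here, try list_aws_default_service_quotas.
kwargs = {"ServiceCode": "personalize"}
while True:
    page = service_quotas.list_service_quotas(**kwargs)
    for quota in page["Quotas"]:
        print(quota["QuotaName"], quota["QuotaCode"], quota["Value"])
    token = page.get("NextToken")
    if not token:
        break
    kwargs["NextToken"] = token

# Request the increase with the quota code you found (placeholder below).
service_quotas.request_service_quota_increase(
    ServiceCode="personalize",
    QuotaCode="L-XXXXXXXX",  # placeholder; copy the code from the listing above
    DesiredValue=60.0,       # example desired value
)
```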