
Firehose supports database as a source in all AWS Regions except China Regions, AWS GovCloud (US) Regions, and Asia Pacific (Malaysia). This feature is in preview and is subject to change. Do not use it for your production workloads.

Monitor Amazon Data Firehose Using CloudWatch Logs

Amazon Data Firehose integrates with Amazon CloudWatch Logs so that you can view the specific error logs when the Lambda invocation for data transformation or data delivery fails. You can enable Amazon Data Firehose error logging when you create your Firehose stream.

If you enable Amazon Data Firehose error logging in the Amazon Data Firehose console, a log group and corresponding log streams are created for the Firehose stream on your behalf. The format of the log group name is /aws/kinesisfirehose/delivery-stream-name, where delivery-stream-name is the name of the corresponding Firehose stream. DestinationDelivery is the log stream that is created and used to log any errors related to the delivery to the primary destination. Another log stream called BackupDelivery is created only if S3 backup is enabled for the destination. The BackupDelivery log stream is used to log any errors related to the delivery to the S3 backup.
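The naming convention above can be sketched as a small helper. This is illustrative only; the console derives these names for you, and the function names here are not part of any AWS SDK:

```python
def firehose_log_group(stream_name):
    """Log group the Firehose console creates for a stream:
    /aws/kinesisfirehose/<stream-name>."""
    return f"/aws/kinesisfirehose/{stream_name}"


def firehose_log_streams(s3_backup_enabled=False):
    """Log streams created for the stream. DestinationDelivery always
    exists; BackupDelivery is created only when S3 backup is enabled."""
    streams = ["DestinationDelivery"]
    if s3_backup_enabled:
        streams.append("BackupDelivery")
    return streams
```

For a stream named MyStream, `firehose_log_group("MyStream")` yields `/aws/kinesisfirehose/MyStream`.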

For example, if you create a Firehose stream "MyStream" with Amazon Redshift as the destination and enable Amazon Data Firehose error logging, the following are created on your behalf: a log group named /aws/kinesisfirehose/MyStream and two log streams named DestinationDelivery and BackupDelivery. In this example, DestinationDelivery is used to log any errors related to the delivery to the Amazon Redshift destination and also to the intermediate S3 destination. BackupDelivery, if S3 backup is enabled, is used to log any errors related to the delivery to the S3 backup bucket.

You can enable Amazon Data Firehose error logging through the AWS CLI, the API, or AWS CloudFormation using the CloudWatchLoggingOptions configuration. To do so, create a log group and a log stream in advance. We recommend reserving that log group and log stream for Amazon Data Firehose error logging exclusively. Also ensure that the associated IAM policy has the "logs:PutLogEvents" permission. For more information, see Controlling access with Amazon Data Firehose.
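As a sketch, the CloudWatchLoggingOptions block and a matching IAM policy statement look like the following. The log group, log stream, Region, and account ID are placeholder values; substitute your own resources:

```python
import json

# Placeholder resource names for illustration only.
LOG_GROUP = "/aws/kinesisfirehose/MyStream"
LOG_STREAM = "DestinationDelivery"

# CloudWatchLoggingOptions as supplied in the destination configuration
# of a CreateDeliveryStream or UpdateDestination request.
cloudwatch_logging_options = {
    "Enabled": True,
    "LogGroupName": LOG_GROUP,
    "LogStreamName": LOG_STREAM,
}

# The Firehose stream's IAM role must be allowed to write the error logs.
policy_statement = {
    "Effect": "Allow",
    "Action": ["logs:PutLogEvents"],
    "Resource": (
        "arn:aws:logs:us-east-1:111122223333:log-group:"
        f"{LOG_GROUP}:log-stream:{LOG_STREAM}"
    ),
}

print(json.dumps(cloudwatch_logging_options))
```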

Note that Amazon Data Firehose does not guarantee that all delivery error logs are sent to CloudWatch Logs. In circumstances where delivery failure rate is high, Amazon Data Firehose samples delivery error logs before sending them to CloudWatch Logs.

There is a nominal charge for error logs sent to CloudWatch Logs. For more information, see Amazon CloudWatch Pricing.

Data delivery errors

The following is a list of data delivery error codes and messages for each Amazon Data Firehose destination. Each error message also describes the proper action to take to fix the issue.

Amazon S3 Data delivery errors

Amazon Data Firehose can send the following Amazon S3-related errors to CloudWatch Logs.

Error Code Error Message and Information
S3.KMS.NotFoundException

"The provided AWS KMS key was not found. If you are using what you believe to be a valid AWS KMS key with the correct role, check if there is a problem with the account to which the AWS KMS key is attached."

S3.KMS.RequestLimitExceeded

"The KMS request per second limit was exceeded while attempting to encrypt S3 objects. Increase the request per second limit."

For more information, see Limits in the AWS Key Management Service Developer Guide.

S3.AccessDenied "Access was denied. Ensure that the trust policy for the provided IAM role allows Amazon Data Firehose to assume the role, and the access policy allows access to the S3 bucket."
S3.AccountProblem "There is a problem with your AWS account that prevents the operation from completing successfully. Contact AWS Support."
S3.AllAccessDisabled "Access to the account provided has been disabled. Contact AWS Support."
S3.InvalidPayer "Access to the account provided has been disabled. Contact AWS Support."
S3.NotSignedUp "The account is not signed up for Amazon S3. Sign the account up or use a different account."
S3.NoSuchBucket "The specified bucket does not exist. Create the bucket or use a different bucket that does exist."
S3.MethodNotAllowed "The specified method is not allowed against this resource. Modify the bucket’s policy to allow the correct Amazon S3 operation permissions."
InternalError "An internal error occurred while attempting to deliver data. Delivery will be retried; if the error persists, then it will be reported to AWS for resolution."
S3.KMS.KeyDisabled "The provided KMS key is disabled. Enable the key or use a different key."
S3.KMS.InvalidStateException "The provided KMS key is in an invalid state. Please use a different key."
KMS.InvalidStateException "The provided KMS key is in an invalid state. Please use a different key."
KMS.DisabledException "The provided KMS key is disabled. Please fix the key or use a different key."
S3.SlowDown "The rate of put requests to the specified bucket was too high. Increase the Firehose stream buffer size or reduce put requests from other applications."
S3.SubscriptionRequired "Access was denied when calling S3. Ensure that the IAM role and the KMS key (if provided) that were passed in have an Amazon S3 subscription."
S3.InvalidToken "The provided token is malformed or otherwise invalid. Please check the credentials provided."
S3.KMS.KeyNotConfigured "KMS key not configured. Configure your KMSMasterKeyID, or disable encryption for your S3 bucket."
S3.KMS.AsymmetricCMKNotSupported "Amazon S3 supports only symmetric CMKs. You cannot use an asymmetric CMK to encrypt your data in Amazon S3. To get the type of your CMK, use the KMS DescribeKey operation."
S3.IllegalLocationConstraintException "Firehose currently uses s3 global endpoint for data delivery to the configured s3 bucket. The region of the configured s3 bucket doesn't support s3 global endpoint. Please create a Firehose stream in the same region as the s3 bucket or use s3 bucket in the region that supports global endpoint."
S3.InvalidPrefixConfigurationException "The custom S3 prefix used for the timestamp evaluation is invalid. Check that your S3 prefix contains valid expressions for the current date and time."
DataFormatConversion.MalformedData "Illegal character found between tokens."
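The S3.InvalidPrefixConfigurationException entry above concerns custom S3 prefixes that embed timestamp expressions. The following is a minimal sketch of how such a prefix expands; it supports only a few Java-style date patterns, while the actual Firehose evaluator supports the full expression syntax:

```python
import re
from datetime import datetime, timezone

# Simplified mapping of Java date patterns to strftime directives.
_JAVA_TO_STRFTIME = {"yyyy": "%Y", "MM": "%m", "dd": "%d", "HH": "%H"}


def render_prefix(prefix, now=None):
    """Expand !{timestamp:...} expressions in a custom S3 prefix
    against the given (or current) UTC time."""
    now = now or datetime.now(timezone.utc)

    def expand(match):
        pattern = match.group(1)
        for java, strf in _JAVA_TO_STRFTIME.items():
            pattern = pattern.replace(java, strf)
        return now.strftime(pattern)

    return re.sub(r"!\{timestamp:([^}]+)\}", expand, prefix)
```

For example, `render_prefix("logs/!{timestamp:yyyy/MM/dd}/", datetime(2024, 1, 2, tzinfo=timezone.utc))` returns `logs/2024/01/02/`.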

Apache Iceberg Tables Data delivery errors

For Apache Iceberg Tables data delivery errors, see Deliver data to Apache Iceberg Tables with Amazon Data Firehose.

Amazon Redshift Data delivery errors

Amazon Data Firehose can send the following Amazon Redshift-related errors to CloudWatch Logs.

Error Code Error Message and Information
Redshift.TableNotFound

"The table to which to load data was not found. Ensure that the specified table exists."

The destination table in Amazon Redshift to which data should be copied from S3 was not found. Note that Amazon Data Firehose does not create the Amazon Redshift table if it does not exist.

Redshift.SyntaxError "The COPY command contains a syntax error. Retry the command."
Redshift.AuthenticationFailed "The provided user name and password failed authentication. Provide a valid user name and password."
Redshift.AccessDenied "Access was denied. Ensure that the trust policy for the provided IAM role allows Amazon Data Firehose to assume the role."
Redshift.S3BucketAccessDenied "The COPY command was unable to access the S3 bucket. Ensure that the access policy for the provided IAM role allows access to the S3 bucket."
Redshift.DataLoadFailed "Loading data into the table failed. Check STL_LOAD_ERRORS system table for details."
Redshift.ColumnNotFound "A column in the COPY command does not exist in the table. Specify a valid column name."
Redshift.DatabaseNotFound "The database specified in the Amazon Redshift destination configuration or JDBC URL was not found. Specify a valid database name."
Redshift.IncorrectCopyOptions

"Conflicting or redundant COPY options were provided. Some options are not compatible in certain combinations. Check the COPY command reference for more info."

For more information, see the Amazon Redshift COPY command in the Amazon Redshift Database Developer Guide.

Redshift.MissingColumn "There is a column defined in the table schema as NOT NULL without a DEFAULT value and not included in the column list. Exclude this column, ensure that the loaded data always provides a value for this column, or add a default value to the Amazon Redshift schema for this table."
Redshift.ConnectionFailed "The connection to the specified Amazon Redshift cluster failed. Ensure that security settings allow Amazon Data Firehose connections, that the cluster or database specified in the Amazon Redshift destination configuration or JDBC URL is correct, and that the cluster is available."
Redshift.ColumnMismatch "The number of jsonpaths in the COPY command and the number of columns in the destination table should match. Retry the command."
Redshift.IncorrectOrMissingRegion "Amazon Redshift attempted to use the wrong region endpoint for accessing the S3 bucket. Either specify a correct region value in the COPY command options or ensure that the S3 bucket is in the same region as the Amazon Redshift database."
Redshift.IncorrectJsonPathsFile "The provided jsonpaths file is not in a supported JSON format. Retry the command."
Redshift.MissingS3File "One or more S3 files required by Amazon Redshift have been removed from the S3 bucket. Check the S3 bucket policies to remove any automatic deletion of S3 files."
Redshift.InsufficientPrivilege "The user does not have permissions to load data into the table. Check the Amazon Redshift user permissions for the INSERT privilege."
Redshift.ReadOnlyCluster "The query cannot be executed because the system is in resize mode. Try the query again later."
Redshift.DiskFull "Data could not be loaded because the disk is full. Increase the capacity of the Amazon Redshift cluster or delete unused data to free disk space."
InternalError "An internal error occurred while attempting to deliver data. Delivery will be retried; if the error persists, then it will be reported to AWS for resolution."
Redshift.ArgumentNotSupported "The COPY command contains unsupported options."
Redshift.AnalyzeTableAccessDenied "Access denied. Copy from S3 to Redshift is failing because analyze table can only be done by table or database owner."
Redshift.SchemaNotFound "The schema specified in the DataTableName of Amazon Redshift destination configuration was not found. Specify a valid schema name."
Redshift.ColumnSpecifiedMoreThanOnce "There is a column specified more than once in the column list. Ensure that duplicate columns are removed."
Redshift.ColumnNotNullWithoutDefault "There is a non-null column without DEFAULT that is not included in the column list. Ensure that such columns are included in the column list."
Redshift.IncorrectBucketRegion "Redshift attempted to use a bucket in a different region from the cluster. Please specify a bucket within the same region as the cluster."
Redshift.S3SlowDown "High request rate to S3. Reduce the rate to avoid getting throttled."
Redshift.InvalidCopyOptionForJson "Please use either auto or a valid S3 path for json copyOption."
Redshift.InvalidCopyOptionJSONPathFormat "COPY failed with error \"Invalid JSONPath format. Array index is out of range\". Please rectify the JSONPath expression."
Redshift.InvalidCopyOptionRBACAclNotAllowed "COPY failed with error \"Cannot use RBAC acl framework while permission propagation is not enabled.\""
Redshift.DiskSpaceQuotaExceeded "Transaction aborted because the disk space quota was exceeded. Free up disk space or request an increased quota for the schema(s)."
Redshift.ConnectionsLimitExceeded "Connection limit exceeded for user."
Redshift.SslNotSupported "The connection to the specified Amazon Redshift cluster failed because the server does not support SSL. Please check your cluster settings."
Redshift.HoseNotFound "The hose has been deleted. Please check the status of your hose."
Redshift.Delimiter "The copyOptions delimiter in the copyCommand is invalid. Ensure that it is a single character."
Redshift.QueryCancelled "The user has canceled the COPY operation."
Redshift.CompressionMismatch "Hose is configured with UNCOMPRESSED, but copyOption includes a compression format."
Redshift.EncryptionCredentials "The ENCRYPTED option requires credentials in the format: 'aws_iam_role=...;master_symmetric_key=...' or 'aws_access_key_id=...;aws_secret_access_key=...[;token=...];master_symmetric_key=...'"
Redshift.InvalidCopyOptions "Invalid COPY configuration options."
Redshift.InvalidMessageFormat "Copy command contains an invalid character."
Redshift.TransactionIdLimitReached "Transaction ID limit reached."
Redshift.DestinationRemoved "Please verify that the redshift destination exists and is configured correctly in the Firehose configuration."
Redshift.OutOfMemory "The Redshift cluster is running out of memory. Please ensure the cluster has sufficient capacity."
Redshift.CannotForkProcess "The Redshift cluster is running out of memory. Please ensure the cluster has sufficient capacity."
Redshift.SslFailure "The SSL connection closed during the handshake."
Redshift.Resize "The Redshift cluster is resizing. Firehose will not be able to deliver data while the cluster is resizing."
Redshift.ImproperQualifiedName "The qualified name is improper (too many dotted names)."
Redshift.InvalidJsonPathFormat "Invalid JSONPath Format."
Redshift.TooManyConnectionsException "Too many connections to Redshift."
Redshift.PSQLException "PSQLException observed from Redshift."
Redshift.DuplicateSecondsSpecification "Duplicate seconds specification in date/time format."
Redshift.RelationCouldNotBeOpened "Encountered Redshift error, relation could not be opened. Check Redshift logs for the specified DB."
Redshift.TooManyClients "Encountered too many clients exception from Redshift. Revisit max connections to the database if there are multiple producers writing to it simultaneously."

Snowflake Data delivery errors

Firehose can send the following Snowflake-related errors to CloudWatch Logs.

Error Code Error Message and Information
Snowflake.InvalidUrl

"Firehose is unable to connect to Snowflake. Please make sure that Account url is specified correctly in Snowflake destination configuration."

Snowflake.InvalidUser

"Firehose is unable to connect to Snowflake. Please make sure that User is specified correctly in Snowflake destination configuration."

Snowflake.InvalidRole

"The specified snowflake role does not exist or is not authorized. Please make sure that the role is granted to the user specified"

Snowflake.InvalidTable

"The supplied table does not exist or is not authorized"

Snowflake.InvalidSchema

"The supplied schema does not exist or is not authorized"

Snowflake.InvalidDatabase

"The supplied database does not exist or is not authorized"

Snowflake.InvalidPrivateKeyOrPassphrase

"The specified private key or passphrase is not valid. Note that the private key provided should be a valid PEM RSA private key"

Snowflake.MissingColumns

"The insert request is rejected due to missing columns in input payload. Make sure that values are specified for all non-nullable columns"

Snowflake.ExtraColumns

"The insert request is rejected due to extra columns. Columns not present in table shouldn't be specified"

Snowflake.InvalidInput

"Delivery failed due to invalid input format. Make sure that the input payload provided is in the JSON format acceptable"

Snowflake.IncorrectValue

"Delivery failed due to incorrect data type in the input payload. Make sure that the JSON values specified in input payload adhere to the datatype declared in Snowflake table definition"

Splunk Data delivery errors

Amazon Data Firehose can send the following Splunk-related errors to CloudWatch Logs.

Error Code Error Message and Information
Splunk.ProxyWithoutStickySessions

"If you have a proxy (ELB or other) between Amazon Data Firehose and the HEC node, you must enable sticky sessions to support HEC ACKs."

Splunk.DisabledToken "The HEC token is disabled. Enable the token to allow data delivery to Splunk."
Splunk.InvalidToken "The HEC token is invalid. Update Amazon Data Firehose with a valid HEC token."
Splunk.InvalidDataFormat "The data is not formatted correctly. To see how to properly format data for Raw or Event HEC endpoints, see Splunk Event Data."
Splunk.InvalidIndex "The HEC token or input is configured with an invalid index. Check your index configuration and try again."
Splunk.ServerError "Data delivery to Splunk failed due to a server error from the HEC node. Amazon Data Firehose will retry sending the data if the retry duration in your Amazon Data Firehose is greater than 0. If all the retries fail, Amazon Data Firehose backs up the data to Amazon S3."
Splunk.DisabledAck "Indexer acknowledgement is disabled for the HEC token. Enable indexer acknowledgement and try again. For more info, see Enable indexer acknowledgement."
Splunk.AckTimeout "Did not receive an acknowledgement from HEC before the HEC acknowledgement timeout expired. Despite the acknowledgement timeout, it's possible the data was indexed successfully in Splunk. Amazon Data Firehose backs up to Amazon S3 the data for which the acknowledgement timeout expired."
Splunk.MaxRetriesFailed

"Failed to deliver data to Splunk or to receive acknowledgment. Check your HEC health and try again."

Splunk.ConnectionTimeout "The connection to Splunk timed out. This might be a transient error and the request will be retried. Amazon Data Firehose backs up the data to Amazon S3 if all retries fail."
Splunk.InvalidEndpoint "Could not connect to the HEC endpoint. Make sure that the HEC endpoint URL is valid and reachable from Amazon Data Firehose."
Splunk.ConnectionClosed "Unable to send data to Splunk due to a connection failure. This might be a transient error. Increasing the retry duration in your Amazon Data Firehose configuration might guard against such transient failures."
Splunk.SSLUnverified "Could not connect to the HEC endpoint. The host does not match the certificate provided by the peer. Make sure that the certificate and the host are valid."
Splunk.SSLHandshake "Could not connect to the HEC endpoint. Make sure that the certificate and the host are valid."
Splunk.URLNotFound "The requested URL was not found on the Splunk server. Please check the Splunk cluster and make sure it is configured correctly."
Splunk.ServerError.ContentTooLarge "Data delivery to Splunk failed due to a server error with a statusCode: 413, message: the request your client sent was too large. See Splunk docs to configure max_content_length."
Splunk.IndexerBusy "Data delivery to Splunk failed due to a server error from the HEC node. Make sure HEC endpoint or the Elastic Load Balancer is reachable and is healthy."
Splunk.ConnectionRecycled "The connection from Firehose to Splunk has been recycled. Delivery will be retried."
Splunk.AcknowledgementsDisabled "Could not get acknowledgements on POST. Make sure that acknowledgements are enabled on HEC endpoint."
Splunk.InvalidHecResponseCharacter "Invalid characters found in HEC response; make sure to check the service and HEC configuration."

Elasticsearch Data delivery errors

Amazon Data Firehose can send the following Elasticsearch-related errors to CloudWatch Logs.

Error Code Error Message and Information
ES.AccessDenied "Access was denied. Ensure that the provided IAM role associated with Firehose is not deleted."
ES.ResourceNotFound "The specified AWS Elasticsearch domain does not exist."

HTTPS Endpoint Data delivery errors

Amazon Data Firehose can send the following HTTPS endpoint-related errors to CloudWatch Logs. If none of these errors matches the problem that you're experiencing, the default error is the following: "An internal error occurred while attempting to deliver data. Delivery will be retried; if the error persists, then it will be reported to AWS for resolution."

Error Code Error Message and Information
HttpEndpoint.RequestTimeout

The delivery timed out before a response was received and will be retried. If this error persists, contact the AWS Firehose service team.

HttpEndpoint.ResponseTooLarge "The response received from the endpoint is too large. Contact the owner of the endpoint to resolve this issue."
HttpEndpoint.InvalidResponseFromDestination "The response received from the specified endpoint is invalid. Contact the owner of the endpoint to resolve the issue."
HttpEndpoint.DestinationException "The following response was received from the endpoint destination."
HttpEndpoint.ConnectionFailed "Unable to connect to the destination endpoint. Contact the owner of the endpoint to resolve this issue."
HttpEndpoint.ConnectionReset "Unable to maintain connection with the endpoint. Contact the owner of the endpoint to resolve this issue."
HttpEndpoint.ResponseReasonPhraseExceededLimit "The response reason phrase received from the endpoint exceeds the configured limit of 64 characters."
HttpEndpoint.InvalidResponseFromDestination "The response received from the endpoint is invalid. See Troubleshooting HTTP Endpoints in the Firehose documentation for more information. Reason: "
HttpEndpoint.DestinationException "Delivery to the endpoint was unsuccessful. See Troubleshooting HTTP Endpoints in the Firehose documentation for more information. Response received with status code "
HttpEndpoint.InvalidStatusCode "Received an invalid response status code."
HttpEndpoint.SSLHandshakeFailure "Unable to complete an SSL Handshake with the endpoint. Contact the owner of the endpoint to resolve this issue."
HttpEndpoint.SSLFailure "Unable to complete TLS handshake with the endpoint. Contact the owner of the endpoint to resolve this issue."
HttpEndpoint.SSLHandshakeCertificatePathFailure "Unable to complete an SSL Handshake with the endpoint due to invalid certification path. Contact the owner of the endpoint to resolve this issue."
HttpEndpoint.SSLHandshakeCertificatePathValidationFailure "Unable to complete an SSL Handshake with the endpoint due to certification path validation failure. Contact the owner of the endpoint to resolve this issue."
HttpEndpoint.MakeRequestFailure.IllegalUriException "HttpEndpoint request failed due to invalid input in URI. Please make sure all the characters in the input URI are valid."
HttpEndpoint.MakeRequestFailure.IllegalCharacterInHeaderValue "HttpEndpoint request failed due to illegal response error. Illegal character '\n' in header value."
HttpEndpoint.IllegalResponseFailure "HttpEndpoint request failed due to illegal response error. HTTP message must not contain more than one Content-Type header."
HttpEndpoint.IllegalMessageStart "HttpEndpoint request failed due to illegal response error. Illegal HTTP message start. See Troubleshooting HTTP Endpoints in the Firehose documentation for more information."

Amazon OpenSearch Service Data delivery errors

For the OpenSearch Service destination, Amazon Data Firehose sends errors to CloudWatch Logs as they are returned by OpenSearch Service.

In addition to errors that may be returned from OpenSearch clusters, you may encounter the following two errors:

  • Authentication/authorization error occurs during attempt to deliver data to destination OpenSearch Service cluster. This can happen due to any permission issues and/or intermittently when your Amazon Data Firehose target OpenSearch Service domain configuration is modified. Please check the cluster policy and role permissions.

  • Data couldn’t be delivered to destination OpenSearch Service cluster due to authentication/authorization failures. This can happen due to any permission issues and/or intermittently when your Amazon Data Firehose target OpenSearch Service domain configuration is modified. Please check the cluster policy and role permissions.

Error Code Error Message and Information
OS.AccessDenied "Access was denied. Ensure that the trust policy for the provided IAM role allows Firehose to assume the role, and the access policy allows access to the Amazon OpenSearch Service API."
OS.AccessDenied "Access was denied. Ensure that the provided IAM role associated with Firehose is not deleted."
OS.ResourceNotFound "The specified Amazon OpenSearch Service domain does not exist."
OS.RequestTimeout "Request to the Amazon OpenSearch Service cluster or OpenSearch Serverless collection timed out. Ensure that the cluster or collection has sufficient capacity for the current workload."
OS.ConnectionFailed "Trouble connecting to the Amazon OpenSearch Service cluster or OpenSearch Serverless collection. Ensure that the cluster or collection is healthy and reachable."
OS.ConnectionReset "Unable to maintain connection with the Amazon OpenSearch Service cluster or OpenSearch Serverless collection. Contact the owner of the cluster or collection to resolve this issue."
OS.ConnectionReset "Trouble maintaining connection with the Amazon OpenSearch Service cluster or OpenSearch Serverless collection. Ensure that the cluster or collection is healthy and has sufficient capacity for the current workload."
OS.AccessDenied "Access was denied. Ensure that the access policy on the Amazon OpenSearch Service cluster grants access to the configured IAM role."
OS.ValidationException "The OpenSearch cluster returned an ESServiceException. One possible reason is that the cluster has been upgraded to OS 2.x or higher but the Firehose stream still has the TypeName parameter configured. Update the stream configuration by setting TypeName to an empty string, or change the endpoint to a cluster that supports the Type parameter."
OS.ValidationException "Member must satisfy regular expression pattern: [a-z][a-z0-9\\-]+"
OS.JsonParseException "The Amazon OpenSearch Service cluster returned a JsonParseException. Ensure that the data being put is valid."
OS.AmazonOpenSearchServiceParseException "The Amazon OpenSearch Service cluster returned an AmazonOpenSearchServiceParseException. Ensure that the data being put is valid."
OS.ExplicitIndexInBulkNotAllowed "Ensure rest.action.multi.allow_explicit_index is set to true on the Amazon OpenSearch Service cluster."
OS.ClusterError "The Amazon OpenSearch Service cluster or OpenSearch Serverless collection returned an unspecified error."
OS.ClusterBlockException "The cluster returned a ClusterBlockException. It may be overloaded."
OS.InvalidARN "The Amazon OpenSearch Service ARN provided is invalid. Please check your DeliveryStream configuration."
OS.MalformedData "One or more records are malformed. Please ensure that each record is a single valid JSON object and that it does not contain newlines."
OS.InternalError "An internal error occurred when attempting to deliver data. Delivery will be retried; if the error persists, it will be reported to AWS for resolution."
OS.AliasWithMultipleIndicesNotAllowed "Alias has more than one indices associated with it. Ensure that the alias has only one index associated with it."
OS.UnsupportedVersion "Amazon OpenSearch Service 6.0 is not currently supported by Amazon Data Firehose. Contact AWS Support for more information."
OS.CharConversionException "One or more records contained an invalid character."
OS.InvalidDomainNameLength "The domain name length is not within valid OS limits."
OS.VPCDomainNotSupported "Amazon OpenSearch Service domains within VPCs are currently not supported."
OS.ConnectionError "The HTTP server closed the connection unexpectedly. Please verify the health of the Amazon OpenSearch Service cluster or OpenSearch Serverless collection."
OS.LargeFieldData "The Amazon OpenSearch Service cluster aborted the request as it contained a field data larger than allowed."
OS.BadGateway "The Amazon OpenSearch Service cluster or OpenSearch Serverless collection aborted the request with a response: 502 Bad Gateway."
OS.ServiceException "Error received from the Amazon OpenSearch Service cluster or OpenSearch Serverless collection. If the cluster or collection is behind a VPC, ensure network configuration allows connectivity."
OS.GatewayTimeout "Firehose encountered timeout errors when connecting to the Amazon OpenSearch Service cluster or OpenSearch Serverless collection."
OS.MalformedData "Amazon Data Firehose does not support Amazon OpenSearch Service Bulk API commands inside the Firehose record."
OS.ResponseEntryCountMismatch "The response from the Bulk API contained more entries than the number of records sent. Ensure that each record contains only one JSON object and that there are no newlines."
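The record shape behind OS.MalformedData and OS.ResponseEntryCountMismatch can be checked on the producer side before sending. The following is a minimal sketch of that constraint; the function name is illustrative, not part of any AWS SDK:

```python
import json


def is_valid_os_record(record: str) -> bool:
    """A Firehose record destined for OpenSearch must be a single
    valid JSON object and must not contain newlines."""
    if "\n" in record:
        return False
    try:
        return isinstance(json.loads(record), dict)
    except json.JSONDecodeError:
        return False
```

A record such as `{"a": 1}` passes, while two newline-separated objects, a JSON array, or a bare Bulk API command line would be rejected.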

Lambda invocation errors

Amazon Data Firehose can send the following Lambda invocation errors to CloudWatch Logs.

Error Code Error Message and Information
Lambda.AssumeRoleAccessDenied

"Access was denied. Ensure that the trust policy for the provided IAM role allows Amazon Data Firehose to assume the role."

Lambda.InvokeAccessDenied

"Access was denied. Ensure that the access policy allows access to the Lambda function."

Lambda.JsonProcessingException

"There was an error parsing returned records from the Lambda function. Ensure that the returned records follow the status model required by Amazon Data Firehose."

For more information, see Required parameters for data transformation.

Lambda.InvokeLimitExceeded

"The Lambda concurrent execution limit is exceeded. Increase the concurrent execution limit."

For more information, see AWS Lambda Limits in the AWS Lambda Developer Guide.

Lambda.DuplicatedRecordId

"Multiple records were returned with the same record ID. Ensure that the Lambda function returns unique record IDs for each record."

For more information, see Required parameters for data transformation.

Lambda.MissingRecordId

"One or more record IDs were not returned. Ensure that the Lambda function returns all received record IDs."

For more information, see Required parameters for data transformation.
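The record ID requirements behind Lambda.DuplicatedRecordId, Lambda.MissingRecordId, and the result-status check can be validated in a transform before returning. A minimal sketch, assuming the standard transformation output shape (a list of objects with recordId and result keys); the function name is hypothetical:

```python
def check_transform_output(input_ids, output_records):
    """Return the Firehose error code this output would trigger,
    or None if the output is well-formed."""
    seen = set()
    for rec in output_records:
        rid = rec["recordId"]
        # Each record ID may come back only once.
        if rid in seen:
            return "Lambda.DuplicatedRecordId"
        seen.add(rid)
        # Each record must carry a valid result status.
        if rec["result"] not in ("Ok", "Dropped", "ProcessingFailed"):
            return "Lambda.FunctionError"
    # Every input record ID must be present in the output.
    if seen != set(input_ids):
        return "Lambda.MissingRecordId"
    return None
```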

Lambda.ResourceNotFound

"The specified Lambda function does not exist. Use a different function that does exist."

Lambda.InvalidSubnetIDException

"The specified subnet ID in the Lambda function VPC configuration is invalid. Ensure that the subnet ID is valid."

Lambda.InvalidSecurityGroupIDException

"The specified security group ID in the Lambda function VPC configuration is invalid. Ensure that the security group ID is valid."

Lambda.SubnetIPAddressLimitReachedException

"AWS Lambda was not able to set up the VPC access for the Lambda function because one or more configured subnets have no available IP addresses. Increase the IP address limit."

For more information, see Amazon VPC Limits - VPC and Subnets in the Amazon VPC User Guide.

Lambda.ENILimitReachedException

"AWS Lambda was not able to create an Elastic Network Interface (ENI) in the VPC, specified as part of the Lambda function configuration, because the limit for network interfaces has been reached. Increase the network interface limit."

For more information, see Amazon VPC Limits - Network Interfaces in the Amazon VPC User Guide.

Lambda.FunctionTimedOut

The Lambda function invocation timed out. Increase the Timeout setting in the Lambda function. For more information, see Configuring function timeout.

Lambda.FunctionError

This can be due to any of the following errors:

  • Invalid output structure. Check your function and make sure the output is in the required format. Also, make sure the processed records contain a valid result status of Dropped, Ok, or ProcessingFailed.

  • The Lambda function was successfully invoked but it returned an error result.

  • Lambda was unable to decrypt the environment variables because KMS access was denied. Check the function's KMS key settings as well as the key policy. For more information, see Troubleshooting Key Access.

Lambda.FunctionRequestTimedOut

Amazon Data Firehose encountered a Request did not complete before the request timeout configuration error when invoking Lambda. Review the Lambda code to check whether it is expected to run beyond the configured timeout. If so, consider tuning the Lambda configuration settings, including memory and timeout. For more information, see Configuring Lambda function options.

Lambda.TargetServerFailedToRespond

Amazon Data Firehose encountered a Target server failed to respond error when calling the AWS Lambda service.

Lambda.InvalidZipFileException

Amazon Data Firehose encountered InvalidZipFileException when invoking the Lambda function. Check your Lambda function configuration settings and the Lambda code zip file.

Lambda.InternalServerError

"Amazon Data Firehose encountered InternalServerError when calling the AWS Lambda service. Amazon Data Firehose will retry sending data a fixed number of times. You can specify or override the retry options using the CreateDeliveryStream or UpdateDestination APIs. If the error persists, contact AWS Lambda support team.

Lambda.ServiceUnavailable

Amazon Data Firehose encountered ServiceUnavailableException when calling the AWS Lambda service. Amazon Data Firehose will retry sending data a fixed number of times. You can specify or override the retry options using the CreateDeliveryStream or UpdateDestination APIs. If the error persists, contact AWS Lambda support.

Lambda.InvalidSecurityToken

Cannot invoke the Lambda function due to an invalid security token. Cross-partition Lambda invocation is not supported.

Lambda.InvocationFailure

This can be due to any of the following errors:

  • Amazon Data Firehose encountered errors when calling AWS Lambda. The operation will be retried; if the error persists, it will be reported to AWS for resolution.

  • Amazon Data Firehose encountered a KMSInvalidStateException from Lambda. Lambda was unable to decrypt the environment variables because the KMS key used is in an invalid state for Decrypt. Check the Lambda function's KMS key.

  • Amazon Data Firehose encountered an AWSLambdaException from Lambda. Lambda was unable to initialize the provided container image. Verify the image.

  • Amazon Data Firehose encountered timeout errors when calling AWS Lambda. The maximum supported function timeout is 5 minutes. For more information, see Data Transformation Execution Duration.

Lambda.JsonMappingException

There was an error parsing returned records from the Lambda function. Ensure that the data field is base64 encoded.

Kinesis invocation errors

Amazon Data Firehose can send the following Kinesis invocation errors to CloudWatch Logs.

Error Code Error Message and Information
Kinesis.AccessDenied "Access was denied when calling Kinesis. Ensure the access policy on the IAM role used allows access to the appropriate Kinesis APIs."
Kinesis.ResourceNotFound "Firehose failed to read from the stream. If the Firehose is attached with Kinesis Stream, the stream may not exist, or the shard may have been merged or split. If the Firehose is of DirectPut type, the Firehose may not exist any more."
Kinesis.SubscriptionRequired "Access was denied when calling Kinesis. Ensure that the IAM role passed for Kinesis stream access has AWS Kinesis subscription."
Kinesis.Throttling "Throttling error encountered when calling Kinesis. This can be due to other applications calling the same APIs as the Firehose stream, or because you have created too many Firehose streams with the same Kinesis stream as the source."
Kinesis.AccessDenied "Access was denied while trying to call API operations on the underlying Kinesis Stream. Ensure that the IAM role is propagated and valid."
Kinesis.KMS.AccessDeniedException "Firehose does not have access to the KMS Key used to encrypt/decrypt the Kinesis Stream. Please grant the Firehose delivery role access to the key."
Kinesis.KMS.KeyDisabled "Firehose is unable to read from the source Kinesis Stream because the KMS key used to encrypt/decrypt it is disabled. Enable the key so that reads can proceed."
Kinesis.KMS.InvalidStateException "Firehose is unable to read from the source Kinesis Stream because the KMS key used to encrypt it is in an invalid state."
Kinesis.KMS.NotFoundException "Firehose is unable to read from the source Kinesis Stream because the KMS key used to encrypt it was not found."
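
For the access and KMS errors above, the delivery role's permissions policy needs read access to the source stream and, if the stream is encrypted, kms:Decrypt on its key. A hedged sketch of such a policy; the ARNs are placeholders:

```python
# Placeholder ARNs; replace with your stream and key. The kinesis read
# actions address Kinesis.AccessDenied, and kms:Decrypt addresses
# Kinesis.KMS.AccessDeniedException for an encrypted source stream.
kinesis_source_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "kinesis:DescribeStream",
                "kinesis:GetShardIterator",
                "kinesis:GetRecords",
                "kinesis:ListShards",
            ],
            "Resource": "arn:aws:kinesis:us-east-1:111122223333:stream/my-stream",
        },
        {
            "Effect": "Allow",
            "Action": "kms:Decrypt",
            "Resource": "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab",
        },
    ],
}
```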

Kinesis DirectPut invocation errors

Amazon Data Firehose can send the following Kinesis DirectPut invocation errors to CloudWatch Logs.

Error Code Error Message and Information
Firehose.KMS.AccessDeniedException "Firehose does not have access to the KMS Key. Please check the key policy."
Firehose.KMS.InvalidStateException "Firehose is unable to decrypt the data because the KMS key used to encrypt it is in an invalid state."
Firehose.KMS.NotFoundException "Firehose is unable to decrypt the data because the KMS key used to encrypt it was not found."
Firehose.KMS.KeyDisabled "Firehose is unable to decrypt the data because the KMS key used to encrypt the data is disabled. Enable the key so that data delivery can proceed."
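
These four errors map directly to the state of the KMS key. A small sketch of interpreting the key state; the boto3 call and key alias in the comment are assumptions about your environment:

```python
def key_blocks_delivery(key_state: str) -> bool:
    """Any KMS KeyState other than 'Enabled' (for example 'Disabled' or
    'PendingDeletion') will surface as Firehose.KMS.KeyDisabled or
    Firehose.KMS.InvalidStateException during delivery."""
    return key_state != "Enabled"

# Reading the state with boto3 (assumes credentials and a real key ID):
# import boto3
# meta = boto3.client("kms").describe_key(KeyId="alias/my-key")["KeyMetadata"]
# print(key_blocks_delivery(meta["KeyState"]))

print(key_blocks_delivery("Disabled"))  # True
```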

AWS Glue invocation errors

Amazon Data Firehose can send the following AWS Glue invocation errors to CloudWatch Logs.

Error Code Error Message and Information
DataFormatConversion.InvalidSchema "The schema is invalid."
DataFormatConversion.EntityNotFound "The specified table/database could not be found. Please ensure that the table/database exists and that the values provided in the schema configuration are correct, especially with regards to casing."
DataFormatConversion.InvalidInput "Could not find a matching schema from glue. Please make sure the specified database with the supplied catalog ID exists."
DataFormatConversion.InvalidInput "Could not find a matching schema from glue. Please make sure the passed ARN is in the correct format."
DataFormatConversion.InvalidInput "Could not find a matching schema from glue. Please make sure the catalogId provided is valid."
DataFormatConversion.InvalidVersionId "Could not find a matching schema from glue. Please make sure the specified version of the table exists."
DataFormatConversion.NonExistentColumns "Could not find a matching schema from glue. Please make sure the table is configured with a non-null storage descriptor containing the target columns."
DataFormatConversion.AccessDenied "Access was denied when assuming role. Please ensure that the role specified in the data format conversion configuration has granted the Firehose service permission to assume it."
DataFormatConversion.ThrottledByGlue "Throttling error encountered when calling Glue. Either increase the request rate limit or reduce the current rate of calling glue through other applications."
DataFormatConversion.AccessDenied "Access was denied when calling Glue. Please ensure that the role specified in the data format conversion configuration has the necessary permissions."
DataFormatConversion.InvalidGlueRole "Invalid role. Please ensure that the role specified in the data format conversion configuration exists."
DataFormatConversion.InvalidGlueRole "The security token included in the request is invalid. Ensure that the provided IAM role associated with firehose is not deleted."
DataFormatConversion.GlueNotAvailableInRegion "AWS Glue is not yet available in the region you have specified; please specify a different region."
DataFormatConversion.GlueEncryptionException "There was an error retrieving the master key. Ensure that the key exists and has the correct access permissions."
DataFormatConversion.SchemaValidationTimeout "Timed out while retrieving table from Glue. If you have a large number of Glue table versions, please add 'glue:GetTableVersion' permission (recommended) or delete unused table versions. If you do not have a large number of tables in Glue, please contact AWS Support."
DataFirehose.InternalError "Timed out while retrieving table from Glue. If you have a large number of Glue table versions, please add 'glue:GetTableVersion' permission (recommended) or delete unused table versions. If you do not have a large number of tables in Glue, please contact AWS Support."
DataFormatConversion.GlueEncryptionException "There was an error retrieving the master key. Ensure that the key exists and state is correct."
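
Several of the entries above stem from missing AWS Glue permissions on the role used for data format conversion, including the glue:GetTableVersion recommendation for SchemaValidationTimeout. A hedged sketch of the relevant policy statement; the region, account ID, and resource names are placeholders:

```python
# A sketch of the Glue permissions for data format conversion; including
# glue:GetTableVersion follows the SchemaValidationTimeout guidance above.
glue_policy_statement = {
    "Effect": "Allow",
    "Action": [
        "glue:GetDatabase",
        "glue:GetTable",
        "glue:GetTableVersion",
        "glue:GetTableVersions",
    ],
    "Resource": [
        "arn:aws:glue:us-east-1:111122223333:catalog",
        "arn:aws:glue:us-east-1:111122223333:database/my_database",
        "arn:aws:glue:us-east-1:111122223333:table/my_database/my_table",
    ],
}
```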

DataFormatConversion invocation errors

Amazon Data Firehose can send the following DataFormatConversion invocation errors to CloudWatch Logs.

Error Code Error Message and Information
DataFormatConversion.InvalidSchema "The schema is invalid."
DataFormatConversion.ValidationException "Column names and types must be non-empty strings."
DataFormatConversion.ParseError "Encountered malformed JSON."
DataFormatConversion.MalformedData "Data does not match the schema."
DataFormatConversion.MalformedData "Length of json key must not be greater than 262144"
DataFormatConversion.MalformedData "The data cannot be decoded as UTF-8."
DataFormatConversion.MalformedData "Illegal character found between tokens."
DataFormatConversion.InvalidTypeFormat "The type format is invalid. Check the type syntax."
DataFormatConversion.InvalidSchema "Invalid Schema. Please ensure that there are no special characters or white spaces in column names."
DataFormatConversion.InvalidRecord "Record is not as per schema. One or more map keys were invalid for map<string,string>."
DataFormatConversion.MalformedData "The input JSON contained a primitive at the top level. The top level must be an object or array."
DataFormatConversion.MalformedData "The input JSON contained a primitive at the top level. The top level must be an object or array."
DataFormatConversion.MalformedData "The record was empty or contained only whitespace."
DataFormatConversion.MalformedData "Encountered invalid characters."
DataFormatConversion.MalformedData "Encountered invalid or unsupported timestamp format. Please see the Firehose developer guide for supported timestamp formats."
DataFormatConversion.MalformedData "A scalar type was found in the data but a complex type was specified on the schema."
DataFormatConversion.MalformedData "Data does not match the schema."
DataFormatConversion.MalformedData "A scalar type was found in the data but a complex type was specified on the schema."
DataFormatConversion.ConversionFailureException "ConversionFailureException"
DataFormatConversion.DataFormatConversionCustomerErrorException "DataFormatConversionCustomerErrorException"
DataFormatConversion.MalformedData "Data does not match the schema. Invalid format for one or more dates."
DataFormatConversion.MalformedData "Data contains a highly nested JSON structure that is not supported."
DataFormatConversion.EntityNotFound "The specified table/database could not be found. Please ensure that the table/database exists and that the values provided in the schema configuration are correct, especially with regards to casing."
DataFormatConversion.InvalidInput "Could not find a matching schema from glue. Please make sure the specified database with the supplied catalog ID exists."
DataFormatConversion.InvalidInput "Could not find a matching schema from glue. Please make sure the passed ARN is in the correct format."
DataFormatConversion.InvalidInput "Could not find a matching schema from glue. Please make sure the catalogId provided is valid."
DataFormatConversion.InvalidVersionId "Could not find a matching schema from glue. Please make sure the specified version of the table exists."
DataFormatConversion.NonExistentColumns "Could not find a matching schema from glue. Please make sure the table is configured with a non-null storage descriptor containing the target columns."
DataFormatConversion.AccessDenied "Access was denied when assuming role. Please ensure that the role specified in the data format conversion configuration has granted the Firehose service permission to assume it."
DataFormatConversion.ThrottledByGlue "Throttling error encountered when calling Glue. Either increase the request rate limit or reduce the current rate of calling glue through other applications."
DataFormatConversion.AccessDenied "Access was denied when calling Glue. Please ensure that the role specified in the data format conversion configuration has the necessary permissions."
DataFormatConversion.InvalidGlueRole "Invalid role. Please ensure that the role specified in the data format conversion configuration exists."
DataFormatConversion.GlueNotAvailableInRegion "AWS Glue is not yet available in the region you have specified; please specify a different region."
DataFormatConversion.GlueEncryptionException "There was an error retrieving the master key. Ensure that the key exists and has the correct access permissions."
DataFormatConversion.SchemaValidationTimeout "Timed out while retrieving table from Glue. If you have a large number of Glue table versions, please add 'glue:GetTableVersion' permission (recommended) or delete unused table versions. If you do not have a large number of tables in Glue, please contact AWS Support."
DataFirehose.InternalError "Timed out while retrieving table from Glue. If you have a large number of Glue table versions, please add 'glue:GetTableVersion' permission (recommended) or delete unused table versions. If you do not have a large number of tables in Glue, please contact AWS Support."
DataFormatConversion.MalformedData "One or more fields have incorrect format."