CfnEnvironmentProps
- class aws_cdk.aws_mwaa.CfnEnvironmentProps(*, name, airflow_configuration_options=None, airflow_version=None, dag_s3_path=None, endpoint_management=None, environment_class=None, execution_role_arn=None, kms_key=None, logging_configuration=None, max_webservers=None, max_workers=None, min_webservers=None, min_workers=None, network_configuration=None, plugins_s3_object_version=None, plugins_s3_path=None, requirements_s3_object_version=None, requirements_s3_path=None, schedulers=None, source_bucket_arn=None, startup_script_s3_object_version=None, startup_script_s3_path=None, tags=None, webserver_access_mode=None, weekly_maintenance_window_start=None)
Bases: object
Properties for defining a CfnEnvironment.

- Parameters:
  - name (str) – The name of your Amazon MWAA environment.
  - airflow_configuration_options (Any) – A list of key-value pairs containing the Airflow configuration options for your environment. For example, core.default_timezone: utc. To learn more, see Apache Airflow configuration options.
  - airflow_version (Optional[str]) – The version of Apache Airflow to use for the environment. If no value is specified, defaults to the latest version. If you specify a newer version number for an existing environment, the version update requires some service interruption before taking effect. Allowed values: 1.10.12 | 2.0.2 | 2.2.2 | 2.4.3 | 2.5.1 | 2.6.3 | 2.7.2 | 2.8.1 | 2.9.2 (latest).
  - dag_s3_path (Optional[str]) – The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. To learn more, see Adding or updating DAGs.
  - endpoint_management (Optional[str]) – Defines whether the VPC endpoints configured for the environment are created and managed by the customer or by Amazon MWAA. If set to SERVICE, Amazon MWAA creates and manages the required VPC endpoints in your VPC. If set to CUSTOMER, you must create and manage the VPC endpoints in your VPC.
  - environment_class (Optional[str]) – The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. To learn more, see Amazon MWAA environment class.
  - execution_role_arn (Optional[str]) – The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access AWS resources in your environment. For example, arn:aws:iam::123456789:role/my-execution-role. To learn more, see Amazon MWAA Execution role.
  - kms_key (Optional[str]) – The AWS Key Management Service (KMS) key used to encrypt and decrypt the data in your environment. You can use an AWS KMS key managed by MWAA, or a customer-managed KMS key (advanced).
  - logging_configuration (Union[IResolvable, LoggingConfigurationProperty, Dict[str, Any], None]) – The Apache Airflow logs sent to CloudWatch Logs: DagProcessingLogs, SchedulerLogs, TaskLogs, WebserverLogs, WorkerLogs.
  - max_webservers (Union[int, float, None]) – The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your environment using the Apache Airflow REST API or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API at a high transactions-per-second (TPS) rate, Amazon MWAA increases the number of web servers up to the number set in MaxWebservers. As TPS rates decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: for environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
  - max_workers (Union[int, float, None]) – The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the one worker that is included with your environment, or the number you specify in MinWorkers.
  - min_webservers (Union[int, float, None]) – The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your environment using the Apache Airflow REST API or the Apache Airflow CLI. As the transactions-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: for environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
  - min_workers (Union[int, float, None]) – The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.
  - network_configuration (Union[IResolvable, NetworkConfigurationProperty, Dict[str, Any], None]) – The VPC networking components used to secure and enable network traffic between the AWS resources for your environment. To learn more, see About networking on Amazon MWAA.
  - plugins_s3_object_version (Optional[str]) – The version of the plugins.zip file on your Amazon S3 bucket. To learn more, see Installing custom plugins.
  - plugins_s3_path (Optional[str]) – The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. To learn more, see Installing custom plugins.
  - requirements_s3_object_version (Optional[str]) – The version of the requirements.txt file on your Amazon S3 bucket. To learn more, see Installing Python dependencies.
  - requirements_s3_path (Optional[str]) – The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. To learn more, see Installing Python dependencies.
  - schedulers (Union[int, float, None]) – The number of schedulers that you want to run in your environment. Valid values: v2 – accepts values from 2 to 5, defaults to 2; v1 – accepts 1.
  - source_bucket_arn (Optional[str]) – The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. To learn more, see Create an Amazon S3 bucket for Amazon MWAA.
  - startup_script_s3_object_version (Optional[str]) – The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. For example: 3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo. For more information, see Using a startup script.
  - startup_script_s3_path (Optional[str]) – The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh. Amazon MWAA runs the script as your environment starts, before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
  - tags (Any) – The key-value tag pairs associated with your environment. For example, "Environment": "Staging". To learn more, see Tagging. If you specify new tags for an existing environment, the update requires service interruption before taking effect.
  - webserver_access_mode (Optional[str]) – The Apache Airflow web server access mode. To learn more, see Apache Airflow access modes. Valid values: PRIVATE_ONLY or PUBLIC_ONLY.
  - weekly_maintenance_window_start (Optional[str]) – The day and time of the week to start weekly maintenance updates of your environment, in the format DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30-minute increments only. Supported input matches: MON|TUE|WED|THU|FRI|SAT|SUN:([01]\d|2[0-3]):(00|30).
- See: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-mwaa-environment.html
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_mwaa as mwaa

# airflow_configuration_options: Any
# tags: Any

cfn_environment_props = mwaa.CfnEnvironmentProps(
    name="name",

    # the properties below are optional
    airflow_configuration_options=airflow_configuration_options,
    airflow_version="airflowVersion",
    dag_s3_path="dagS3Path",
    endpoint_management="endpointManagement",
    environment_class="environmentClass",
    execution_role_arn="executionRoleArn",
    kms_key="kmsKey",
    logging_configuration=mwaa.CfnEnvironment.LoggingConfigurationProperty(
        dag_processing_logs=mwaa.CfnEnvironment.ModuleLoggingConfigurationProperty(
            cloud_watch_log_group_arn="cloudWatchLogGroupArn",
            enabled=False,
            log_level="logLevel"
        ),
        scheduler_logs=mwaa.CfnEnvironment.ModuleLoggingConfigurationProperty(
            cloud_watch_log_group_arn="cloudWatchLogGroupArn",
            enabled=False,
            log_level="logLevel"
        ),
        task_logs=mwaa.CfnEnvironment.ModuleLoggingConfigurationProperty(
            cloud_watch_log_group_arn="cloudWatchLogGroupArn",
            enabled=False,
            log_level="logLevel"
        ),
        webserver_logs=mwaa.CfnEnvironment.ModuleLoggingConfigurationProperty(
            cloud_watch_log_group_arn="cloudWatchLogGroupArn",
            enabled=False,
            log_level="logLevel"
        ),
        worker_logs=mwaa.CfnEnvironment.ModuleLoggingConfigurationProperty(
            cloud_watch_log_group_arn="cloudWatchLogGroupArn",
            enabled=False,
            log_level="logLevel"
        )
    ),
    max_webservers=123,
    max_workers=123,
    min_webservers=123,
    min_workers=123,
    network_configuration=mwaa.CfnEnvironment.NetworkConfigurationProperty(
        security_group_ids=["securityGroupIds"],
        subnet_ids=["subnetIds"]
    ),
    plugins_s3_object_version="pluginsS3ObjectVersion",
    plugins_s3_path="pluginsS3Path",
    requirements_s3_object_version="requirementsS3ObjectVersion",
    requirements_s3_path="requirementsS3Path",
    schedulers=123,
    source_bucket_arn="sourceBucketArn",
    startup_script_s3_object_version="startupScriptS3ObjectVersion",
    startup_script_s3_path="startupScriptS3Path",
    tags=tags,
    webserver_access_mode="webserverAccessMode",
    weekly_maintenance_window_start="weeklyMaintenanceWindowStart"
)
Attributes
- airflow_configuration_options
A list of key-value pairs containing the Airflow configuration options for your environment.
For example, core.default_timezone: utc. To learn more, see Apache Airflow configuration options.
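As a sketch, the configuration options can be expressed as a plain mapping of "section.key" Airflow option names to string values (the option names below are standard Apache Airflow settings; the values are illustrative placeholders, not recommendations):

```python
# Sketch of an airflow_configuration_options value: a plain dict mapping
# "section.key" Airflow option names to string values. Option names are
# standard Apache Airflow settings; the values are placeholders.
airflow_configuration_options = {
    "core.default_timezone": "utc",
    "core.parallelism": "32",
    "webserver.default_ui_timezone": "utc",
}

# Every key is "section.key" and every value is a string.
assert all("." in key for key in airflow_configuration_options)
assert all(isinstance(v, str) for v in airflow_configuration_options.values())
```

A mapping like this can then be passed as the airflow_configuration_options argument when constructing CfnEnvironmentProps.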
- airflow_version
The version of Apache Airflow to use for the environment.
If no value is specified, defaults to the latest version.
If you specify a newer version number for an existing environment, the version update requires some service interruption before taking effect.
Allowed values: 1.10.12 | 2.0.2 | 2.2.2 | 2.4.3 | 2.5.1 | 2.6.3 | 2.7.2 | 2.8.1 | 2.9.2 (latest).
- dag_s3_path
The relative path to the DAGs folder on your Amazon S3 bucket.
For example, dags. To learn more, see Adding or updating DAGs.
- endpoint_management
Defines whether the VPC endpoints configured for the environment are created and managed by the customer or by Amazon MWAA.
If set to SERVICE, Amazon MWAA creates and manages the required VPC endpoints in your VPC. If set to CUSTOMER, you must create and manage the VPC endpoints in your VPC.
- environment_class
The environment class type.
Valid values: mw1.small, mw1.medium, mw1.large. To learn more, see Amazon MWAA environment class.
- execution_role_arn
The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access AWS resources in your environment.
For example, arn:aws:iam::123456789:role/my-execution-role. To learn more, see Amazon MWAA Execution role.
- kms_key
The AWS Key Management Service (KMS) key to encrypt and decrypt the data in your environment.
You can use an AWS KMS key managed by MWAA, or a customer-managed KMS key (advanced).
- logging_configuration
The Apache Airflow logs sent to CloudWatch Logs: DagProcessingLogs, SchedulerLogs, TaskLogs, WebserverLogs, WorkerLogs.
- max_webservers
The maximum number of web servers that you want to run in your environment.
Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your environment using the Apache Airflow REST API or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API at a high transactions-per-second (TPS) rate, Amazon MWAA increases the number of web servers up to the number set in MaxWebservers. As TPS rates decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers.
Valid values: for environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
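The scaling bounds described above can be sketched as a simple clamp between MinWebservers and MaxWebservers (a toy illustration of the bounds only, not the MWAA autoscaler; the function name is illustrative, not part of the CDK API):

```python
def scaled_webserver_count(demand: int, min_webservers: int = 2, max_webservers: int = 5) -> int:
    """Toy sketch: MWAA keeps the web-server count between
    MinWebservers and MaxWebservers, whatever the demand."""
    return max(min_webservers, min(demand, max_webservers))

# High TPS demand is capped at MaxWebservers; low demand floors at MinWebservers.
print(scaled_webserver_count(9))   # prints 5
print(scaled_webserver_count(0))   # prints 2
```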
- max_workers
The maximum number of workers that you want to run in your environment.
MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the one worker that is included with your environment, or the number you specify in MinWorkers.
- min_webservers
The minimum number of web servers that you want to run in your environment.
Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your environment using the Apache Airflow REST API or the Apache Airflow CLI. As the transactions-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers.
Valid values: for environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- min_workers
The minimum number of workers that you want to run in your environment.
MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.
- name
The name of your Amazon MWAA environment.
- network_configuration
The VPC networking components used to secure and enable network traffic between the AWS resources for your environment.
To learn more, see About networking on Amazon MWAA .
- plugins_s3_object_version
The version of the plugins.zip file on your Amazon S3 bucket. To learn more, see Installing custom plugins.
- plugins_s3_path
The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. To learn more, see Installing custom plugins.
- requirements_s3_object_version
The version of the requirements.txt file on your Amazon S3 bucket. To learn more, see Installing Python dependencies.
- requirements_s3_path
The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. To learn more, see Installing Python dependencies.
- schedulers
The number of schedulers that you want to run in your environment. Valid values:
- v2 – Accepts values from 2 to 5. Defaults to 2.
- v1 – Accepts 1.
- source_bucket_arn
The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored.
For example, arn:aws:s3:::my-airflow-bucket-unique-name. To learn more, see Create an Amazon S3 bucket for Amazon MWAA.
- startup_script_s3_object_version
The version of the startup shell script in your Amazon S3 bucket.
You must specify the version ID that Amazon S3 assigns to the file every time you update the script.
Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example:
3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo
For more information, see Using a startup script .
- startup_script_s3_path
The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh.
Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
- tags
The key-value tag pairs associated with your environment. For example, "Environment": "Staging". To learn more, see Tagging.
If you specify new tags for an existing environment, the update requires service interruption before taking effect.
- webserver_access_mode
The Apache Airflow Web server access mode.
To learn more, see Apache Airflow access modes. Valid values: PRIVATE_ONLY or PUBLIC_ONLY.
- weekly_maintenance_window_start
The day and time of the week to start weekly maintenance updates of your environment, in the format DAY:HH:MM. For example: TUE:03:30.
You can specify a start time in 30-minute increments only. Supported input matches: MON|TUE|WED|THU|FRI|SAT|SUN:([01]\d|2[0-3]):(00|30).
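The DAY:HH:MM constraint can be checked locally with the documented pattern before deploying; a small sketch (the helper name is illustrative, not part of the CDK API):

```python
import re

# Pattern from the property description: day of week, 24-hour time,
# minutes restricted to 30-minute increments.
MAINTENANCE_WINDOW_RE = re.compile(r"^(MON|TUE|WED|THU|FRI|SAT|SUN):([01]\d|2[0-3]):(00|30)$")

def is_valid_maintenance_window(value: str) -> bool:
    """Illustrative helper: True if value matches the DAY:HH:MM format."""
    return MAINTENANCE_WINDOW_RE.fullmatch(value) is not None

print(is_valid_maintenance_window("TUE:03:30"))  # prints True
print(is_valid_maintenance_window("TUE:03:15"))  # prints False (not a 30-minute increment)
```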