interface PipeSourceParametersProperty
| Language | Type name | 
|---|---|
|  .NET | Amazon.CDK.AWS.Pipes.CfnPipe.PipeSourceParametersProperty | 
|  Go | github.com/aws/aws-cdk-go/awscdk/v2/awspipes#CfnPipe_PipeSourceParametersProperty | 
|  Java | software.amazon.awscdk.services.pipes.CfnPipe.PipeSourceParametersProperty | 
|  Python | aws_cdk.aws_pipes.CfnPipe.PipeSourceParametersProperty | 
|  TypeScript | aws-cdk-lib » aws_pipes » CfnPipe » PipeSourceParametersProperty | 
The parameters required to set up a source for your pipe.
Example
// The code below shows an example of how to instantiate this type.
// The values are placeholders you should change.
import { aws_pipes as pipes } from 'aws-cdk-lib';
const pipeSourceParametersProperty: pipes.CfnPipe.PipeSourceParametersProperty = {
  activeMqBrokerParameters: {
    credentials: {
      basicAuth: 'basicAuth',
    },
    queueName: 'queueName',
    // the properties below are optional
    batchSize: 123,
    maximumBatchingWindowInSeconds: 123,
  },
  dynamoDbStreamParameters: {
    startingPosition: 'startingPosition',
    // the properties below are optional
    batchSize: 123,
    deadLetterConfig: {
      arn: 'arn',
    },
    maximumBatchingWindowInSeconds: 123,
    maximumRecordAgeInSeconds: 123,
    maximumRetryAttempts: 123,
    onPartialBatchItemFailure: 'onPartialBatchItemFailure',
    parallelizationFactor: 123,
  },
  filterCriteria: {
    filters: [{
      pattern: 'pattern',
    }],
  },
  kinesisStreamParameters: {
    startingPosition: 'startingPosition',
    // the properties below are optional
    batchSize: 123,
    deadLetterConfig: {
      arn: 'arn',
    },
    maximumBatchingWindowInSeconds: 123,
    maximumRecordAgeInSeconds: 123,
    maximumRetryAttempts: 123,
    onPartialBatchItemFailure: 'onPartialBatchItemFailure',
    parallelizationFactor: 123,
    startingPositionTimestamp: 'startingPositionTimestamp',
  },
  managedStreamingKafkaParameters: {
    topicName: 'topicName',
    // the properties below are optional
    batchSize: 123,
    consumerGroupId: 'consumerGroupId',
    credentials: {
      clientCertificateTlsAuth: 'clientCertificateTlsAuth',
      saslScram512Auth: 'saslScram512Auth',
    },
    maximumBatchingWindowInSeconds: 123,
    startingPosition: 'startingPosition',
  },
  rabbitMqBrokerParameters: {
    credentials: {
      basicAuth: 'basicAuth',
    },
    queueName: 'queueName',
    // the properties below are optional
    batchSize: 123,
    maximumBatchingWindowInSeconds: 123,
    virtualHost: 'virtualHost',
  },
  selfManagedKafkaParameters: {
    topicName: 'topicName',
    // the properties below are optional
    additionalBootstrapServers: ['additionalBootstrapServers'],
    batchSize: 123,
    consumerGroupId: 'consumerGroupId',
    credentials: {
      basicAuth: 'basicAuth',
      clientCertificateTlsAuth: 'clientCertificateTlsAuth',
      saslScram256Auth: 'saslScram256Auth',
      saslScram512Auth: 'saslScram512Auth',
    },
    maximumBatchingWindowInSeconds: 123,
    serverRootCaCertificate: 'serverRootCaCertificate',
    startingPosition: 'startingPosition',
    vpc: {
      securityGroup: ['securityGroup'],
      subnets: ['subnets'],
    },
  },
  sqsQueueParameters: {
    batchSize: 123,
    maximumBatchingWindowInSeconds: 123,
  },
};
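This object is typically passed as the sourceParameters property when constructing a CfnPipe. The following is a minimal sketch, assuming an existing stack; the role, queue, and function ARNs are placeholders you would replace with your own resources.
import { aws_pipes as pipes, Stack } from 'aws-cdk-lib';

declare const stack: Stack;

// Minimal sketch of an SQS-to-Lambda pipe; all ARNs below are placeholders.
new pipes.CfnPipe(stack, 'MySourcePipe', {
  roleArn: 'arn:aws:iam::123456789012:role/my-pipe-role',
  source: 'arn:aws:sqs:us-east-1:123456789012:my-source-queue',
  target: 'arn:aws:lambda:us-east-1:123456789012:function:my-target-function',
  sourceParameters: {
    sqsQueueParameters: {
      batchSize: 10,
      maximumBatchingWindowInSeconds: 60,
    },
  },
});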
Properties
| Name | Type | Description | 
|---|---|---|
| activeMqBrokerParameters | IResolvable \| PipeSourceActiveMQBrokerParametersProperty | The parameters for using an ActiveMQ broker as a source. | 
| dynamoDbStreamParameters | IResolvable \| PipeSourceDynamoDBStreamParametersProperty | The parameters for using a DynamoDB stream as a source. | 
| filterCriteria | IResolvable \| FilterCriteriaProperty | The collection of event patterns used to filter events. | 
| kinesisStreamParameters | IResolvable \| PipeSourceKinesisStreamParametersProperty | The parameters for using a Kinesis stream as a source. | 
| managedStreamingKafkaParameters | IResolvable \| PipeSourceManagedStreamingKafkaParametersProperty | The parameters for using an MSK stream as a source. | 
| rabbitMqBrokerParameters | IResolvable \| PipeSourceRabbitMQBrokerParametersProperty | The parameters for using a RabbitMQ broker as a source. | 
| selfManagedKafkaParameters | IResolvable \| PipeSourceSelfManagedKafkaParametersProperty | The parameters for using a self-managed Apache Kafka stream as a source. | 
| sqsQueueParameters | IResolvable \| PipeSourceSqsQueueParametersProperty | The parameters for using an Amazon SQS queue as a source. | 
activeMqBrokerParameters?
Type: IResolvable | PipeSourceActiveMQBrokerParametersProperty (optional)
The parameters for using an ActiveMQ broker as a source.
dynamoDbStreamParameters?
Type: IResolvable | PipeSourceDynamoDBStreamParametersProperty (optional)
The parameters for using a DynamoDB stream as a source.
filterCriteria?
Type: IResolvable | FilterCriteriaProperty (optional)
The collection of event patterns used to filter events.
To remove a filter, specify a FilterCriteria object with an empty array of Filter objects.
For more information, see Events and Event Patterns in the Amazon EventBridge User Guide.
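For example, the following sketch (the event pattern shown is a placeholder) sets a filter, and then clears previously configured filters by supplying an empty array:
import { aws_pipes as pipes } from 'aws-cdk-lib';

// Placeholder pattern: only forward events whose body.status is "ACTIVE".
const withFilter: pipes.CfnPipe.FilterCriteriaProperty = {
  filters: [{ pattern: '{"body":{"status":["ACTIVE"]}}' }],
};

// To remove existing filters, specify FilterCriteria with an empty array of Filter objects.
const withoutFilters: pipes.CfnPipe.FilterCriteriaProperty = {
  filters: [],
};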
kinesisStreamParameters?
Type: IResolvable | PipeSourceKinesisStreamParametersProperty (optional)
The parameters for using a Kinesis stream as a source.
managedStreamingKafkaParameters?
Type: IResolvable | PipeSourceManagedStreamingKafkaParametersProperty (optional)
The parameters for using an MSK stream as a source.
rabbitMqBrokerParameters?
Type: IResolvable | PipeSourceRabbitMQBrokerParametersProperty (optional)
The parameters for using a RabbitMQ broker as a source.
selfManagedKafkaParameters?
Type: IResolvable | PipeSourceSelfManagedKafkaParametersProperty (optional)
The parameters for using a self-managed Apache Kafka stream as a source.
A self-managed cluster refers to any Apache Kafka cluster not hosted by AWS. This includes clusters you manage yourself as well as those hosted by a third-party provider, such as Confluent Cloud, CloudKarafka, or Redpanda. For more information, see Apache Kafka streams as a source in the Amazon EventBridge User Guide.
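As an illustrative sketch (the topic name, bootstrap server, secret ARN, subnet ID, and security group ID below are placeholders), a self-managed Kafka source might be configured like this:
import { aws_pipes as pipes } from 'aws-cdk-lib';

// Sketch of a self-managed Kafka source; all identifiers below are placeholders.
const kafkaSourceParameters: pipes.CfnPipe.PipeSourceSelfManagedKafkaParametersProperty = {
  topicName: 'orders',
  additionalBootstrapServers: ['broker-1.example.com:9092'],
  startingPosition: 'LATEST',
  credentials: {
    // ARN of a Secrets Manager secret holding the SASL/SCRAM-512 credentials.
    saslScram512Auth: 'arn:aws:secretsmanager:us-east-1:123456789012:secret:kafka-credentials',
  },
  vpc: {
    subnets: ['subnet-0123456789abcdef0'],
    securityGroup: ['sg-0123456789abcdef0'],
  },
};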
sqsQueueParameters?
Type: IResolvable | PipeSourceSqsQueueParametersProperty (optional)
The parameters for using an Amazon SQS queue as a source.
