Send data to the MSK cluster

In this step, you send data to the Apache Kafka topic that you created earlier, and then you look for that same data in the destination S3 bucket.

To send data to the MSK cluster
  1. In the bin folder of the Apache Kafka installation on the client instance, create a text file named client.properties with the following contents.

    security.protocol=SASL_SSL
    sasl.mechanism=AWS_MSK_IAM
    sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required;
    sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler
  2. Run the following command to start a console producer. Replace BootstrapBrokerString with the bootstrap broker string that you obtained for your cluster earlier in this tutorial (one way to retrieve it again with the AWS CLI is sketched after this procedure).

    <path-to-your-kafka-installation>/bin/kafka-console-producer.sh --broker-list BootstrapBrokerString --producer.config client.properties --topic mkc-tutorial-topic
  3. Enter any message that you want, and press Enter. Repeat this step two or three times. Every time you enter a line and press Enter, that line is sent to your Apache Kafka cluster as a separate message. One way to verify that the messages arrived is sketched after this procedure.

  4. Look in the destination Amazon S3 bucket to find the messages that you sent in the previous step (an AWS CLI sketch for listing the bucket's objects also follows this procedure).
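
If you no longer have the bootstrap broker string from the earlier step in this tutorial, the following is a minimal sketch of one way to retrieve it again with the AWS CLI. ClusterArn is a placeholder for your cluster's ARN; for a cluster that uses IAM authentication, the BootstrapBrokerStringSaslIam value in the output is typically the one to use.

    # Retrieve the bootstrap brokers for the cluster. ClusterArn is a placeholder.
    aws kafka get-bootstrap-brokers \
        --cluster-arn ClusterArn \
        --query BootstrapBrokerStringSaslIam \
        --output text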

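If you want to confirm that the messages reached the topic before checking Amazon S3, one option is to run a console consumer from the same client instance. The following sketch reuses the client.properties file and the BootstrapBrokerString placeholder from the procedure above; press Ctrl+C to stop the consumer.

    # Read the topic from the beginning with the same IAM client configuration.
    <path-to-your-kafka-installation>/bin/kafka-console-consumer.sh \
        --bootstrap-server BootstrapBrokerString \
        --consumer.config client.properties \
        --topic mkc-tutorial-topic \
        --from-beginning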
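
To inspect the destination bucket from the command line rather than the console, a sketch like the following can be used. DestinationBucketName is a placeholder for the S3 bucket that you are using as the destination, and ObjectKey is a placeholder for one of the delivered object keys.

    # List the objects that have been delivered to the destination bucket.
    aws s3 ls s3://DestinationBucketName --recursive

    # Download one of the delivered objects to review the messages that it contains.
    aws s3 cp s3://DestinationBucketName/ObjectKey ./delivered-object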