
More AWS SDK examples are available in the AWS Doc SDK Examples GitHub repository.


Firehose examples using the SDK for Python (Boto3)

The following code examples show you how to perform actions and implement common scenarios by using the AWS SDK for Python (Boto3) with Firehose.

Actions are code excerpts from larger programs and must be run in context. While actions show you how to call individual service functions, you can see actions in context in their related scenarios.

Scenarios are code examples that show you how to accomplish a specific task by calling multiple functions within a service or combined with other AWS services.

Each example includes a link to the complete source code, where you can find instructions on how to set up and run the code in context.

Actions

The following code example shows how to use PutRecord.

SDK for Python (Boto3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run it in the AWS Code Examples Repository.

class FirehoseClient:
    """
    AWS Firehose client to send records and monitor metrics.

    Attributes:
        config (object): Configuration object with delivery stream name and region.
        delivery_stream_name (str): Name of the Firehose delivery stream.
        region (str): AWS region for Firehose and CloudWatch clients.
        firehose (boto3.client): Boto3 Firehose client.
        cloudwatch (boto3.client): Boto3 CloudWatch client.
    """

    def __init__(self, config):
        """
        Initialize the FirehoseClient.

        Args:
            config (object): Configuration object with delivery stream name and region.
        """
        self.config = config
        self.delivery_stream_name = config.delivery_stream_name
        self.region = config.region
        self.firehose = boto3.client("firehose", region_name=self.region)
        self.cloudwatch = boto3.client("cloudwatch", region_name=self.region)

    @backoff.on_exception(
        backoff.expo, Exception, max_tries=5, jitter=backoff.full_jitter
    )
    def put_record(self, record: dict):
        """
        Put individual records to Firehose with backoff and retry.

        Args:
            record (dict): The data record to be sent to Firehose.

        This method attempts to send an individual record to the Firehose
        delivery stream. It retries with exponential backoff in case of exceptions.
        """
        try:
            entry = self._create_record_entry(record)
            response = self.firehose.put_record(
                DeliveryStreamName=self.delivery_stream_name, Record=entry
            )
            self._log_response(response, entry)
        except Exception:
            logger.info(f"Fail record: {record}.")
            raise
  • For API details, see PutRecord in the AWS SDK for Python (Boto3) API Reference.
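The FirehoseClient class above wraps the call with retry logic; stripped of that wrapping, the underlying PutRecord operation is a single Boto3 call. The sketch below is a minimal, standalone illustration only; the stream name, Region, and payload are placeholder assumptions and are not part of the example above.

import json

import boto3

# Placeholder Region -- replace with the Region of your delivery stream.
firehose = boto3.client("firehose", region_name="us-east-1")

# Firehose expects the payload under the "Data" key; a trailing newline keeps
# records separated once they are concatenated at the destination.
record = {"Data": json.dumps({"event": "example"}) + "\n"}

# "my-delivery-stream" is a hypothetical name -- use an existing stream.
response = firehose.put_record(
    DeliveryStreamName="my-delivery-stream", Record=record
)
print(f"Record sent with ID: {response['RecordId']}")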

The following code example shows how to use PutRecordBatch.

SDK for Python (Boto3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run it in the AWS Code Examples Repository.

class FirehoseClient:
    """
    AWS Firehose client to send records and monitor metrics.

    Attributes:
        config (object): Configuration object with delivery stream name and region.
        delivery_stream_name (str): Name of the Firehose delivery stream.
        region (str): AWS region for Firehose and CloudWatch clients.
        firehose (boto3.client): Boto3 Firehose client.
        cloudwatch (boto3.client): Boto3 CloudWatch client.
    """

    def __init__(self, config):
        """
        Initialize the FirehoseClient.

        Args:
            config (object): Configuration object with delivery stream name and region.
        """
        self.config = config
        self.delivery_stream_name = config.delivery_stream_name
        self.region = config.region
        self.firehose = boto3.client("firehose", region_name=self.region)
        self.cloudwatch = boto3.client("cloudwatch", region_name=self.region)

    @backoff.on_exception(
        backoff.expo, Exception, max_tries=5, jitter=backoff.full_jitter
    )
    def put_record_batch(self, data: list, batch_size: int = 500):
        """
        Put records in batches to Firehose with backoff and retry.

        Args:
            data (list): List of data records to be sent to Firehose.
            batch_size (int): Number of records to send in each batch. Default is 500.

        This method attempts to send records in batches to the Firehose
        delivery stream. It retries with exponential backoff in case of exceptions.
        """
        for i in range(0, len(data), batch_size):
            batch = data[i : i + batch_size]
            record_dicts = [{"Data": json.dumps(record)} for record in batch]
            try:
                response = self.firehose.put_record_batch(
                    DeliveryStreamName=self.delivery_stream_name, Records=record_dicts
                )
                self._log_batch_response(response, len(batch))
            except Exception as e:
                logger.info(f"Failed to send batch of {len(batch)} records. Error: {e}")
  • For API details, see PutRecordBatch in the AWS SDK for Python (Boto3) API Reference.
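As with PutRecord, the batch operation can also be called directly. The following minimal sketch uses a placeholder stream name, Region, and data, and is not taken from the example above; it illustrates the point the wrapper handles for you: PutRecordBatch can partially fail, so FailedPutCount and the per-record RequestResponses entries need to be checked.

import json

import boto3

# Placeholder Region -- replace with the Region of your delivery stream.
firehose = boto3.client("firehose", region_name="us-east-1")

# PutRecordBatch accepts up to 500 records per call.
records = [{"Data": json.dumps({"event": i}) + "\n"} for i in range(10)]

# "my-delivery-stream" is a hypothetical name -- use an existing stream.
response = firehose.put_record_batch(
    DeliveryStreamName="my-delivery-stream", Records=records
)

# Entries with an ErrorCode were rejected by the service and should be resent.
failed = [
    records[i]
    for i, result in enumerate(response["RequestResponses"])
    if "ErrorCode" in result
]
print(f"{response['FailedPutCount']} of {len(records)} records failed")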

Scenarios

The following code example shows how to use Firehose to process individual and batch records.

SDK for Python (Boto3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run it in the AWS Code Examples Repository.

This script puts individual and batch records to Firehose.

import json
import logging
import random
from datetime import datetime, timedelta

import backoff
import boto3

from config import get_config


def load_sample_data(path: str) -> dict:
    """
    Load sample data from a JSON file.

    Args:
        path (str): The file path to the JSON file containing sample data.

    Returns:
        dict: The loaded sample data as a dictionary.
    """
    with open(path, "r") as f:
        return json.load(f)


# Configure logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


class FirehoseClient:
    """
    AWS Firehose client to send records and monitor metrics.

    Attributes:
        config (object): Configuration object with delivery stream name and region.
        delivery_stream_name (str): Name of the Firehose delivery stream.
        region (str): AWS region for Firehose and CloudWatch clients.
        firehose (boto3.client): Boto3 Firehose client.
        cloudwatch (boto3.client): Boto3 CloudWatch client.
    """

    def __init__(self, config):
        """
        Initialize the FirehoseClient.

        Args:
            config (object): Configuration object with delivery stream name and region.
        """
        self.config = config
        self.delivery_stream_name = config.delivery_stream_name
        self.region = config.region
        self.firehose = boto3.client("firehose", region_name=self.region)
        self.cloudwatch = boto3.client("cloudwatch", region_name=self.region)

    @backoff.on_exception(
        backoff.expo, Exception, max_tries=5, jitter=backoff.full_jitter
    )
    def put_record(self, record: dict):
        """
        Put individual records to Firehose with backoff and retry.

        Args:
            record (dict): The data record to be sent to Firehose.

        This method attempts to send an individual record to the Firehose
        delivery stream. It retries with exponential backoff in case of exceptions.
        """
        try:
            entry = self._create_record_entry(record)
            response = self.firehose.put_record(
                DeliveryStreamName=self.delivery_stream_name, Record=entry
            )
            self._log_response(response, entry)
        except Exception:
            logger.info(f"Fail record: {record}.")
            raise

    @backoff.on_exception(
        backoff.expo, Exception, max_tries=5, jitter=backoff.full_jitter
    )
    def put_record_batch(self, data: list, batch_size: int = 500):
        """
        Put records in batches to Firehose with backoff and retry.

        Args:
            data (list): List of data records to be sent to Firehose.
            batch_size (int): Number of records to send in each batch. Default is 500.

        This method attempts to send records in batches to the Firehose
        delivery stream. It retries with exponential backoff in case of exceptions.
        """
        for i in range(0, len(data), batch_size):
            batch = data[i : i + batch_size]
            record_dicts = [{"Data": json.dumps(record)} for record in batch]
            try:
                response = self.firehose.put_record_batch(
                    DeliveryStreamName=self.delivery_stream_name, Records=record_dicts
                )
                self._log_batch_response(response, len(batch))
            except Exception as e:
                logger.info(f"Failed to send batch of {len(batch)} records. Error: {e}")

    def get_metric_statistics(
        self,
        metric_name: str,
        start_time: datetime,
        end_time: datetime,
        period: int,
        statistics: list = ["Sum"],
    ) -> list:
        """
        Retrieve metric statistics from CloudWatch.

        Args:
            metric_name (str): The name of the metric.
            start_time (datetime): The start time for the metric statistics.
            end_time (datetime): The end time for the metric statistics.
            period (int): The granularity, in seconds, of the returned data points.
            statistics (list): A list of statistics to retrieve. Default is ['Sum'].

        Returns:
            list: List of datapoints containing the metric statistics.
        """
        response = self.cloudwatch.get_metric_statistics(
            Namespace="AWS/Firehose",
            MetricName=metric_name,
            Dimensions=[
                {"Name": "DeliveryStreamName", "Value": self.delivery_stream_name},
            ],
            StartTime=start_time,
            EndTime=end_time,
            Period=period,
            Statistics=statistics,
        )
        return response["Datapoints"]

    def monitor_metrics(self):
        """
        Monitor Firehose metrics for the last 5 minutes.

        This method retrieves and logs the 'IncomingBytes', 'IncomingRecords', and
        'FailedPutCount' metrics from CloudWatch for the last 5 minutes.
        """
        end_time = datetime.utcnow()
        start_time = end_time - timedelta(minutes=10)
        period = int((end_time - start_time).total_seconds())

        metrics = {
            "IncomingBytes": self.get_metric_statistics(
                "IncomingBytes", start_time, end_time, period
            ),
            "IncomingRecords": self.get_metric_statistics(
                "IncomingRecords", start_time, end_time, period
            ),
            "FailedPutCount": self.get_metric_statistics(
                "FailedPutCount", start_time, end_time, period
            ),
        }

        for metric, datapoints in metrics.items():
            if datapoints:
                total_sum = sum(datapoint["Sum"] for datapoint in datapoints)
                if metric == "IncomingBytes":
                    logger.info(
                        f"{metric}: {round(total_sum)} ({total_sum / (1024 * 1024):.2f} MB)"
                    )
                else:
                    logger.info(f"{metric}: {round(total_sum)}")
            else:
                logger.info(f"No data found for {metric} over the last 5 minutes")

    def _create_record_entry(self, record: dict) -> dict:
        """
        Create a record entry for Firehose.

        Args:
            record (dict): The data record to be sent.

        Returns:
            dict: The record entry formatted for Firehose.

        Raises:
            Exception: If a simulated network error occurs.
        """
        if random.random() < 0.2:
            raise Exception("Simulated network error")
        elif random.random() < 0.1:
            return {"Data": '{"malformed": "data"'}
        else:
            return {"Data": json.dumps(record)}

    def _log_response(self, response: dict, entry: dict):
        """
        Log the response from Firehose.

        Args:
            response (dict): The response from the Firehose put_record API call.
            entry (dict): The record entry that was sent.
        """
        if response["ResponseMetadata"]["HTTPStatusCode"] == 200:
            logger.info(f"Sent record: {entry}")
        else:
            logger.info(f"Fail record: {entry}")

    def _log_batch_response(self, response: dict, batch_size: int):
        """
        Log the batch response from Firehose.

        Args:
            response (dict): The response from the Firehose put_record_batch API call.
            batch_size (int): The number of records in the batch.
        """
        if response.get("FailedPutCount", 0) > 0:
            logger.info(
                f'Failed to send {response["FailedPutCount"]} records in batch of {batch_size}'
            )
        else:
            logger.info(f"Successfully sent batch of {batch_size} records")


if __name__ == "__main__":
    config = get_config()
    data = load_sample_data(config.sample_data_file)
    client = FirehoseClient(config)

    # Process the first 100 sample network records
    for record in data[:100]:
        try:
            client.put_record(record)
        except Exception as e:
            logger.info(f"Put record failed after retries and backoff: {e}")
    client.monitor_metrics()

    # Process remaining records using the batch method
    try:
        client.put_record_batch(data[100:])
    except Exception as e:
        logger.info(f"Put record batch failed after retries and backoff: {e}")
    client.monitor_metrics()

This file contains the configuration for the script above.

class Config:
    def __init__(self):
        self.delivery_stream_name = "ENTER YOUR DELIVERY STREAM NAME HERE"
        self.region = "us-east-1"
        self.sample_data_file = (
            "../../../../../workflows/firehose/resources/sample_records.json"
        )


def get_config():
    return Config()
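Note that delivery_stream_name ships as a placeholder: the scenario does not create a Firehose stream for you, so point the configuration at an existing stream in your account before running the script. If you want to confirm the value first, a quick check such as the following works; the stream name used here is a hypothetical example, not one created by this code.

import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

# "my-delivery-stream" is a placeholder; use the name you set in Config.
description = firehose.describe_delivery_stream(
    DeliveryStreamName="my-delivery-stream"
)
print(description["DeliveryStreamDescription"]["DeliveryStreamStatus"])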
  • For API details, see the following topics in the AWS SDK for Python (Boto3) API Reference: PutRecord and PutRecordBatch.