Use ListFHIRImportJobs with an AWS SDK or CLI

The following code examples show how to use ListFHIRImportJobs.

CLI
AWS CLI

To list all FHIR import jobs

The following list-fhir-import-jobs example shows how to use the command to view a list of all import jobs associated with an account.

aws healthlake list-fhir-import-jobs \
    --datastore-id (Data store ID) \
    --submitted-before (DATE like 2024-10-13T19:00:00Z) \
    --submitted-after (DATE like 2020-10-13T19:00:00Z) \
    --job-name "FHIR-IMPORT" \
    --job-status SUBMITTED \
    --max-results (Integer between 1 and 500)

Output:

{ "ImportJobPropertiesList": [ { "JobId": "c0fddbf76f238297632d4aebdbfc9ddf", "JobStatus": "COMPLETED", "SubmitTime": "2024-11-20T10:08:46.813000-05:00", "EndTime": "2024-11-20T10:10:09.093000-05:00", "DatastoreId": "(Data store ID)", "InputDataConfig": { "S3Uri": "s3://(Bucket Name)/(Prefix Name)/" }, "JobOutputDataConfig": { "S3Configuration": { "S3Uri": "s3://(Bucket Name)/import/6407b9ae4c2def3cb6f1a46a0c599ec0-FHIR_IMPORT-c0fddbf76f238297632d4aebdbfc9ddf/", "KmsKeyId": "arn:aws:kms:us-east-1:123456789012:key/b7f645cb-e564-4981-8672-9e012d1ff1a0" } }, "JobProgressReport": { "TotalNumberOfScannedFiles": 1, "TotalSizeOfScannedFilesInMB": 0.001798, "TotalNumberOfImportedFiles": 1, "TotalNumberOfResourcesScanned": 1, "TotalNumberOfResourcesImported": 1, "TotalNumberOfResourcesWithCustomerError": 0, "TotalNumberOfFilesReadWithCustomerError": 0, "Throughput": 0.0 }, "DataAccessRoleArn": "arn:aws:iam::(AWS Account ID):role/(Role Name)" } ] }

Python
SDK for Python (Boto3)
# Excerpt from the HealthLakeWrapper class; the complete example is in the
# AWS Code Examples Repository on GitHub. Imports and the class skeleton are
# included here so the snippet is self-contained.
import logging
from datetime import datetime

import boto3
from botocore.exceptions import ClientError

logger = logging.getLogger(__name__)


class HealthLakeWrapper:
    def __init__(self, health_lake_client):
        self.health_lake_client = health_lake_client

    @classmethod
    def from_client(cls) -> "HealthLakeWrapper":
        """
        Creates a HealthLakeWrapper instance with a default AWS HealthLake client.

        :return: An instance of HealthLakeWrapper initialized with the default HealthLake client.
        """
        health_lake_client = boto3.client("healthlake")
        return cls(health_lake_client)

    def list_fhir_import_jobs(
        self,
        datastore_id: str,
        job_name: str = None,
        job_status: str = None,
        submitted_before: datetime = None,
        submitted_after: datetime = None,
    ) -> list[dict[str, any]]:
        """
        Lists HealthLake import jobs satisfying the conditions.

        :param datastore_id: The data store ID.
        :param job_name: The import job name.
        :param job_status: The import job status.
        :param submitted_before: The import job submitted before the specified date.
        :param submitted_after: The import job submitted after the specified date.
        :return: A list of import jobs.
        """
        try:
            parameters = {"DatastoreId": datastore_id}
            if job_name is not None:
                parameters["JobName"] = job_name
            if job_status is not None:
                parameters["JobStatus"] = job_status
            if submitted_before is not None:
                parameters["SubmittedBefore"] = submitted_before
            if submitted_after is not None:
                parameters["SubmittedAfter"] = submitted_after
            next_token = None
            jobs = []
            # Loop through paginated results.
            while True:
                if next_token is not None:
                    parameters["NextToken"] = next_token
                response = self.health_lake_client.list_fhir_import_jobs(**parameters)
                jobs.extend(response["ImportJobPropertiesList"])
                if "NextToken" in response:
                    next_token = response["NextToken"]
                else:
                    break
            return jobs
        except ClientError as err:
            logger.exception(
                "Couldn't list import jobs. Here's why: %s",
                err.response["Error"]["Message"],
            )
            raise
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
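
As a rough usage sketch, the wrapper above might be called as follows. The data store ID, status filter, and date range are placeholders, not values from a real account, and the caller is assumed to have HealthLake permissions configured.

from datetime import datetime, timezone

# Hypothetical caller of the HealthLakeWrapper shown above.
wrapper = HealthLakeWrapper.from_client()
jobs = wrapper.list_fhir_import_jobs(
    datastore_id="(Data store ID)",
    job_status="COMPLETED",
    submitted_after=datetime(2024, 1, 1, tzinfo=timezone.utc),
)
for job in jobs:
    print(job["JobId"], job["JobStatus"])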

For a complete list of AWS SDK developer guides and code examples, see Using HealthLake with an AWS SDK. This topic also includes information about getting started and details about previous SDK versions.
