Starting a FHIR import job

Use StartFHIRImportJob to start a FHIR import job into a HealthLake data store. The following menus provide a procedure for the AWS Management Console and code examples for the AWS CLI and AWS SDKs. For more information, see StartFHIRImportJob in the AWS HealthLake API Reference.

To start a FHIR import job

Choose a menu based on your access preference to AWS HealthLake.
- AWS CLI
To start a FHIR import job

The following start-fhir-import-job example shows how to start a FHIR import job using Amazon HealthLake.
aws healthlake start-fhir-import-job \
    --input-data-config S3Uri="s3://(Bucket Name)/(Prefix Name)/" \
    --datastore-id "(Datastore ID)" \
    --data-access-role-arn "arn:aws:iam::(AWS Account ID):role/(Role Name)" \
    --region us-east-1
Output:
{
"DatastoreId": "(Datastore ID)",
"JobStatus": "SUBMITTED",
"JobId": "c145fbb27b192af392f8ce6e7838e34f"
}
For more information, see Importing files to a FHIR data store (https://docs.aws.amazon.com/healthlake/latest/devguide/import-datastore.html) in the Amazon HealthLake Developer Guide.
- SDK for Python (Boto3)
import logging

import boto3
from botocore.exceptions import ClientError

logger = logging.getLogger(__name__)


class HealthLakeWrapper:
    def __init__(self, health_lake_client):
        self.health_lake_client = health_lake_client

    @classmethod
    def from_client(cls) -> "HealthLakeWrapper":
        """
        Creates a HealthLakeWrapper instance with a default AWS HealthLake client.

        :return: An instance of HealthLakeWrapper initialized with the default HealthLake client.
        """
        health_lake_client = boto3.client("healthlake")
        return cls(health_lake_client)

    def start_fhir_import_job(
        self,
        job_name: str,
        datastore_id: str,
        input_s3_uri: str,
        job_output_s3_uri: str,
        kms_key_id: str,
        data_access_role_arn: str,
    ) -> dict[str, str]:
        """
        Starts a HealthLake import job.

        :param job_name: The import job name.
        :param datastore_id: The data store ID.
        :param input_s3_uri: The input S3 URI.
        :param job_output_s3_uri: The job output S3 URI.
        :param kms_key_id: The KMS key ID associated with the output S3 bucket.
        :param data_access_role_arn: The data access role ARN.
        :return: The import job.
        """
        try:
            response = self.health_lake_client.start_fhir_import_job(
                JobName=job_name,
                InputDataConfig={"S3Uri": input_s3_uri},
                JobOutputDataConfig={
                    "S3Configuration": {
                        "S3Uri": job_output_s3_uri,
                        "KmsKeyId": kms_key_id,
                    }
                },
                DataAccessRoleArn=data_access_role_arn,
                DatastoreId=datastore_id,
            )
            return response
        except ClientError as err:
            logger.exception(
                "Couldn't start import job. Here's why: %s",
                err.response["Error"]["Message"],
            )
            raise
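The request shape the wrapper sends can be checked offline. The following is a minimal sketch, assuming a hypothetical build_import_request helper (not part of the SDK) that assembles the same keyword arguments start_fhir_import_job expects and validates the S3 URIs up front:

```python
def build_import_request(
    job_name: str,
    datastore_id: str,
    input_s3_uri: str,
    job_output_s3_uri: str,
    kms_key_id: str,
    data_access_role_arn: str,
) -> dict:
    """Assemble keyword arguments for start_fhir_import_job (hypothetical helper).

    Rejects malformed S3 URIs locally instead of waiting for the
    service to return a validation error.
    """
    for uri in (input_s3_uri, job_output_s3_uri):
        if not uri.startswith("s3://"):
            raise ValueError(f"Not an S3 URI: {uri}")
    return {
        "JobName": job_name,
        "InputDataConfig": {"S3Uri": input_s3_uri},
        "JobOutputDataConfig": {
            "S3Configuration": {"S3Uri": job_output_s3_uri, "KmsKeyId": kms_key_id}
        },
        "DataAccessRoleArn": data_access_role_arn,
        "DatastoreId": datastore_id,
    }
```

The resulting dictionary can be passed directly as keyword arguments, for example `client.start_fhir_import_job(**request)`.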
1. Sign in to the Data stores page on the HealthLake Console.

2. Choose a data store.

3. Choose Import. The Import page opens.

4. Under the Input data section, enter the following information:

5. Under the Import output files section, enter the following information:

6. Under the Access permissions section, choose Use an existing IAM service role and select the role from the Service role name menu, or choose Create an IAM role.

7. Choose Import data.
   During import, choose Copy job ID on the banner at the top of the page. You can use the job ID to request import job properties using the AWS CLI. For more information, see Getting FHIR import job properties.
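The status check above can also be scripted with the SDK: DescribeFHIRImportJob returns the job properties, so a script can poll until the job reaches a terminal state. A sketch follows; `client` can be `boto3.client("healthlake")` or any stand-in object with the same method (which is how the sketch can be exercised without AWS credentials):

```python
import time

# Terminal job states for a HealthLake FHIR import job.
TERMINAL_STATES = {"COMPLETED", "COMPLETED_WITH_ERRORS", "FAILED"}


def wait_for_import_job(client, datastore_id, job_id, delay=10, max_attempts=60):
    """Poll DescribeFHIRImportJob until the job finishes or attempts run out.

    :param client: Any object with a describe_fhir_import_job method,
        such as boto3.client("healthlake").
    :return: The final ImportJobProperties dictionary.
    """
    for _ in range(max_attempts):
        props = client.describe_fhir_import_job(
            DatastoreId=datastore_id, JobId=job_id
        )["ImportJobProperties"]
        if props["JobStatus"] in TERMINAL_STATES:
            return props
        time.sleep(delay)
    raise TimeoutError(f"Import job {job_id} still running after {max_attempts} checks")
```

Intervals and attempt counts here are illustrative; pick values that suit the size of your import.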