
Bulk importing resources - Amazon Lookout for Equipment

Amazon Lookout for Equipment is no longer open to new customers. Existing customers can continue to use the service as normal. For capabilities similar to Amazon Lookout for Equipment, see our blog post.

Bulk importing resources

You can import Amazon Lookout for Equipment resources (datasets and models) from a source AWS account to a target AWS account by using the ImportDataset (datasets) or ImportModelVersion (models) operations. If you need to import multiple resources, we recommend that you use the following scripts to bulk import resources.
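Before the scripts are described, it can help to see what a single import looks like as a direct API call. The following is a minimal sketch using the boto3 lookoutequipment client; the ARNs and names are placeholders, and you should check the ImportDataset and ImportModelVersion API references for the exact parameters that your import needs.

# Minimal sketch: import one dataset and its active model version with boto3.
# All ARNs and names below are placeholders.
import boto3

lookout = boto3.client("lookoutequipment", region_name="us-east-1")

# Import a dataset that the source account has shared with this target account.
dataset_response = lookout.import_dataset(
    SourceDatasetArn="arn:aws:lookoutequipment:us-east-1:111122223333:dataset/source-dataset/xxxxxxxx",
    DatasetName="imported-dataset",
)

# Import the active model version that was trained on that dataset.
model_response = lookout.import_model_version(
    SourceModelVersionArn="arn:aws:lookoutequipment:us-east-1:111122223333:model/source-model/xxxxxxxx/model-version/1",
    DatasetName="imported-dataset",
)

print(dataset_response)
print(model_response)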

  • Resource CSV file script — Scans the source AWS account to get a list of all datasets and their respective active model versions. It then writes the list to an editable CSV file. You run the script in the source AWS account. A sketch of this flow follows the list.

  • Resource configuration script — Reads the CSV file generated by the Resource CSV file script and configures the resource policy for the target AWS account. The resource policy grants the target AWS account permissions to import resources from the CSV file. You run this script in the source AWS account.

  • Bulk import script — Reads the CSV file that the Resource CSV file script generates, calls ImportDataset on all datasets, and calls ImportModelVersion on the respective model versions. You run this script in the target AWS account, after first running the Resource configuration script in the source AWS account.
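The following is a rough sketch of the flow that the Resource CSV file script follows, using the boto3 ListDatasets and ListModels calls. The column names, the ActiveModelVersionArn field access, and the file naming are assumptions made for illustration; use the CSV file that the AWS-provided script writes for the later steps.

# Rough sketch of the Resource CSV file script flow (run in the source account):
# list every dataset and the active model version of each associated model,
# then write the pairs to an editable CSV file.
import csv
from datetime import datetime

import boto3

lookout = boto3.client("lookoutequipment", region_name="us-east-1")
rows = []

kwargs = {}
while True:
    datasets = lookout.list_datasets(**kwargs)
    for dataset in datasets["DatasetSummaries"]:
        # DatasetNameBeginsWith is a prefix match, so keep exact matches only.
        models = lookout.list_models(DatasetNameBeginsWith=dataset["DatasetName"])
        for model in models["ModelSummaries"]:
            if model["DatasetName"] != dataset["DatasetName"]:
                continue
            rows.append(
                {
                    "dataset_arn": dataset["DatasetArn"],
                    # Field name assumed; verify against the ListModels response.
                    "active_model_version_arn": model.get("ActiveModelVersionArn", ""),
                }
            )
    if "NextToken" not in datasets:
        break
    kwargs = {"NextToken": datasets["NextToken"]}

file_name = f"import_input_file_{datetime.now():%Y%m%d%H%M%S}.csv"
with open(file_name, "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["dataset_arn", "active_model_version_arn"])
    writer.writeheader()
    writer.writerows(rows)
print(f"Wrote {len(rows)} rows to {file_name}")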

Running the bulk import scripts

Although you can run the scripts in any environment that supports Python and boto3, we recommend that you run them in an Amazon SageMaker AI notebook instance in JupyterLab. For more information, see https://jupyter.org/.

Creating the Amazon SageMaker AI notebook instances

Use the following procedure to create Amazon SageMaker AI notebook instances in the source AWS account and the target AWS account.

To create the Amazon SageMaker AI notebook instances
  1. In the AWS account that you want to import resources from (the source AWS account), open the Amazon SageMaker AI console and create a notebook instance. For more information, see JupyterLab versioning. Enter a name for the new notebook instance and use the default configuration.

  2. Make sure that the IAM role that you use has the following managed policy permissions:

  3. In the target AWS account that you want to bulk import resources into, repeat steps 1 and 2.

Getting the resources from the source AWS account

Use the following procedures to get an editable CSV file of the resources in a source AWS account and to configure those resources for import into a target AWS account.

To get the resources from the source AWS account
  1. In the source AWS account, open JupyterLab in the Amazon SageMaker AI notebook instance that you created in step 1 of Creating the Amazon SageMaker AI notebook instances.

  2. Copy each of the following scripts into separate cells within the notebook.

  3. Run the Resource CSV file script. The script prompts for the following:

    • The AWS Region in which you want to run the script.

    • The ID of the target AWS account to which you want to import the resources.

    The script generates a CSV file (import_input_file_{current_time}.csv) that you use in the next step. If necessary, you can make changes to the CSV file before continuing. For more information, see Resource CSV file script.

  4. Run the Resource configuration script. The script prompts for the following information.

    • The AWS Region in which you want to run the script.

    • Permission to update the existing policy, if the policy already exists for the source resource Amazon Resource Name (ARN).

    • The name and path of the CSV file (import_input_file_{current_time}.csv) that you created in step 3.

    For more information, see Resource configuration script.
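As an illustration of the kind of configuration this step performs, the following sketch attaches a resource-based policy to a single source resource with the boto3 PutResourcePolicy call. The policy actions, the principal format, and the placeholder ARN and account ID are assumptions; the AWS-provided Resource configuration script builds the exact policy that the import operations require and applies it to every resource in the CSV file.

# Rough sketch: grant the target account permission to import one source
# resource by attaching a resource-based policy (run in the source account).
import json

import boto3

lookout = boto3.client("lookoutequipment", region_name="us-east-1")

source_resource_arn = "arn:aws:lookoutequipment:us-east-1:111122223333:dataset/source-dataset/xxxxxxxx"  # placeholder
target_account_id = "444455556666"  # placeholder

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{target_account_id}:root"},
            # Actions assumed for illustration; the script grants what the
            # ImportDataset and ImportModelVersion operations need.
            "Action": [
                "lookoutequipment:ImportDataset",
                "lookoutequipment:ImportModelVersion",
            ],
            "Resource": source_resource_arn,
        }
    ],
}

response = lookout.put_resource_policy(
    ResourceArn=source_resource_arn,
    ResourcePolicy=json.dumps(policy),
)
print(response)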

Importing the resources to the target AWS account

Use the following procedure to import the resources to the target AWS account.

To import the resources into the target AWS account
  1. In the target AWS account, open JupyterLab in the Amazon SageMaker AI notebook instance that you created in step 3 of Creating the Amazon SageMaker AI notebook instances.

  2. Copy the Bulk import script into a notebook cell.

  3. Copy the import_input_file_{current_time}.csv file from the source AWS account to the target AWS account, placing it in the same JupyterLab location as the Bulk import script.

  4. Run the Bulk import script. The script prompts for the following:

    • The AWS Region in which you want to run the script.

    • The name and path of the CSV file (import_input_file_{current_time}.csv) that you copied in step 3.

  5. After the script finishes, check the import results in the CSV file (import_result_file_{current_time}.csv) that the script creates. For more information, see Bulk import script.
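As a rough sketch of the work the Bulk import script performs, the following reads a CSV file with the same illustrative columns as the earlier sketch, calls ImportDataset and ImportModelVersion for each row, and writes a result file. The column names, the rule for deriving the dataset name from the ARN, the status polling, and the result format are assumptions made for illustration.

# Rough sketch of the Bulk import script flow (run in the target account):
# import each dataset and model version listed in the CSV file and record
# the outcome of every row in a result file.
import csv
import time
from datetime import datetime

import boto3

lookout = boto3.client("lookoutequipment", region_name="us-east-1")
results = []

with open("import_input_file_20250101000000.csv", newline="") as f:  # use the copied file's name
    for row in csv.DictReader(f):
        # Hypothetical naming rule: reuse the dataset name embedded in the ARN
        # (arn:aws:lookoutequipment:region:account:dataset/name/uuid).
        dataset_name = row["dataset_arn"].split("/")[-2]
        try:
            lookout.import_dataset(
                SourceDatasetArn=row["dataset_arn"],
                DatasetName=dataset_name,
            )
            # ImportDataset is asynchronous; wait for the imported dataset to
            # become ACTIVE before importing the model version.
            for _ in range(30):
                if lookout.describe_dataset(DatasetName=dataset_name)["Status"] == "ACTIVE":
                    break
                time.sleep(10)
            if row["active_model_version_arn"]:
                lookout.import_model_version(
                    SourceModelVersionArn=row["active_model_version_arn"],
                    DatasetName=dataset_name,
                )
            results.append({**row, "result": "SUCCESS"})
        except Exception as error:  # record the failure and continue with the next row
            results.append({**row, "result": f"FAILED: {error}"})

result_file = f"import_result_file_{datetime.now():%Y%m%d%H%M%S}.csv"
with open(result_file, "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["dataset_arn", "active_model_version_arn", "result"]
    )
    writer.writeheader()
    writer.writerows(results)
print(f"Wrote {len(results)} results to {result_file}")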
