Create your first Hybrid Job
This section shows you how to create a Hybrid Job using a Python script. Alternatively, to create a hybrid job from Python code running locally, such as in your preferred integrated development environment (IDE) or a Braket notebook, see Running your local code as a hybrid job.
Set permissions
Before you run your first hybrid job, you must ensure that you have sufficient permissions to proceed with this task. To verify your permissions, select Permissions from the menu on the left side of the Braket console. The Permissions management for Amazon Braket page helps you verify whether one of your existing roles has permissions that are sufficient to run your hybrid job, or it guides you through the creation of a default role with such permissions if you do not already have one.
To verify that you have roles with sufficient permissions to run a hybrid job, select the Verify existing role button. If such roles exist, you get a message that the roles were found. To see the names of the roles and their role ARNs, select the Show roles button.
If you do not have a role with sufficient permissions to run a hybrid job, you get a message that no such role was found. Select the Create default role button to obtain a role with sufficient permissions.
If the role was created successfully, you get a message confirming this.
If you do not have permissions to make this inquiry, you will be denied access. In this case, contact your internal AWS administrator.
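If you prefer to check from code rather than the console, you can make a similar inquiry with the AWS SDK for Python (boto3). The following is a minimal sketch, not part of the Braket SDK; it assumes the console-created default role follows the AmazonBraketJobsExecutionRole naming convention and that your credentials allow iam:ListRoles.

import boto3

# Minimal sketch: look for a Braket jobs execution role among your IAM
# roles. Assumes the default role created by the console has a name
# starting with "AmazonBraketJobsExecutionRole"; adjust for custom roles.
iam = boto3.client("iam")

braket_roles = [
    role["Arn"]
    for page in iam.get_paginator("list_roles").paginate()
    for role in page["Roles"]
    if role["RoleName"].startswith("AmazonBraketJobsExecutionRole")
]
print(braket_roles or "No Braket jobs execution role found")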
Create and run
Once you have a role with permissions to run a hybrid job, you are ready to proceed. The key piece of your first Braket hybrid job is the algorithm script. It defines the algorithm you want to run and contains the classical logic and quantum tasks that are part of your algorithm. In addition to your algorithm script, you can provide other dependency files. The algorithm script together with its dependencies is called the source module. The entry point defines the first file or function to run in your source module when the hybrid job starts.
First, consider the following basic example of an algorithm script that creates five Bell states and prints the corresponding measurement results.
import os

from braket.aws import AwsDevice
from braket.circuits import Circuit


def start_here():
    print("Test job started!")

    # Use the device declared in the job script
    device = AwsDevice(os.environ["AMZN_BRAKET_DEVICE_ARN"])

    bell = Circuit().h(0).cnot(0, 1)
    for count in range(5):
        task = device.run(bell, shots=100)
        print(task.result().measurement_counts)

    print("Test job completed!")
Save this file with the name algorithm_script.py in your current working directory on your Braket notebook or local environment. The algorithm_script.py file has start_here() as the planned entry point.
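Printing is the simplest way to surface output, but an algorithm script can also persist structured results with save_job_result from the Braket SDK so that they can be retrieved after the hybrid job completes. The following is a sketch of a variant of the script above; the aggregation into a single Counter is only illustrative.

import os
from collections import Counter

from braket.aws import AwsDevice
from braket.circuits import Circuit
from braket.jobs import save_job_result


def start_here():
    # Same Bell-state loop as above, but aggregate the measurement
    # counts across the five quantum tasks and save them as the job
    # result instead of only printing them.
    device = AwsDevice(os.environ["AMZN_BRAKET_DEVICE_ARN"])
    bell = Circuit().h(0).cnot(0, 1)

    totals = Counter()
    for _ in range(5):
        task = device.run(bell, shots=100)
        totals.update(task.result().measurement_counts)

    save_job_result({"measurement_counts": dict(totals)})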
Next, create a Python file or Python notebook in the same directory as the algorithm_script.py file. This script kicks off the hybrid job and handles any asynchronous processing, such as printing the status or key outcomes that we are interested in. At a minimum, this script needs to specify your hybrid job script and your primary device.
Note
For more information about how to create a Braket notebook or upload a file, such as the algorithm_script.py file, in the same directory as the notebooks, see Run your first circuit using the Amazon Braket Python SDK.
For this basic first case, you target a simulator. Whichever type of quantum device you target, whether a simulator or an actual quantum processing unit (QPU), the device you specify with device in the following script is used to schedule the hybrid job and is available to the algorithm script as the environment variable AMZN_BRAKET_DEVICE_ARN.
Note
You can only use devices that are available in the AWS Region of your hybrid job; the Amazon Braket SDK selects this AWS Region automatically. For example, a hybrid job in us-east-1 can use IonQ, SV1, DM1, and TN1 devices, but not Rigetti devices.
If you choose a quantum computer instead of a simulator, Braket schedules your hybrid jobs to run all of their quantum tasks with priority access.
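If you are unsure which devices your current AWS Region offers, you can list them with the SDK before creating the job. A minimal sketch:

from braket.aws import AwsDevice

# List the devices that are ONLINE and visible from the AWS Region of
# your current session; a hybrid job in this Region can only target
# devices available here.
for device in AwsDevice.get_devices(statuses=["ONLINE"]):
    print(device.name, device.arn)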
from braket.aws import AwsQuantumJob
from braket.devices import Devices

job = AwsQuantumJob.create(
    Devices.Amazon.SV1,
    source_module="algorithm_script.py",
    entry_point="algorithm_script:start_here",
    wait_until_complete=True,
)
The parameter wait_until_complete=True sets a verbose mode so that your job prints output from the actual job as it's running. You should see output similar to the following example.
job = AwsQuantumJob.create(
    Devices.Amazon.SV1,
    source_module="algorithm_script.py",
    entry_point="algorithm_script:start_here",
    wait_until_complete=True,
)

Initializing Braket Job: arn:aws:braket:us-west-2:<accountid>:job/<UUID>
.........................................
...
Completed 36.1 KiB/36.1 KiB (692.1 KiB/s) with 1 file(s) remaining#015download: s3://braket-external-assets-preview-us-west-2/HybridJobsAccess/models/braket-2019-09-01.normal.json to ../../braket/additional_lib/original/braket-2019-09-01.normal.json
Running Code As Process
Test job started!!!!!
Counter({'00': 55, '11': 45})
Counter({'11': 59, '00': 41})
Counter({'00': 55, '11': 45})
Counter({'00': 58, '11': 42})
Counter({'00': 55, '11': 45})
Test job completed!!!!!
Code Run Finished
2021-09-17 21:48:05,544 sagemaker-training-toolkit INFO Reporting training SUCCESS
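If you would rather not block your notebook or terminal while the job runs, you can leave wait_until_complete at its default of False and poll the hybrid job yourself. A minimal sketch, assuming the same algorithm_script.py:

import time

from braket.aws import AwsQuantumJob
from braket.devices import Devices

# Create the hybrid job without blocking, then poll until it reaches a
# terminal state.
job = AwsQuantumJob.create(
    Devices.Amazon.SV1,
    source_module="algorithm_script.py",
    entry_point="algorithm_script:start_here",
)

while job.state() not in ("COMPLETED", "FAILED", "CANCELLED"):
    print("Job state:", job.state())
    time.sleep(30)
print("Final state:", job.state())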
Note
You can also use your custom-made module with AwsQuantumJob.create by passing its location as the source_module argument, as shown in the sketch below.
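For example, if your code is organized as a package directory rather than a single file, you might point source_module at the directory and qualify the entry point accordingly. The directory and file names below are hypothetical:

from braket.aws import AwsQuantumJob
from braket.devices import Devices

# Hypothetical layout: my_module/algorithm.py defines start_here().
job = AwsQuantumJob.create(
    Devices.Amazon.SV1,
    source_module="my_module",
    entry_point="my_module.algorithm:start_here",
    wait_until_complete=True,
)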
Monitor results
You can access the log output from Amazon CloudWatch. To do this, go to the Log groups tab on the left menu of the job detail page, select the log group aws/braket/jobs, and then choose the log stream that contains the job name. In the example above, this is braket-job-default-1631915042705/algo-1-1631915190.
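You can also stream the same log output from the SDK instead of opening the CloudWatch console. A minimal sketch; replace the ARN with the one printed when your job was created:

from braket.aws import AwsQuantumJob

# Rehydrate a handle to an existing hybrid job from its ARN, then print
# its CloudWatch logs. With wait=True, this follows the log stream
# until the job reaches a terminal state.
job = AwsQuantumJob(arn="arn:aws:braket:us-west-2:<accountid>:job/<UUID>")
job.logs(wait=True)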
You can also view the status of the hybrid job in the console by selecting the Hybrid Jobs page and then choosing Settings.
Your hybrid job produces some artifacts in Amazon S3 while it runs. The default S3 bucket name is amazon-braket-<region>-<accountid> and the content is in the jobs/<jobname>/<timestamp> directory. You can configure the S3 locations where these artifacts are stored by specifying a different code_location when the hybrid job is created with the Braket Python SDK.
Note
This S3 bucket must be located in the same AWS Region as your job script.
The jobs/<jobname>/<timestamp> directory contains a subfolder with the output from the entry point script in a model.tar.gz file. There is also a directory called script that contains your algorithm script artifacts in a source.tar.gz file. The results from your actual quantum tasks are in the directory named jobs/<jobname>/tasks.
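You can also retrieve these artifacts through the SDK rather than browsing S3. A minimal sketch, reusing a job handle rehydrated from its ARN as above; download_result fetches and extracts the model.tar.gz output to a local directory:

from braket.aws import AwsQuantumJob

# Rehydrate the job from its ARN, print any results saved with
# save_job_result, and download the model.tar.gz artifacts locally.
job = AwsQuantumJob(arn="arn:aws:braket:us-west-2:<accountid>:job/<UUID>")
print(job.result())
job.download_result(extract_to="./job_output")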