Detecting Text

Amazon Textract provides synchronous and asynchronous operations that return only the text detected in a document. For both sets of operations, the following information is returned in multiple Block objects:

  • The lines and words of detected text

  • The relationships between the lines and words of detected text

  • The page that the detected text appears on

  • The location of the lines and words of text on the document page

For more information, see Lines and Words of Text.
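As a rough illustration of how these pieces fit together, the following Python sketch groups each LINE block with the WORD blocks it references through its CHILD relationships. It assumes `blocks` holds the `Blocks` list from a Textract response; the function and variable names are illustrative, not part of the API.

```python
def group_lines_and_words(blocks):
    """Map each LINE block to the text of the WORD blocks in its CHILD relationships."""
    # Index every block by Id so relationship Ids can be resolved.
    by_id = {block["Id"]: block for block in blocks}

    lines = []
    for block in blocks:
        if block["BlockType"] != "LINE":
            continue
        words = []
        for relationship in block.get("Relationships", []):
            if relationship["Type"] == "CHILD":
                words.extend(
                    by_id[child_id]["Text"]
                    for child_id in relationship["Ids"]
                    if by_id[child_id]["BlockType"] == "WORD"
                )
        lines.append(
            {
                "text": block["Text"],
                "words": words,
                # Page is present in multipage (asynchronous) results.
                "page": block.get("Page"),
                "bounding_box": block["Geometry"]["BoundingBox"],
            }
        )
    return lines
```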

To detect text synchronously, use the DetectDocumentText API operation, and pass a document file as input. The entire set of results is returned by the operation. For more information and an example, see Processing Documents Synchronously.
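A minimal synchronous sketch using the AWS SDK for Python (Boto3) follows; the Region and the local file name are placeholders for your own values.

```python
import boto3

# Placeholder Region and local file; adjust for your environment.
textract = boto3.client("textract", region_name="us-east-1")

with open("document.png", "rb") as document:
    response = textract.detect_document_text(Document={"Bytes": document.read()})

# Print each detected line of text with its confidence score.
for block in response["Blocks"]:
    if block["BlockType"] == "LINE":
        print(f'{block["Confidence"]:.1f}%  {block["Text"]}')
```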

Note

The Amazon Rekognition API operation DetectText is different from DetectDocumentText. You use DetectText to detect text in live scenes, such as posters or road signs.

To detect text asynchronously, use StartDocumentTextDetection to start processing an input document file. To get the results, call GetDocumentTextDetection. The results are returned in one or more responses from GetDocumentTextDetection. For more information and an example, see Processing Documents Asynchronously.
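The sketch below shows one way to drive the asynchronous operations with Boto3. The bucket and document names are placeholders, and it polls for completion as a simplification; in practice you would typically subscribe to the job's completion notification instead of polling.

```python
import time
import boto3

textract = boto3.client("textract", region_name="us-east-1")

# Start asynchronous text detection on a document stored in Amazon S3.
start_response = textract.start_document_text_detection(
    DocumentLocation={"S3Object": {"Bucket": "amzn-s3-demo-bucket", "Name": "document.pdf"}}
)
job_id = start_response["JobId"]

# Poll until the job finishes (simplification; a notification channel is
# the usual way to learn that the job has completed).
while True:
    result = textract.get_document_text_detection(JobId=job_id)
    if result["JobStatus"] in ("SUCCEEDED", "FAILED", "PARTIAL_SUCCESS"):
        break
    time.sleep(5)

# Page through all responses; each response carries a NextToken until done.
blocks = []
while True:
    blocks.extend(result.get("Blocks", []))
    next_token = result.get("NextToken")
    if not next_token:
        break
    result = textract.get_document_text_detection(JobId=job_id, NextToken=next_token)

print(f"Detected {sum(b['BlockType'] == 'LINE' for b in blocks)} lines of text.")
```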
