Architecture details
This section describes the components and AWS services that make up this solution, and how these components work together.
AWS services in this solution
The following AWS services are included in this solution:
AWS service | Description |
---|---|
Amazon API Gateway | Core. Used for internal API management. |
AWS CloudFormation | Core. Used to deploy the solution. |
Amazon CloudWatch | Core. Used for monitoring and logs. |
Amazon Cognito | Core. Used for user management. |
AWS Identity and Access Management | Core. Used for user role and permissions management. |
AWS Key Management Service | Core. Used for encryption. |
AWS Lambda | Core. Provides logic for chatbot interactions and extension capabilities for Amazon Translate before and after interaction with Amazon Lex. |
Amazon Lex | Core. Provides the advanced deep learning functionalities of ASR for converting speech to text, and NLU to recognize the intent of the text. |
Amazon OpenSearch Service | Core. Provides a question bank index that LLMs search to generate responses. |
Amazon SNS | Core. Used for notifications, such as feedback. |
Amazon Data Firehose | Supporting. Delivers logs and metrics data to an Amazon S3 bucket. |
Amazon Polly | Supporting. Used for Interactive Voice Response (IVR) systems. Provides text-to-speech capabilities to relay the response in the voice of your choice. |
Amazon S3 | Supporting. Provides object storage for content designer UI data and for logs and metrics data. |
AWS Systems Manager Parameter Store | Supporting. Provides secure, hierarchical storage for configuration data management and secrets management. |
Amazon Translate | Supporting. Provides multi-language support for your customers' bot interactions. You can maintain question and answer banks in a single language while still supporting customers who interact with the bot in other languages. |
Amazon Bedrock | Optional. Provides a host for LLMs. |
Amazon Connect | Optional. Provides an omnichannel cloud contact center. If you implement this component, you can create personalized experiences for your customers. For example, you can dynamically offer chat and voice contact based on factors such as customer preference and estimated wait times. Agents handle all customers from a single interface; for example, they can chat with customers and create or respond to tasks as they are routed to them. |
Amazon Kendra | Optional. Hosts unstructured datasets in an index. You can also use Amazon Kendra FAQs to provide semantic search capabilities for your question bank. |
Amazon SageMaker AI | Optional. Hosts an inference endpoint used to generate text embeddings for your queries. These text embeddings let QnABot match on the meaning, or "semantic similarity", of two different queries based on the similarity of their embedding vectors, instead of on keywords alone (a sketch follows this table). |
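To illustrate the embeddings-based matching described in the table, the following is a minimal sketch of calling a SageMaker AI inference endpoint to embed two queries and comparing them by cosine similarity. The endpoint name and the request/response payload shapes are hypothetical and depend on the embeddings model you deploy; QnABot's actual integration may differ.

```python
import json
import boto3
import numpy as np

sagemaker_runtime = boto3.client("sagemaker-runtime")

# Hypothetical endpoint name; substitute the embeddings endpoint you deploy.
ENDPOINT_NAME = "qnabot-embeddings-endpoint"


def embed(text: str) -> np.ndarray:
    """Return the embedding vector for a piece of text.

    The JSON request/response shape shown here is an assumption; it varies
    by embeddings model, so adapt it to the model behind your endpoint.
    """
    response = sagemaker_runtime.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=json.dumps({"inputs": text}),
    )
    return np.array(json.loads(response["Body"].read())["embedding"])


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


# Two differently worded questions with the same meaning should score high.
q1 = embed("How do I reset my password?")
q2 = embed("I forgot my password, what should I do?")
print(cosine_similarity(q1, q2))
```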
Amazon Lex web client
Amazon Lex allows conversational interfaces to be integrated into applications like the Amazon Lex web client. An Amazon Lex chatbot uses intents to encapsulate the purpose of an interaction, and slots to capture elements of information from the interaction. Because QnABot on AWS has a single purpose, answering a user's question, it defines just one intent. This intent has a single slot that is trained to capture the text of the question. QnABot on AWS also uses AMAZON.FallbackIntent to ensure that all user input is processed. To learn more about how Amazon Lex bots work, and to understand the concepts of intents, slots, sample values, and fulfillment functions, see the Amazon Lex Developer Guide.
The QnABot on AWS Amazon Lex web client is deployed to an Amazon S3 bucket in your account, and accessed via Amazon API Gateway.
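For context, a client application sends user input to an Amazon Lex bot through the Lex runtime API, and the web client performs the equivalent exchange from the browser. The sketch below shows the shape of that call against a Lex V2 bot using the AWS SDK; the bot ID, alias ID, locale, and session values are placeholders, not the solution's actual configuration.

```python
import boto3

lex_runtime = boto3.client("lexv2-runtime")

# Placeholder identifiers; use the values of the bot deployed by the solution.
BOT_ID = "EXAMPLEBOT1"
BOT_ALIAS_ID = "EXAMPLEALIAS"
LOCALE_ID = "en_US"


def ask(question: str, session_id: str = "demo-session") -> str:
    """Send a user's question to the bot and return the first text message."""
    response = lex_runtime.recognize_text(
        botId=BOT_ID,
        botAliasId=BOT_ALIAS_ID,
        localeId=LOCALE_ID,
        sessionId=session_id,
        text=question,
    )
    messages = response.get("messages", [])
    return messages[0]["content"] if messages else ""


print(ask("How can I include pictures in answers?"))
```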
Amazon Alexa devices
Amazon Alexa devices interact with QnABot on AWS using an Alexa skill. Like an Amazon Lex chatbot, an Alexa skill also uses intents to encapsulate the purpose of an interaction, and slots to capture elements of information from the interaction.
The Alexa QnABot on AWS skill uses the same Bot fulfillment Lambda function as the Amazon Lex chatbot. When you ask a question, for example, "Alexa, ask Q and A, How can I include pictures in Q and A Bot answers?", your Alexa device interacts with the skill you created, which in turn invokes the Bot fulfillment Lambda function in your AWS account, passing the transcribed question as a parameter.
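As a rough sketch of that hand-off, an Alexa custom skill delivers the transcribed question to a fulfillment Lambda function as a slot value in the skill request. The handler below is illustrative only; the slot name (`question`) and the placeholder answer are assumptions, not the solution's actual Bot fulfillment code.

```python
def handler(event, context):
    """Illustrative Alexa skill handler: pull the transcribed question out of
    the incoming request and return a (placeholder) answer."""
    request = event.get("request", {})

    if request.get("type") == "IntentRequest":
        slots = request.get("intent", {}).get("slots", {})
        # "question" is a hypothetical slot name for the transcribed utterance.
        question = slots.get("question", {}).get("value", "")
        answer = f"You asked: {question}"  # placeholder for the real lookup
    else:
        answer = "Ask me a question."

    # Minimal Alexa skill response envelope.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": answer},
            "shouldEndSession": False,
        },
    }
```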
Content designer UI
The QnABot on AWS content designer UI, like the Amazon Lex web client, is deployed to an Amazon S3 bucket and accessed via Amazon API Gateway; it also retrieves its configuration from an API Gateway endpoint. The content designer UI requires users to sign in with credentials defined in an Amazon Cognito user pool.
Using temporary AWS credentials from Amazon Cognito, the content designer UI interacts with secure API Gateway endpoints backed by the content designer Lambda functions. All interactions with Amazon OpenSearch Service and Amazon Lex are handled by these Lambda functions.
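The following is a minimal sketch of that credential exchange using the AWS SDK, assuming the user pool is federated through an Amazon Cognito identity pool; the identifiers are placeholders, and the real content designer UI performs this exchange in the browser with the JavaScript SDK.

```python
import boto3

REGION = "us-east-1"  # placeholder Region
# Placeholder identifiers; use the user pool and identity pool the stack creates.
IDENTITY_POOL_ID = "us-east-1:00000000-0000-0000-0000-000000000000"
USER_POOL_PROVIDER = "cognito-idp.us-east-1.amazonaws.com/us-east-1_EXAMPLE"
ID_TOKEN = "<JWT ID token returned when the user signs in to the user pool>"

cognito_identity = boto3.client("cognito-identity", region_name=REGION)

# Exchange the user pool token for a Cognito identity ID...
identity = cognito_identity.get_id(
    IdentityPoolId=IDENTITY_POOL_ID,
    Logins={USER_POOL_PROVIDER: ID_TOKEN},
)

# ...then for temporary AWS credentials scoped by the identity pool's IAM role.
credentials = cognito_identity.get_credentials_for_identity(
    IdentityId=identity["IdentityId"],
    Logins={USER_POOL_PROVIDER: ID_TOKEN},
)["Credentials"]

# These short-lived credentials can then sign requests to the secure
# API Gateway endpoints backed by the content designer Lambda functions.
print(credentials["AccessKeyId"], credentials["Expiration"])
```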