Architecture details
This section describes the components and AWS services that make up this solution and how these components work together.
AWS services in this solution
The following AWS services are included in this solution:
| AWS service | Description |
| --- | --- |
| Amazon API Gateway | Core. Used for internal API management. |
| AWS CloudFormation | Core. Used to deploy the solution. |
| Amazon CloudWatch | Core. Used for monitoring and logs. |
| Amazon Cognito | Core. Used for user management. |
| AWS Identity and Access Management (IAM) | Core. Used for user role and permissions management. |
| AWS Key Management Service (AWS KMS) | Core. Used for encryption. |
| AWS Lambda | Core. Provides the logic for chatbot interactions and the extension points for Amazon Translate before and after interaction with Amazon Lex. |
| Amazon Lex | Core. Provides the advanced deep learning functionalities of automatic speech recognition (ASR) for converting speech to text, and natural language understanding (NLU) to recognize the intent of the text. |
| Amazon OpenSearch Service | Core. Provides the question bank, metrics, and feedback indices, and provides OpenSearch Dashboards for chatbot usage. |
| Amazon Simple Notification Service (Amazon SNS) | Core. Used for notifications, such as feedback. |
| Amazon Kinesis Data Firehose | Supporting. Delivers logs and metrics data to an Amazon S3 bucket. |
| Amazon Polly | Supporting. Used for Interactive Voice Response (IVR) systems. Provides text-to-speech capabilities to relay the response back in the voice of choice. |
| Amazon Simple Storage Service (Amazon S3) | Supporting. Provides object storage for content designer UI data and for logs and metrics data. |
| AWS Systems Manager Parameter Store | Supporting. Provides secure, hierarchical storage for configuration data management and secrets management. |
| Amazon Translate | Supporting. Provides multi-language support for your customers' bot interactions. You can maintain question and answer banks in a single language while still supporting customers who interact with the bot in other languages (see the sketch following this table). |
| Amazon Bedrock | Optional. Used for embedding models, large language models (LLMs), knowledge bases, and guardrails. |
| Amazon Connect | Optional. Provides an omnichannel cloud contact center. If you implement this component, you can create personalized experiences for your customers. For example, you can dynamically offer chat and voice contact based on factors such as customer preference and estimated wait times. Agents, meanwhile, handle all customers from a single interface. For example, they can chat with customers, and create or respond to tasks as they are routed to them. |
| Amazon Kendra | Optional. Hosts unstructured datasets in an index. You can also use Amazon Kendra FAQs to provide semantic search capabilities for your question bank. |
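The Lambda and Amazon Translate rows above describe a pre- and post-processing flow: an incoming question can be translated into the language of the question bank before it reaches Amazon Lex, and the answer can be translated back afterward. The following Python (boto3) sketch illustrates only the pre-translation step; it is not the solution's own code, and the `bank_lang` default and function name are placeholder assumptions.

```python
import boto3

translate = boto3.client("translate")

def to_bank_language(user_text, bank_lang="en"):
    """Translate an incoming question into the language the question bank is kept in."""
    result = translate.translate_text(
        Text=user_text,
        SourceLanguageCode="auto",   # let Amazon Translate detect the user's language
        TargetLanguageCode=bank_lang,
    )
    # Returning the detected language lets the caller translate the answer back later.
    return result["TranslatedText"], result["SourceLanguageCode"]
```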
Amazon Lex web client
Amazon Lex allows conversational interfaces to be integrated into applications such as the Amazon Lex web client. An Amazon Lex chatbot uses intents to encapsulate the purpose of an interaction, and slots to capture elements of information from the interaction. Because QnABot on AWS has a single purpose, to answer a user's question, it defines just one intent. This intent has a single slot, which is trained to capture the text of the question. QnABot on AWS also uses AMAZON.FallbackIntent to ensure that all user input is processed. To learn more about how Amazon Lex bots work, and to understand the concepts of intents, slots, sample values, and fulfillment functions, see the Amazon Lex Developer Guide.
The QnABot on AWS Amazon Lex web client is deployed to an Amazon S3 bucket in your account, and accessed via Amazon API Gateway.
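To make the single-intent design concrete, the following Python (boto3) sketch sends one utterance to an Amazon Lex V2 bot and reports which intent handled it (the question intent or AMAZON.FallbackIntent). This is an illustrative client-side call, not part of the solution's code; the bot ID and alias ID are placeholders you would replace with values from your deployment.

```python
import boto3

lex = boto3.client("lexv2-runtime")

def ask_question(text, session_id,
                 bot_id="PLACEHOLDER_BOT_ID",         # hypothetical: your deployed bot's ID
                 bot_alias_id="PLACEHOLDER_ALIAS_ID"):
    """Send one utterance to the bot and report which intent handled it."""
    response = lex.recognize_text(
        botId=bot_id,
        botAliasId=bot_alias_id,
        localeId="en_US",
        sessionId=session_id,
        text=text,
    )
    # The session state shows whether the question intent or
    # AMAZON.FallbackIntent matched this utterance.
    intent_name = response["sessionState"]["intent"]["name"]
    answer = " ".join(m["content"] for m in response.get("messages", []))
    return intent_name, answer
```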
Amazon Alexa devices
Amazon Alexa devices interact with QnABot on AWS using an Alexa skill. Like an Amazon Lex chatbot, an Alexa skill also uses intents to encapsulate the purpose of an interaction, and slots to capture elements of information from the interaction.
The Alexa QnABot on AWS skill uses the same Bot fulfillment Lambda function as the Amazon Lex chatbot. When you ask a question, for example, "Alexa, ask Q and A, How can I include pictures in Q and A Bot answers?", your Alexa device interacts with the skill you created, which in turn invokes the Bot fulfillment Lambda function in your AWS account, passing the transcribed question as a parameter.
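The sketch below shows, in simplified form, how an Alexa IntentRequest carries the transcribed question to a Lambda handler as a slot value. It is a minimal illustration rather than the solution's fulfillment function, and the slot name "QnA_slot" is hypothetical; the skill's interaction model defines the real one.

```python
def lambda_handler(event, context):
    """Minimal sketch of an Alexa skill handler that reads the transcribed question."""
    request = event.get("request", {})
    if request.get("type") == "IntentRequest":
        slots = request.get("intent", {}).get("slots", {})
        # "QnA_slot" is a hypothetical slot name used only for illustration.
        question = slots.get("QnA_slot", {}).get("value", "")
        answer = f"You asked: {question}"   # a real handler would look up the answer here
    else:
        answer = "Ask me a question."

    # Minimal Alexa skill response envelope.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": answer},
            "shouldEndSession": False,
        },
    }
```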
Content designer UI
The QnABot on AWS content designer UI, like the Amazon Lex web client, is also deployed to an Amazon S3 bucket and accessed via Amazon API Gateway, and it too retrieves its configuration from an API Gateway endpoint. The content designer UI requires users to sign in with credentials defined in an Amazon Cognito user pool.
Using temporary AWS credentials from Cognito, the content designer UI interacts with secure API Gateway endpoints backed by the content designer Lambda functions. All interactions with Amazon OpenSearch Service and Amazon Lex are handled by these Lambda functions.
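The following Python (boto3) sketch illustrates that credential flow: sign in against a Cognito user pool, exchange the ID token for temporary AWS credentials through a Cognito identity pool, and use those credentials to sign a request to an IAM-protected API Gateway endpoint with SigV4. It is illustrative only; the Region, pool IDs, app client ID, endpoint URL, and the USER_PASSWORD_AUTH sign-in flow are placeholder assumptions, not values or requirements defined by this solution.

```python
from urllib.parse import urlparse

import boto3
import requests
from botocore.auth import SigV4Auth
from botocore.awsrequest import AWSRequest
from botocore.credentials import Credentials

REGION = "us-east-1"                     # assumption: the Region the solution is deployed in
USER_POOL_ID = "us-east-1_EXAMPLE"       # hypothetical Cognito user pool ID
APP_CLIENT_ID = "exampleclientid"        # hypothetical user pool app client ID
IDENTITY_POOL_ID = "us-east-1:example"   # hypothetical Cognito identity pool ID
API_URL = "https://example.execute-api.us-east-1.amazonaws.com/prod/examples"  # hypothetical endpoint

# 1. Sign in against the user pool to obtain an ID token.
idp = boto3.client("cognito-idp", region_name=REGION)
auth = idp.initiate_auth(
    ClientId=APP_CLIENT_ID,
    AuthFlow="USER_PASSWORD_AUTH",
    AuthParameters={"USERNAME": "designer", "PASSWORD": "example-password"},
)
id_token = auth["AuthenticationResult"]["IdToken"]

# 2. Exchange the ID token for temporary AWS credentials via the identity pool.
identity = boto3.client("cognito-identity", region_name=REGION)
logins = {f"cognito-idp.{REGION}.amazonaws.com/{USER_POOL_ID}": id_token}
identity_id = identity.get_id(IdentityPoolId=IDENTITY_POOL_ID, Logins=logins)["IdentityId"]
creds = identity.get_credentials_for_identity(IdentityId=identity_id, Logins=logins)["Credentials"]

# 3. Sign a request to the IAM-protected API Gateway endpoint with SigV4 and send it.
credentials = Credentials(creds["AccessKeyId"], creds["SecretKey"], creds["SessionToken"])
host = urlparse(API_URL).netloc
request = AWSRequest(method="GET", url=API_URL, headers={"Host": host})
SigV4Auth(credentials, "execute-api", REGION).add_auth(request)
response = requests.get(API_URL, headers=dict(request.headers))
print(response.status_code)
```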