
Durabuilt Windows & Doors | TensorIoT

Deveshi Dabbawala

December 9, 2024

Business Problem

Durabuilt Windows & Doors is a leading manufacturer in Western Canada that is recognized  for its commitment to quality and innovation. Founded on family entrepreneurship, the company has consistently prioritized growth and improvement, earning a place on Canada's Best Managed Companies list since 2012. Durabuilt's dedication to service and leadership in the market sets it apart as an industry pioneer.

  • Customer Support Challenges: Existing support staff struggle to provide timely and personalized responses to diverse customer inquiries.
  • Efficiency Issues: The current system lacks scalability, leading to longer response times and decreased customer satisfaction.
  • Need for Innovation: The company seeks to leverage Generative AI to enhance customer interactions and operational efficiency.

Solution

GoML supported TensorIoT in developing a Generative AI chatbot proof of concept, providing expertise in Large Language Models (LLMs) and Amazon Bedrock integration to enhance customer support through fast, personalized responses. This collaboration helped TensorIoT validate scalable, efficient GenAI solutions tailored to Durabuilt's unique requirements.

Initialization Phase:
LLM Model: Pre-trained on Amazon Bedrock, generates responses.
Embedding Model: Converts text to embeddings for semantic search.
OpenSearch Vector Index: Stores and indexes embeddings for quick retrieval.
Amazon S3: Loads data for indexing.
Metadata Update & Index Check: Uses OpenSearch to keep data current.
Retriever Initialization: Prepares retrieval component for relevant info, powered by OpenSearch.
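
The initialization steps above can be sketched as follows. This is a minimal local sketch: the `embed()` stub stands in for a Titan embedding call on Amazon Bedrock, and the list of dicts stands in for the OpenSearch vector index; chunk size, vector dimension, and the metadata schema are illustrative assumptions, not the production values.

```python
import hashlib
import math

def embed(text: str, dim: int = 8) -> list[float]:
    """Deterministic toy embedding; the real system calls the Titan
    embedding model on Amazon Bedrock instead."""
    digest = hashlib.sha256(text.encode()).digest()
    vec = [b / 255.0 for b in digest[:dim]]
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def chunk(text: str, size: int = 200) -> list[str]:
    """Split a document (e.g. warranty text loaded from S3) into
    fixed-size chunks before embedding."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def build_index(doc_id: str, text: str) -> list[dict]:
    """Embed each chunk and attach metadata, mirroring the
    S3 -> embedding model -> OpenSearch index flow described above."""
    return [
        {"doc_id": doc_id, "chunk": c, "vector": embed(c)}
        for c in chunk(text)
    ]
```

In production the `build_index` output would be written to an OpenSearch k-NN index rather than held in memory, with the metadata-update step refreshing entries when source documents change.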

Response Generation & Formatting Phase:
Context Retriever: Fetches context from OpenSearch based on the query classification.
Response Generation: Uses the LLM on Amazon Bedrock to create responses.
Response Formatting: Structures responses via Python for agent readability.
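
A minimal sketch of the retrieve-then-generate step. It assumes each indexed chunk is stored as a dict with `chunk` and `vector` keys (an illustrative schema), and the prompt template is a placeholder for whatever the LLM on Bedrock actually receives.

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a)) or 1.0
    nb = math.sqrt(sum(y * y for y in b)) or 1.0
    return dot / (na * nb)

def retrieve(query_vec: list[float], index: list[dict], k: int = 2) -> list[str]:
    """Return the k chunks closest to the query embedding, as the
    OpenSearch-backed retriever would."""
    ranked = sorted(index, key=lambda d: cosine(query_vec, d["vector"]),
                    reverse=True)
    return [d["chunk"] for d in ranked[:k]]

def build_prompt(question: str, context_chunks: list[str]) -> str:
    """Structure the retrieved context and the agent's question before
    sending both to the LLM for response generation."""
    context = "\n".join(f"- {c}" for c in context_chunks)
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
```

The response-formatting step then takes the model's raw completion and restructures it (headings, bullet points) so support agents can scan it quickly.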

Query Processing Phase:
Support Agent Query: Captures agent input for classification.
Query Classification: Categorizes queries (General, Date-Based, Out of Context) with LLM and Python.
OpenSearch Index: Enables rapid, contextually relevant retrieval.
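
The classification contract can be illustrated with a rule-based stand-in. The real system classifies with an LLM on Bedrock; the keyword list and date pattern here are invented for demonstration and only show the three-way routing (General, Date-Based, Out of Context) the rest of the pipeline depends on.

```python
import re

# Illustrative signal for date-sensitive queries; the production
# classifier is an LLM prompt, not a regex.
DATE_PATTERN = re.compile(r"\b(when|date|since|until|expire|expires|\d{4})\b",
                          re.IGNORECASE)

def classify(query: str,
             known_topics: tuple = ("warranty", "window", "door", "policy")) -> str:
    """Route a support-agent query into one of the three categories."""
    if not any(topic in query.lower() for topic in known_topics):
        return "Out of Context"
    if DATE_PATTERN.search(query):
        return "Date-Based"
    return "General"
```

Date-Based queries are the ones that need the purchase-date and policy-version logic described below, while Out of Context queries can be declined without a retrieval round-trip.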

Routing Logic for Policy Management: To manage multiple policy versions with varying term changes, the following routing logic is implemented:
Document Routing: Routes customer inquiries to the appropriate document based on policy terms and products.
Document Comparison: If no specific document is found, general documents are utilized to highlight major and minor changes in policy terms.
Service/Warranty Estimation: Provides customers with an estimated leftover service/warranty period, including associated charges and terms, considering the product/service's purchase date and any percentage changes (e.g., 5%, 10%).
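
The estimation step above reduces to simple date arithmetic plus a percentage adjustment. This sketch assumes an illustrative term length and fee schedule; Durabuilt's actual policy terms and charge percentages are not reproduced here.

```python
from datetime import date

def remaining_warranty(purchase: date, term_years: int, today: date) -> float:
    """Years of warranty left given the purchase date, floored at zero."""
    elapsed_years = (today - purchase).days / 365.25
    return max(term_years - elapsed_years, 0.0)

def service_charge(base_fee: float, pct_change: float) -> float:
    """Apply a percentage change in terms (e.g. 5% or 10%) to a base
    service fee. base_fee is a hypothetical input, not a real rate."""
    return round(base_fee * (1 + pct_change / 100), 2)
```

Given a classified Date-Based query, the chatbot can combine these two values into a single answer: the leftover coverage period and the adjusted charge under the applicable policy version.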

Architecture

  • AWS S3 Bucket: The warranty text data (e.g., Website Warranty Text.txt) is stored in Amazon S3.
  • Amazon Bedrock Titan Embedding Model: This embedding model is used to generate embeddings (vector representations) of the warranty text data stored in S3.
  • Amazon OpenSearch Vector Database: The vectorized data (embeddings) from the Bedrock Titan model is stored in the OpenSearch vector database. This allows for efficient vector search and retrieval capabilities.
  • Amazon SageMaker Jupyter Notebook: Acts as the central interface for integrating, orchestrating, and testing the model, database, and embedding workflow, providing the backend logic, processing, and computation.
  • Amazon Bedrock Claude RAG Model: A Retrieval-Augmented Generation (RAG) model is used here to generate responses based on information retrieved from OpenSearch.
  • Gradio UI: Provides a user-friendly interface for end-users to interact with the model and get responses. It communicates with SageMaker to get predictions or information.
  • AWS Console: Provides management and control over the resources and services involved in this architecture. It also links with SageMaker and other components for operational monitoring.
  • Data Flow: The data flow goes from S3 to Bedrock for embedding, then to OpenSearch for storage, and finally to SageMaker for interaction with the Claude RAG model to produce responses, which are then delivered to the Gradio UI for end-users.
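
The front end of this flow can be sketched as a thin Gradio wrapper. The `respond()` function below is a stub for the SageMaker-hosted embed → retrieve → generate pipeline; the title string and the canned reply are illustrative, not the deployed behavior.

```python
def respond(query: str) -> str:
    """Stand-in for the backend RAG pipeline: in production this embeds
    the query, retrieves context from OpenSearch, and calls the Claude
    model on Amazon Bedrock."""
    if not query.strip():
        return "Please enter a question about your product or warranty."
    return f"(answer generated from retrieved warranty context for: {query})"

if __name__ == "__main__":
    import gradio as gr  # pip install gradio

    # Minimal Gradio UI wiring: one text box in, one text box out.
    gr.Interface(
        fn=respond,
        inputs="text",
        outputs="text",
        title="Warranty Support Assistant (sketch)",
    ).launch()
```

Keeping the UI layer this thin means the Gradio app only needs a callable to invoke; swapping the stub for the real SageMaker endpoint does not change the interface definition.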

Outcomes

  • 60% reduction in response time
  • 45% faster, personalized support
  • 35% lower support-related operational costs