
OpenAsset

Deveshi Dabbawala

October 17, 2024

Business Problem

OpenAsset needed to enhance its data query and retrieval process by integrating an LLM (Large Language Model) application that would:

  • Convert natural language queries into structured SQL-based responses.
  • Fetch accurate data from the Aurora DB to improve overall user experience.
  • Automate and streamline the response process using AWS infrastructure to reduce manual intervention.

About OpenAsset

OpenAsset is a project-based digital asset management (DAM) solution for the real estate and AEC (Architecture, Engineering, and Construction) industries. It helps firms manage image libraries, streamline workflows, and create more efficient, high-performing proposals. With over 20 years of experience, OpenAsset has supported more than 700 firms globally, enabling better visualization and presentation of projects and driving stronger business performance.

Solution

The solution involved developing and deploying a prototype LLM application using AWS Bedrock to enhance OpenAsset's data retrieval and response system. This application processes natural language queries, converts them into SQL queries, retrieves relevant data from Aurora DB, and generates natural language responses using AWS Bedrock's Claude Sonnet LLM. The entire process was implemented with a well-structured tech stack:

Programming:
Python was used to write the application's core logic, while FastAPI served as the framework for creating the API endpoints. Mangum was used to deploy FastAPI on AWS Lambda seamlessly.
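A minimal sketch of this setup, assuming an illustrative /query endpoint and request model (not OpenAsset's actual code):

```python
from fastapi import FastAPI
from mangum import Mangum
from pydantic import BaseModel

app = FastAPI()

class QueryRequest(BaseModel):
    question: str     # natural language query from the user
    db_username: str  # database username used to scope the session

@app.post("/query")
def handle_query(req: QueryRequest) -> dict:
    # Placeholder: the real handler classifies the query, generates SQL,
    # runs it against Aurora DB, and frames a natural language answer.
    return {"answer": f"Received: {req.question}"}

# Mangum wraps the ASGI app so AWS Lambda can invoke it as a handler.
handler = Mangum(app)
```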

AWS Services:
AWS Bedrock powered the core GenAI functionalities, enabling LLM-based query processing, while Aurora DB and DynamoDB were used for data storage and retrieval.
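A rough sketch of how such a Bedrock call might look with boto3 for turning a question into SQL; the model ID, prompt wording, and schema hint are illustrative assumptions rather than details from the project:

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def generate_sql(question: str, schema_hint: str) -> str:
    """Ask Claude Sonnet on Bedrock to produce a single SQL query for the question."""
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{
            "role": "user",
            "content": (
                f"Given this MySQL schema:\n{schema_hint}\n"
                f"Write a single SQL query answering: {question}\n"
                "Return only the SQL."
            ),
        }],
    })
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # assumed model ID
        body=body,
    )
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"].strip()
```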

Database:
Data was stored in and queried from a MySQL-compatible Aurora DB, with DynamoDB used alongside it, ensuring reliable and scalable data storage and access.
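A hedged sketch of that storage layer, assuming pymysql for the Aurora MySQL connection and boto3 for DynamoDB; the endpoint, credentials, schema, and table names are all hypothetical:

```python
import os
import boto3
import pymysql

def fetch_rows(sql: str) -> list[dict]:
    """Run a generated SQL query against the Aurora MySQL database."""
    conn = pymysql.connect(
        host=os.environ["AURORA_HOST"],      # assumed cluster endpoint
        user=os.environ["AURORA_USER"],
        password=os.environ["AURORA_PASSWORD"],
        database="openasset",                # hypothetical schema name
        cursorclass=pymysql.cursors.DictCursor,
    )
    try:
        with conn.cursor() as cur:
            cur.execute(sql)
            return list(cur.fetchall())
    finally:
        conn.close()

def load_session_history(session_id: str) -> dict:
    """Fetch prior conversation turns for a session from DynamoDB."""
    table = boto3.resource("dynamodb").Table("session_history")  # assumed table name
    return table.get_item(Key={"session_id": session_id}).get("Item", {})
```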

Infrastructure Management:
Terraform was used for infrastructure management, allowing automated deployment across AWS resources.

Framing Responses Using LLM:
The data fetched from Aurora DB is combined with the user's original query to frame a detailed, accurate response using AWS Bedrock’s Claude Sonnet LLM.
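A sketch of this framing step, again assuming the Claude messages API on Bedrock; the model ID and prompt are illustrative, and `rows` stands for whatever the SQL execution step returned:

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime")

def frame_response(question: str, rows: list[dict]) -> str:
    """Ask Claude Sonnet to answer the original question using the fetched rows."""
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 1024,
        "messages": [{
            "role": "user",
            "content": (
                f"Question: {question}\n"
                f"Data returned by the database: {json.dumps(rows, default=str)}\n"
                "Answer the question in plain English using only this data."
            ),
        }],
    })
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # assumed model ID
        body=body,
    )
    return json.loads(response["body"].read())["content"][0]["text"]
```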

Additional Tools:
The axomic library was integrated to work with OpenAsset’s digital assets, and GitHub was used for version control and collaboration.


Architecture

  • User Input
    - Accepts natural language queries and DB usernames.
  • Session History
    - Retrieves session history and merges it with current input.
  • Query Classification
    - Uses LLM to classify the domain and generate SQL queries.
  • SQL Execution
    - Makes API calls to AWS Bedrock for query generation.
    - Executes SQL queries to fetch relevant data.
  • Natural Language Response
    - Aggregates data and generates final user responses in natural language.
  • Database Update
    - Updates DB with query results and disconnects.
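
Put together, the flow above might be orchestrated roughly as follows; every helper here (load_session_history, generate_sql, fetch_rows, frame_response, save_result) is a hypothetical name from the earlier sketches rather than OpenAsset's actual code:

```python
def answer_query(question: str, db_username: str, session_id: str) -> str:
    history = load_session_history(session_id)              # Session History
    context = f"{history.get('summary', '')}\n{question}"   # merge history with current input
    sql = generate_sql(context, schema_hint="...")          # Query Classification via LLM
    rows = fetch_rows(sql)                                   # SQL Execution against Aurora DB
    answer = frame_response(question, rows)                  # Natural Language Response
    save_result(session_id, sql, answer)                     # Database Update, then disconnect
    return answer
```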

Outcomes

  • 40% efficiency boost, reducing query handling time
  • 35% improvement in data accuracy, with greater precision in information retrieval
  • 30% increase in customer satisfaction, fostering brand loyalty and retention