
How Ledgebrook used a chatbot for underwriting to cut document retrieval time by 70%

Deveshi Dabbawala

February 24, 2025

As Ledgebrook’s underwriting operations grew in complexity, so did the need for instant access to critical information. Underwriters struggled to find relevant guidelines and compare them with submitted documents, often searching across disconnected systems.  

To overcome this challenge, GoML developed an AI-powered chatbot using AWS technologies, enabling fast, natural language-based access to underwriting knowledge.

The problem: scattered documents, manual queries, and slow decisions

Underwriters at Ledgebrook were spending a significant amount of time searching for underwriting guidelines and matching them to submitted policy documents. The information they needed was buried in siloed systems, requiring manual lookups across multiple databases and files.  

Without a conversational interface, every query required switching tools and piecing together insights manually. As the volume of underwriting requests grew, these inefficiencies compounded, slowing down response times and making it hard to scale operations effectively. The company needed a centralized solution that could unify access, interpret natural queries, and surface accurate information quickly.

The solution: intelligent chatbot for underwriting

GoML built a conversational AI interface that integrates deeply with Ledgebrook’s document and underwriting systems.

Conversational Interface for Underwriters

  • Using AWS Bedrock, GoML developed a natural language interface that allows underwriters to ask questions in plain English.  
  • Whether they needed policy guidelines or document context, the chatbot could respond with precision.
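
To make this concrete, here is a minimal sketch of how a plain-English question plus retrieved document context might be sent to a model through Amazon Bedrock. The model ID, region, prompt wording, and function name are assumptions for illustration, not details of Ledgebrook's actual build.

```python
# Hypothetical sketch: answering an underwriter's question via Amazon Bedrock.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def ask_underwriting_question(question: str, context: str) -> str:
    """Send a plain-English question plus retrieved document context to a Bedrock model."""
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [
            {"role": "user",
             "content": f"Underwriting context:\n{context}\n\nQuestion: {question}"}
        ],
    })
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # assumed model choice
        body=body,
    )
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"]
```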

Automated Document Lookups

  • The chatbot was connected to OpenSearch Serverless to index Ledgebrook’s underwriting documents.  
  • This enabled instant retrieval of relevant records based on a query, without having to manually search directories.
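
A lookup of this kind could look roughly like the sketch below, using the opensearch-py client against an OpenSearch Serverless collection. The collection endpoint, index name, and field name are placeholders, not the production configuration.

```python
# Hypothetical sketch: full-text lookup of underwriting documents in OpenSearch Serverless.
import boto3
from opensearchpy import OpenSearch, RequestsHttpConnection, AWSV4SignerAuth

credentials = boto3.Session().get_credentials()
auth = AWSV4SignerAuth(credentials, "us-east-1", "aoss")  # "aoss" = OpenSearch Serverless

client = OpenSearch(
    hosts=[{"host": "example-collection.us-east-1.aoss.amazonaws.com", "port": 443}],
    http_auth=auth,
    use_ssl=True,
    connection_class=RequestsHttpConnection,
)

def find_guidelines(query: str, size: int = 5):
    """Return the documents whose content best matches the underwriter's query."""
    result = client.search(
        index="underwriting-docs",  # placeholder index name
        body={"query": {"match": {"content": query}}, "size": size},
    )
    return [hit["_source"] for hit in result["hits"]["hits"]]
```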

Policy and Document Comparison

  • The AI could pull documents from AWS S3, identify key terms, and compare them to underwriting rules.  
  • Comparison results were stored in PostgreSQL on Amazon RDS, automating a traditionally manual process.
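
A simplified version of that flow is sketched below: fetch a submitted document from S3, run a comparison against a set of rules, and persist the results to PostgreSQL. The bucket, database host, table, and keyword-matching logic are illustrative assumptions rather than GoML's actual implementation.

```python
# Hypothetical sketch: S3 fetch -> rule comparison -> store results in PostgreSQL on RDS.
import boto3
import psycopg2

s3 = boto3.client("s3")

def fetch_document(bucket: str, key: str) -> str:
    """Download a submitted policy document from S3 as text."""
    obj = s3.get_object(Bucket=bucket, Key=key)
    return obj["Body"].read().decode("utf-8")

def compare_to_rules(document_text: str, rules: list[str]) -> dict:
    """Naive keyword check: which underwriting rules are referenced in the document."""
    text = document_text.lower()
    return {rule: (rule.lower() in text) for rule in rules}

def store_results(policy_id: str, results: dict) -> None:
    """Persist the comparison outcome to a placeholder RDS PostgreSQL table."""
    conn = psycopg2.connect(
        host="example-db.rds.amazonaws.com",  # placeholder endpoint
        dbname="underwriting", user="app", password="***",
    )
    with conn, conn.cursor() as cur:
        for rule, matched in results.items():
            cur.execute(
                "INSERT INTO comparison_results (policy_id, rule, matched) VALUES (%s, %s, %s)",
                (policy_id, rule, matched),
            )
    conn.close()
```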

Seamless API Integration

  • AWS Lambda acted as a connector between the chatbot and other backend services.  
  • This enabled smooth orchestration and triggered actions based on user queries.
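
A minimal sketch of such a Lambda handler is shown below; the intent names, request shape, and lookup stub are assumptions introduced only to illustrate the routing pattern.

```python
# Hypothetical sketch: Lambda handler that routes chatbot requests to backend actions.
import json

def find_guidelines(query: str) -> list[dict]:
    """Placeholder for the OpenSearch lookup sketched earlier."""
    return [{"title": "Example guideline", "query": query}]

def lambda_handler(event, context):
    """Parse the chatbot request, trigger the matching backend action, return JSON."""
    body = json.loads(event.get("body", "{}"))
    intent = body.get("intent")

    if intent == "find_guidelines":
        result = {"documents": find_guidelines(body.get("query", ""))}
    else:
        result = {"error": f"Unsupported intent: {intent}"}

    return {"statusCode": 200, "body": json.dumps(result)}
```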

Secure and Scalable Deployment

  • The solution was deployed in a VPC-based AWS environment to ensure high security and availability.  
  • The environment could scale up as user demand increased.
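
One piece of such a deployment is attaching the chatbot's Lambda function to the VPC so it can reach private resources such as the RDS instance without public exposure. The sketch below shows the idea; the function name, subnet, and security group IDs are placeholders.

```python
# Hypothetical sketch: place the connector Lambda inside a VPC.
import boto3

lambda_client = boto3.client("lambda")

lambda_client.update_function_configuration(
    FunctionName="underwriting-chatbot-connector",       # placeholder function name
    VpcConfig={
        "SubnetIds": ["subnet-0123456789abcdef0"],        # placeholder subnet
        "SecurityGroupIds": ["sg-0123456789abcdef0"],     # placeholder security group
    },
)
```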

Search and Retrieval Optimization

  • Document vectorization using OpenSearch improved search quality.  
  • This allowed the chatbot to understand the context behind each query, increasing accuracy.
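
One common way to implement this is to embed the query with a Bedrock embedding model and run a k-NN search against a vector field in OpenSearch. The model ID, index, and field names below are assumptions, and the client is the one from the earlier lookup sketch.

```python
# Hypothetical sketch: semantic retrieval via Bedrock embeddings + OpenSearch k-NN search.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def embed(text: str) -> list[float]:
    """Convert text into a vector with an assumed Bedrock embedding model."""
    response = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v2:0",  # assumed embedding model
        body=json.dumps({"inputText": text}),
    )
    return json.loads(response["body"].read())["embedding"]

def semantic_search(client, query: str, k: int = 5):
    """k-NN search over vectorized underwriting documents (client from the earlier sketch)."""
    return client.search(
        index="underwriting-docs",  # placeholder index name
        body={
            "size": k,
            "query": {"knn": {"embedding": {"vector": embed(query), "k": k}}},
        },
    )["hits"]["hits"]
```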

The impact: fast responses, scalable operations, and smarter underwriting

  • 70% reduction in retrieval time, enabling underwriters to get information instantly
  • 85% improvement in user efficiency, reducing time wasted on manual lookups
  • 60% faster decisions, through conversational, real-time access to guidelines and documents

Lessons for other insurance and insurtech companies

Common pitfalls to avoid

  • Keeping underwriting data siloed across systems
  • Relying on manual search instead of vectorized document retrieval
  • Delaying chatbot integration due to perceived complexity

Advice for teams facing similar challenges

  • Use natural language interfaces to reduce training time and increase adoption
  • Connect your chatbot to real-time systems and databases for dynamic insights
  • Leverage AWS services like Bedrock and OpenSearch for secure, scalable performance

Want to reduce document lookup time by 70%? Schedule an executive AI briefing.

Outcomes

  • 70% reduction in time spent retrieving underwriting data
  • 85% improvement in user efficiency
  • 60% faster decision-making