
Dovetail enhances customer insights with new generative AI products developed in weeks using Amazon Bedrock

Learn how Dovetail accelerates development of new generative AI products and features using Amazon Bedrock, deploying the latest large language models while reducing security and administrative overhead.
Dovetail co-founders Bradley Ayers and Benjamin Humphrey. Source: Supplied.

Founded in Australia in 2017, Dovetail helps businesses analyse unstructured data from customer videos, audio recordings, and documents. Initially, the platform focused on offering productivity features for manual data handling, such as creating highlights and adding tags.

As large language models (LLMs) emerged, the startup began automating these processes to improve efficiency, such as identifying key moments in transcripts and classifying support tickets. The company’s journey into AI and machine learning started with automated transcription and basic sentiment analysis.

Recognising the potential of LLMs, Dovetail saw an opportunity to develop new features to enhance data aggregation and pattern recognition across customer communication channels. 

Deploying generative AI would provide a higher level of end-to-end automation, greatly accelerating customers’ workflows and helping them uncover insights faster than ever. 

“We needed a managed service to access the latest LLMs and embedding models for rapid prototyping and delivery,” Benjamin Humphrey, co-founder and CEO at Dovetail, says.

Dovetail sought to integrate generative AI to provide users with more efficient tools for analysing unstructured data, enhancing the platform’s power and intuitiveness.


Expanding generative AI capabilities cost-effectively and securely on Amazon Bedrock

When AWS introduced Amazon Bedrock, a fully managed service offering a range of high-performing foundation models, Dovetail was invited to join the service preview cohort. 

Previously, the business had used Amazon SageMaker to deploy deep learning models but needed a more flexible solution for its variable workloads. 

With Bedrock, Dovetail could access serverless compute on demand and autoscale its APIs during peak periods. 
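A minimal sketch of what on-demand invocation looks like through the Bedrock runtime’s Converse API via boto3. The model ID, region, prompt, and function names here are illustrative assumptions, not Dovetail’s actual configuration:

```python
# Sketch of on-demand model invocation against the Amazon Bedrock runtime
# via boto3's Converse API. Model ID, region, and prompt are illustrative.

def build_messages(prompt: str) -> list:
    """Wrap a user prompt in the message structure the Converse API expects."""
    return [{"role": "user", "content": [{"text": prompt}]}]

def summarise_transcript(transcript: str, region: str = "us-east-1") -> str:
    """Ask Claude 3.5 Sonnet on Bedrock to pick out key moments in a transcript."""
    import boto3  # requires AWS credentials with Bedrock model access

    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.converse(
        modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
        messages=build_messages(
            "Identify the key moments in this transcript:\n\n" + transcript
        ),
        inferenceConfig={"maxTokens": 512, "temperature": 0.2},
    )
    return response["output"]["message"]["content"][0]["text"]
```

Because the service is serverless, there is no endpoint to provision or scale: each `converse` call is billed per token, which suits variable workloads like Dovetail’s.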

“Amazon Bedrock has proven to be a more cost-effective, flexible, and seamless way to integrate and deploy generative AI capabilities,” Peter Wooden, a software engineer at Dovetail, says.

Another key factor in its choice of Amazon Bedrock was the transparency AWS provided about the underlying technology and processes.

AWS ensures that customer data remains confidential, without being used to train models or test concepts, thereby safeguarding the data clients entrust to Dovetail.

This focus on data security extends throughout business operations. By using AWS, Dovetail — and its customers — benefit from enhanced security with reduced overhead. 

All customer data remains securely within Dovetail’s private network, spanning from application servers to its databases.  

The company meets data residency requirements for customers by leveraging multiple AWS regions to host customer data. 

Meanwhile, it simplifies compliance by developing its generative AI capabilities on AWS. This approach eliminates the need to notify clients about third-party sub-processors and reduces the risk of clients rejecting those processors.

Boosting data analysis efficiency by 80%, saving users 10 hours weekly

By using Amazon Bedrock, Dovetail maintains a rapid pace of innovation while controlling costs. “Amazon Bedrock’s serverless design boosts our product development speed,” Wooden says.

 “We focus on writing the code that sets our product apart, without worrying about infrastructure or provisioning. Now we can create prototypes in under a day and launch new features within weeks.”

Its AI-powered features speed up workflows for businesses, helping them curate end-customer data at a deeper level and unlock insights faster. This allows designers, product managers, and salespeople to personalise their services more effectively.

One business utilising Dovetail is Instawork, a flexible work app that connects local businesses with skilled hourly workers. 

“Magic features live up to Dovetail’s goal of bringing us to insight faster,” Emi Fogg, UX researcher at Instawork, says.

“While we still take every step in analysing, synthesising, and reviewing our research data, AI features like Magic search and Magic cluster have sped up the process, saving us around 1–2 hours per project and freeing up more time to focus on nuance instead of broad strokes.”

Building on the advanced capabilities of Amazon Bedrock, Dovetail introduced the preview version of its Channels product in early 2024. 

By deploying a range of Anthropic Claude models, including Claude 3.5 Sonnet on Amazon Bedrock, the company launched new Magic features to enhance data analysis. According to a recent poll, product managers and designers using Magic features save an average of 10 hours weekly on data analysis and are 80 percent more efficient in arriving at insights.

Among these innovations, the Magic search feature has been particularly well received by customers. 

Previously, users could only perform keyword searches on data within the Dovetail system. Now, with Magic search, users can ask natural language questions and receive detailed summaries as answers, based on relevant results. 
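Under the hood, this kind of natural language search is typically a retrieve-then-summarise flow: embed the question, rank stored snippets by similarity, and hand the top matches to an LLM for answering. The sketch below shows the ranking step with a plain cosine similarity; the embedding and generation calls it assumes would be Bedrock model invocations, and the function names are hypothetical, not Dovetail’s implementation:

```python
# Sketch of the retrieval step behind a natural language search: rank stored
# snippet embeddings by cosine similarity to the query embedding, then pass
# the top matches to an LLM for summarisation. Embedding and generation
# calls are placeholders for Bedrock model invocations.
import math

def cosine(a: list, b: list) -> float:
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def top_matches(query_vec: list, snippets: list, k: int = 3) -> list:
    """Return the k snippets whose embeddings sit closest to the query."""
    ranked = sorted(
        snippets, key=lambda s: cosine(query_vec, s["vector"]), reverse=True
    )
    return ranked[:k]
```

The matched snippets would then be concatenated into a prompt so the model answers from relevant results rather than from its general knowledge.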

Additionally, Dovetail Channels uses Anthropic Claude models on Amazon Bedrock to support customers in running its Voice of the Customer program, classifying customer feedback into actionable themes and helping customers stay attuned to omnichannel user feedback.
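Classifying feedback into themes with an LLM usually comes down to a constrained prompt: give the model a fixed taxonomy and ask it to pick exactly one label. The sketch below builds such a prompt; the theme names and function are illustrative assumptions, since a real taxonomy would be customer-defined:

```python
# Sketch of turning raw feedback into a single-label classification prompt
# for a Claude model on Bedrock. The theme taxonomy here is illustrative.

THEMES = ["pricing", "onboarding", "performance", "feature request"]

def build_classification_prompt(feedback: str, themes: list = THEMES) -> str:
    """Ask the model to map one piece of feedback to exactly one theme."""
    theme_list = ", ".join(themes)
    return (
        "Classify the following customer feedback into exactly one of these "
        f"themes: {theme_list}.\n"
        "Reply with the theme name only.\n\n"
        f"Feedback: {feedback}"
    )
```

Constraining the reply to a known label set keeps the output machine-readable, so classified feedback can be aggregated into the actionable themes the Voice of the Customer program reports on.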

Dovetail plans to keep refining its generative AI product offerings. 

“Dovetail is a prime example of how generative AI can accelerate daily tasks and make a tangible impact for customer-centered businesses,” Humphrey says. 

“It’s been an exciting journey exploring these technologies, and we’re always looking for new ways to leverage the latest generative AI on AWS services to push our platform even further.”