
pranav_rajs | 2 years ago

Hey HN,

Thanks for all the support you folks gave us in the past[1]. At Chatwoot[2], we are developing an open-source[3] customer support platform as an alternative to Intercom and Zendesk. I'm glad to announce our latest AI update. We have been working with the latest LLMs to see if we can assist support staff in reducing their workload.

We have observed large-scale Chatwoot deployments with over 10 million messages. From our discussions with the teams handling these deployments, we noticed two types of queries. The first type consists of simple, frequently repeated questions, sometimes with minor variations. The second type involves more complex support queries that require human intervention. Many tools put customers through an FAQ bot (now often an AI bot) before transferring them to a human. This process can be frustrating for both the user and the support agent. We have found that customers, especially when they are already frustrated, dislike having to interact with a bot before reaching a human to resolve their issue.

The issue we face is the lack of distinction between the two types of queries. Not all queries are inherently support queries. Sometimes the customer may not want to search through the documentation, or the documentation may not be well-organized enough to find the answer. Having a single common chat interface for both makes things harder for everyone.

We believe that a clear UX distinction can make a significant difference. Here is how we plan to split these conversations.

a/ *Universal search interface*: This is a ChatGPT-like chat that you can integrate into your application. Users can click on the search icon or use Cmd+K to access it. They can find answers using your help desk articles and carry out tasks through natural language queries like "Where is my order?" or "When is the next billing date?". We have found this can improve the quality of support requests received.

b/ *Live Chat SDK*: To address complex issues, your customers can communicate directly with your team through the live chat SDK.

This approach will clarify expectations for the end customer. Use the search function for straightforward queries and engage with an agent for more complicated ones.

Alongside these, we have also introduced some updates to the agent interface. These updates help agents prioritize conversations by auto-tagging them and routing them to the appropriate teams.
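The auto-tag-and-route flow might look something like the sketch below. To be clear, this is not Chatwoot's actual implementation: the tag set, the team mapping, and the keyword-based classify() stub are all illustrative placeholders; in a real deployment the classifier would be an LLM call.

```python
# Hypothetical sketch of auto-tagging a conversation and routing it to a team.
# classify() is a keyword stub standing in for an LLM classifier; tags and
# team names are made up for illustration.

TEAM_FOR_TAG = {
    "billing": "Billing Team",
    "shipping": "Logistics Team",
    "bug": "Engineering Team",
}

def classify(message: str) -> str:
    """Stand-in for an LLM classifier: pick a tag from the message text."""
    text = message.lower()
    if "invoice" in text or "charge" in text:
        return "billing"
    if "order" in text or "delivery" in text:
        return "shipping"
    return "bug"

def route(message: str) -> tuple[str, str]:
    """Auto-tag an incoming conversation and pick the team to route it to."""
    tag = classify(message)
    return tag, TEAM_FOR_TAG[tag]
```

The point of the split is that tagging happens before any agent sees the conversation, so agents can prioritize by tag and teams only see conversations routed to them.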

We are launching this as a closed beta for now so we can work closely with teams operating at scale with unique problems. Also, we have not yet figured out how to add support for open-source LLMs across different environments: our installations are one-click at the moment, and adding those options would make setup more complex for everyone.

[1] https://news.ycombinator.com/item?id=26501527

[2] https://www.chatwoot.com

[3] https://github.com/chatwoot/chatwoot
