With the rise of large language models (LLMs), many enterprises are looking for ways to use them to solve business problems, reduce costs, and improve efficiency. However, for relatively complex business problems (for example, lots of background information, many question categories, complex conditional logic, or multiple modules involved), the current generation of LLMs struggles to solve them effectively through simple conversational interaction alone, because LLMs have limitations of their own, such as hallucinations and limited context windows.
To address such problems, the GPTBots platform developed by Aurora offers a "Flow" bot-building mode. With Flow, a complex business problem can be simplified and abstracted into multiple "nodes" and "processes", and a bot can be built quickly through visual workflow programming, so that multiple dedicated LLMs work in series or in parallel, each with its own responsibility, and the bot can then be applied directly to real business scenarios.
This article walks through building a simple "after-sales service bot" to show how "Flow" is applied in practice.
Identify the problem and define the goal
Designing a bot is much like designing a product: first, define what problem the bot needs to solve and what its goal is.
In this example, we need to solve after-sales service problems. The existing after-sales process may involve pain points including (but not limited to):
- A large volume of inquiries requires a lot of manpower;
- Inquiries cover a wide range of topics, which places high knowledge demands on customer service staff;
- ...
If there is a bot specifically for handling after-sales inquiries, it can help companies improve after-sales service efficiency while reducing after-sales service costs (manpower, time, management resources, etc.).
Business analysis
After defining the problem and goal, we need to plan the bot's workflow according to the actual business scenario.
In this example, we can design a simple after-sales workflow like this:
- When customers ask questions related to support services, the bot answers based on the knowledge in the "Support Services" document;
- When customers ask about returns, the bot answers based on the knowledge in the "Returns" document;
- When customers want to check express tracking, the bot calls the predefined express query plugin and returns the result to the customer;
- When customers make refund requests, since this process involves the movement of funds and is more sensitive, we let the bot call the predefined refund handling plugin to execute the refund and return the result;
- When the customer's question does not fall into any of the above types, we set a default reply, for example directing the customer to contact human customer service through a given channel.
Of course, there are far more than five scenarios in real enterprise after-sales services. Enterprise developers can design bots according to actual situations.
Knowledge classification
After defining the business process, we need to prepare after-sales knowledge data for the bot. Since this is an after-sales bot, it must be familiar with the company's after-sales service knowledge so that it can respond to customers' questions reasonably based on that knowledge.
The most important consideration when preparing bot knowledge is "classification".
In after-sales scenarios, service staff encounter many different questions, such as: Can I return the goods? Where can I check the return progress? Can I exchange an item? How do I exchange it? How long does a refund take to arrive? Where has my parcel been delivered? And so on.
As shown in the figure above, we can classify the Q&A scenarios identified during business analysis, namely the common questions that arise in the support services and returns scenarios, and organize the knowledge documents accordingly, for example into "Support Services" and "Returns".
Clear knowledge classification helps the bot learn and understand the knowledge, and provide more accurate answers to customer inquiries.
Bot design
Create FlowBot
After completing the knowledge organization and workflow planning, we can officially enter the GPTBots platform to build the FlowBot.
In the [FlowBot] module, first create an after-sales service bot, and enter the settings interface.
Knowledge learning
In the [Knowledge Base] module, submit the previously organized "Support Services" and "Returns" documents to the bot for learning and training. The GPTBots platform provides multiple ways to upload knowledge.
In this example, we choose "Q&A" as the knowledge document management format. This is a knowledge storage format with "Q&A pairs" as the basic unit: when a customer asks a question semantically similar to one of the stored questions (Q), the bot can quickly find the corresponding answer (A) and have the LLM rephrase it into a reply.
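To make the idea concrete, here is a minimal sketch of how Q&A-pair knowledge and semantic matching can be pictured. This is an illustration only, not the GPTBots storage format: the `QAPair` fields, the sample entries, and the `embed()` / `similarity()` helpers are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class QAPair:
    question: str  # the canonical question (Q)
    answer: str    # the curated answer (A)

# Illustrative entries for a "Returns" knowledge document (placeholder text).
RETURNS_QA = [
    QAPair("Can I return an item?", "Yes, items can be returned within the return window ..."),
    QAPair("Where can I check my return progress?", "You can check it under Orders > Returns ..."),
]

def best_match(customer_question, qa_pairs, embed, similarity):
    """Return the Q&A pair whose question is semantically closest to the customer's."""
    q_vec = embed(customer_question)
    return max(qa_pairs, key=lambda qa: similarity(q_vec, embed(qa.question)))
```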
Flow design
After configuring the knowledge documents, we can start the Flow design phase.
Enter the visual editing interface through [Edit].
Every Flow consists of three basic elements (a conceptual sketch follows the list):
- Input: the information the customer enters;
- Output: the information returned to the customer as a response after being processed by the Flow;
- Component: various data-processing components, including LLM, knowledge retrieval, conditional judgment, and preset responses.
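As a rough mental model (not the platform's internal data structure), a Flow can be pictured as a small graph of typed nodes joined by edges; the node names and fields below are purely illustrative.

```python
# A rough mental model of a Flow: typed nodes joined by edges.
flow = {
    "nodes": {
        "input":  {"type": "Input"},                              # customer question enters here
        "branch": {"type": "Branch Judgment"},                    # classifies the question
        "kb":     {"type": "Knowledge Vector", "doc": "Returns"}, # retrieves knowledge fragments
        "llm":    {"type": "LLM"},                                # composes the reply
        "output": {"type": "Output"},                             # reply returned to the customer
    },
    "edges": [
        ("input", "branch"),
        ("branch", "kb"),   # taken when the question is classified as "Returns"
        ("kb", "llm"),
        ("llm", "output"),
    ],
}
```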
Scenario judgment
According to the business process planned earlier, once the customer asks a question, we first need to classify it and route it into the corresponding response process.
Therefore, the first step in the Flow design is to connect [Input] to a [Branch Judgment] component and define branches such as [Refund], [Express Tracking], and [Returns] in natural language.
Through this component, the bot decides which process to continue with based on the content of the customer's question.
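Conceptually, the branch judgment step behaves something like the sketch below: an LLM is asked to map the customer's question onto one of the branches described in natural language. The prompt wording, the branch list, and the `call_llm()` helper are assumptions for illustration; GPTBots handles this inside the [Branch Judgment] component.

```python
BRANCHES = ["Refund", "Express Tracking", "Returns", "Support Services", "Other"]

def judge_branch(question, call_llm):
    """Classify a customer question into one of the configured branches."""
    prompt = (
        "Classify the customer question into exactly one of these categories: "
        f"{', '.join(BRANCHES)}. Reply with the category name only.\n\n"
        f"Question: {question}"
    )
    label = call_llm(prompt).strip()
    return label if label in BRANCHES else "Other"  # unknown labels fall back to the default branch
```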
Knowledge Q&A: Returns, Support Services
Take [Returns] as an example. If the customer's question is judged to be about returns, the next step is to find appropriate answer content for the question in the knowledge document.
We can add a [Knowledge Vector] component here and connect the [Returns] branch of [Branch Judgment] to it. In [Knowledge Vector], we select the previously uploaded and trained "Returns" document as the knowledge retrieval source for this component.
Since the results retrieved by the [Knowledge Vector] component are relatively "dry" knowledge fragments that cannot be sent to customers directly, we pass the customer's question and the retrieval results to an LLM in the next step, so that the LLM can understand and analyze them and compose an appropriate answer.
Finally, connect the LLM's output to the [Output] component to complete the workflow. The same design can be reused for the [Support Services] branch.
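The retrieve-then-compose chain can be sketched roughly as follows, assuming a `retrieve()` helper standing in for the [Knowledge Vector] component and a `call_llm()` helper standing in for the LLM component; neither is a real GPTBots API.

```python
def answer_from_knowledge(question, retrieve, call_llm):
    """Turn raw knowledge fragments into a customer-facing reply."""
    fragments = retrieve(question)   # e.g. the top-ranked chunks from the "Returns" document
    prompt = (
        "You are an after-sales assistant. Answer the customer's question "
        "using only the knowledge below, in a friendly and accurate tone.\n\n"
        "Knowledge:\n" + "\n".join(fragments) + "\n\n"
        "Customer question: " + question
    )
    return call_llm(prompt)          # this text flows on to [Output]
```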
Plugin invocation: Express tracking, Refund
When the customer's request is identified as an express tracking query, we can add an LLM component and attach the [Express Tracking] plugin to it.
The LLM extracts the order number from the customer's request, calls the plugin, and returns that order's logistics status to the customer.
Similarly, if the LLM identifies a [Refund] intent in the customer's request, it guides the customer to provide the necessary information and submits a request through the refund plugin to execute the refund process.
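A plugin branch boils down to: extract the parameters from the request, call the tool, and report the result. In the sketch below a regex stands in for the LLM's parameter extraction, and `track_express()` is a hypothetical stand-in for the [Express Tracking] plugin.

```python
import re

def extract_order_number(request):
    """Stand-in for the LLM's parameter extraction: find something order-number-shaped."""
    match = re.search(r"\b[A-Z0-9]{8,}\b", request)
    return match.group(0) if match else None

def handle_tracking(request, track_express):
    order_no = extract_order_number(request)
    if order_no is None:
        return "Could you share the order number you'd like me to track?"
    status = track_express(order_no)  # the plugin call
    return f"Order {order_no} is currently: {status}"
```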
Preset response: Other
Finally, we need to add a "fallback" branch. When the customer's question does not match any branch defined in [Branch Judgment], we use the [Preset Response] component to return a general reply, directing the customer to the company's human customer service staff for further help.
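Putting the pieces together, the whole Flow behaves roughly like the small router below: classify the question, dispatch it to a branch handler, and fall back to the preset reply when nothing matches. The handler mapping and reply text are illustrative assumptions.

```python
PRESET_REPLY = ("Sorry, I can't help with that directly. "
                "Please contact our human customer service team for further assistance.")

def handle(question, judge_branch, handlers):
    branch = judge_branch(question)                         # [Branch Judgment]
    handler = handlers.get(branch)                          # e.g. {"Returns": answer_from_knowledge, ...}
    return handler(question) if handler else PRESET_REPLY   # [Preset Response] fallback
```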
Flow overview
After the steps above, remember to [Save]; with that, this simple "after-sales service bot" is built.
Flow verification
After [Save], we can test the bot's replies directly through [Run] or [Dialogue].
In the [Input] component, enter a customer question and click [Run]. The interface shows the question's data flow path as a dotted line, and you can see the final result in the [Output] component.
Similarly, you can verify the bot's output in a conversational way through [Dialogue].