A method and system for intelligently entering addresses in express delivery scenarios

An intelligent address-entry technology applied in speech analysis and speech recognition. It addresses problems such as high labor costs, low efficiency, and customers being unable to place orders, and achieves the effect of improving address accuracy.

Active Publication Date: 2021-06-18
科讯嘉联信息技术有限公司

AI Technical Summary

Problems solved by technology

[0003] This method requires the courier company to staff a large number of telephone operators, so labor costs are very high. For example, if 10,000 customers call to place orders at the same time, the courier company needs 10,000 operators to answer them; otherwise customers cannot place their orders normally.

[0004] In addition, after the courier company's staff leave work, customers cannot place orders at all; they must wait until the company opens the next day. This is very inefficient and fails to meet customer needs.


Image

Smart Image Click on the blue labels to locate them in the text.
Viewing Examples
Smart Image
  • A method and system for intelligently entering addresses in express delivery scenarios
  • A method and system for intelligently entering addresses in express delivery scenarios
  • A method and system for intelligently entering addresses in express delivery scenarios

Examples


Embodiment 1

[0046] Referring to figure 1, the present invention proposes a method for intelligently entering an address in an express delivery scenario, comprising the following steps:

[0047] S1. Train a plurality of address extraction models and establish an address library; the address library is used to store known addresses.

[0048] S2. Obtain the voice information in which the customer speaks the address.

[0049] S3. Obtain the transcribed text from the voice information.

[0050] S4. Match the transcribed text against the address extraction models, extract the address information, and obtain a known address in the address library that matches the address information as the verification address.

[0051] In this way, through step S4, the final verification address has been both processed by the trained address extraction model and checked against the address library, which greatly improves its accuracy.

[0052] S5. Synthesize the verification address into a voice address and broadcast ...
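The library-lookup half of step S4 can be sketched as a fuzzy match of the extracted address against the known addresses. This is a minimal illustration only: the sample addresses, the `difflib` similarity measure, and the 0.6 threshold are assumptions, not part of the patent.

```python
# Illustrative sketch of step S4's address-library lookup: find the known
# address closest to the (possibly noisy) extracted address text.
import difflib

# Hypothetical address library of known addresses (step S1).
ADDRESS_LIBRARY = [
    "No. 1 Keyuan Road, Nanshan District, Shenzhen",
    "No. 88 Century Avenue, Pudong New Area, Shanghai",
    "No. 10 Zhongguancun South Street, Haidian District, Beijing",
]

def match_known_address(extracted: str, threshold: float = 0.6):
    """Return the closest known address, or None if nothing is similar enough."""
    matches = difflib.get_close_matches(
        extracted, ADDRESS_LIBRARY, n=1, cutoff=threshold
    )
    return matches[0] if matches else None

# Punctuation lost in transcription still matches the stored address.
verified = match_known_address("No 1 Keyuan Road Nanshan District Shenzhen")
```

A character-level similarity like this tolerates transcription noise (dropped punctuation, spacing) while rejecting text that matches no stored address.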

Embodiment 2

[0059] In this embodiment, relative to Embodiment 1, step S4 specifically comprises the following steps:

[0060] S41. Match the transcribed text against the address extraction model to extract the address information.

[0061] S42. Synthesize the extracted address information into voice information and broadcast it to the customer.

[0062] S43. Obtain the customer's feedback on the voice information and determine whether the address information is correct; if it is not correct, execute step S7.

[0063] S44. If it is correct, obtain a known address in the address library that matches the address information as the verification address.

[0064] In this way, in this embodiment, after the address information is extracted by the address extraction model, it is confirmed with the customer through a voice call. This ensures the accuracy of the address information used for matching against the address library, which helps improve the matching rate a...
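Steps S41 through S44 above can be sketched as a small confirmation loop. All names here (`extract_address`, `speak`, `get_feedback`, `lookup`) are hypothetical stand-ins for the extraction model, TTS playback, telephony feedback, and address-library lookup; none of them come from the patent itself.

```python
# Hypothetical sketch of the Embodiment 2 confirmation loop (S41-S44).
# The collaborating components are injected as callables so the flow
# can be shown without any real speech or telephony stack.

def confirm_address(transcript, extract_address, speak, get_feedback, lookup):
    """Read the extracted address back to the customer, and only look it
    up in the address library once the customer confirms it."""
    address = extract_address(transcript)     # S41: extract address info
    speak(f"Did you say: {address}?")         # S42: broadcast it back
    if not get_feedback():                    # S43: customer says "no"
        return None                           # caller falls through to step S7
    return lookup(address)                    # S44: verification address

# Stubbed usage: the customer confirms the read-back address.
result = confirm_address(
    "please deliver to number one keyuan road",
    extract_address=lambda t: "No. 1 Keyuan Road",
    speak=lambda msg: None,
    get_feedback=lambda: True,
    lookup=lambda a: a + ", Nanshan District, Shenzhen",
)
```

Confirming with the customer before the library lookup is what guarantees that only customer-approved address text is matched against known addresses.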

Embodiment 3

[0066] In this embodiment, compared with Embodiment 1, in step S1 a bidirectional LSTM neural network is used to train the address extraction model. Step S1 is specifically as follows: first construct a plurality of customer voice-broadcast templates, then extract known addresses from the address library and fill them into each template to obtain the training corpus. The training corpus is fed into the bidirectional LSTM neural network for training, and the parameters of each node of the network are obtained as slot labels. Step S4 is specifically: input the transcribed text into the bidirectional LSTM neural network, convert each character into a text vector, and then obtain the slot label of each character through the network's computation.

[0067] Specifically, in this embodiment, the specific way to obtain the parameters of each node on the bidirectional LSTM neural network as the slot label is: send the training forecas...
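The corpus-generation step of Embodiment 3 (filling known addresses into broadcast templates and labeling the address span for slot tagging) can be sketched as below. The templates, the `{addr}` placeholder, and the B/I/O label scheme are illustrative assumptions; the patent does not specify the template or label format.

```python
# Hedged sketch of Embodiment 3's training-corpus generation: substitute
# known addresses into voice-broadcast templates and tag each character
# with a BIO-style slot label (O = outside the address, B = address start,
# I = inside the address) for training a slot-filling model.

TEMPLATES = [
    "please send the package to {addr} thank you",
    "my delivery address is {addr}",
]

def build_corpus(templates, known_addresses):
    corpus = []
    for tpl in templates:
        for addr in known_addresses:
            prefix, _, suffix = tpl.partition("{addr}")
            text = prefix + addr + suffix
            labels = (["O"] * len(prefix)
                      + ["B"] + ["I"] * (len(addr) - 1)
                      + ["O"] * len(suffix))
            corpus.append((text, labels))
    return corpus

# Each (text, labels) pair pairs one character with one slot label.
pairs = build_corpus(TEMPLATES, ["No. 1 Keyuan Road"])
```

In training, such character/label pairs would be vectorized and fed to the bidirectional LSTM so it learns to emit the slot label of each character in a new transcript.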


Abstract

A method for intelligently entering an address in an express delivery scenario, comprising the following steps: training a plurality of address extraction models and establishing an address library, the address library being used to store known addresses; obtaining the voice information in which the customer speaks the address; obtaining the transcribed text from the voice information; matching the transcribed text against the address extraction models, extracting the address information, and obtaining a known address in the address library that matches the address information as the verification address; synthesizing the verification address into a voice address and broadcasting it to the customer; and obtaining the customer's feedback on the voice address to determine whether the verification address is correct. In the present invention, the final verification address has been processed by the trained address extraction model and checked against the address library, which greatly improves its accuracy. By broadcasting the voice address to the customer, who directly verifies the verification address, the accuracy of the final mailing address is guaranteed.

Description

Technical field

[0001] The invention relates to the technical field of express delivery, in particular to a method for intelligently entering an address in an express delivery scenario.

Background technique

[0002] The traditional way of placing an order in an express delivery scenario is: the customer calls the express company, a human operator answers the call, and the express delivery order is registered through person-to-person interaction.

[0003] This method requires the courier company to staff a large number of telephone operators, so labor costs are very high. For example, if 10,000 customers call to place orders at the same time, the courier company needs 10,000 operators to answer them; otherwise customers cannot place their orders normally.

[0004] In addition, after the courier company's staff leave work, customers cannot place orders at all; they have to wait until the courier company ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G10L15/26, G10L13/04, G10L13/08, G06F40/295
Inventors: 宗升亚, 卫海智, 赵发君
Owner: 科讯嘉联信息技术有限公司