ET-USB: Transformer-Based Sequential Behavior Modeling for Inbound Customer Service

12/20/2019
by Ta-Chun Su, et al.

Deep-learning-based models with attention mechanisms have achieved exceptional results on many tasks, including language tasks and recommender systems. In this study, we focus on predicting inbound calls for customer service, whereas previous work has emphasized phone-agent allocation. A common approach to handling users' historical behaviors is to extract various aggregated features over time; however, this approach may fail to capture users' sequential patterns. We therefore incorporate users' sequential features along with non-sequential features and apply the Transformer encoder, a powerful self-attention network, to capture the information underlying these behavior sequences. We propose our approach, ET-USB, which is applicable to different business scenarios at Cathay Financial Holdings. In practice, we show that the proposed network structure for processing behavior data of different dimensions outperforms other deep-learning-based models.
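To make the described architecture concrete, below is a minimal sketch in PyTorch of the general pattern the abstract outlines: a Transformer encoder over a user's behavior sequence, whose pooled output is concatenated with non-sequential features before a prediction head. All class names, dimensions, and the pooling choice are illustrative assumptions, not the paper's exact ET-USB configuration.

```python
import torch
import torch.nn as nn


class BehaviorSequenceModel(nn.Module):
    """Sketch of a Transformer-encoder model over user behavior sequences
    combined with non-sequential features (assumed design, not the paper's)."""

    def __init__(self, num_behaviors=1000, max_len=50, d_model=64, n_heads=4,
                 n_layers=2, non_seq_dim=16, num_classes=2):
        super().__init__()
        # Embed discrete behavior IDs (e.g. page views, transactions); 0 = padding.
        self.behavior_emb = nn.Embedding(num_behaviors, d_model, padding_idx=0)
        # Learned positional embeddings so self-attention can use sequence order.
        self.pos_emb = nn.Embedding(max_len, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=4 * d_model,
            batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Prediction head consumes pooled sequence output + non-sequential features.
        self.head = nn.Sequential(
            nn.Linear(d_model + non_seq_dim, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes))

    def forward(self, behavior_ids, non_seq_feats):
        # behavior_ids: (batch, seq_len) integer IDs; non_seq_feats: (batch, non_seq_dim)
        batch, seq_len = behavior_ids.shape
        pad_mask = behavior_ids.eq(0)                        # True at padding positions
        positions = torch.arange(seq_len, device=behavior_ids.device)
        x = self.behavior_emb(behavior_ids) + self.pos_emb(positions)
        x = self.encoder(x, src_key_padding_mask=pad_mask)   # (batch, seq, d_model)
        # Mean-pool over non-padding positions.
        x = x.masked_fill(pad_mask.unsqueeze(-1), 0.0)
        pooled = x.sum(dim=1) / (~pad_mask).sum(dim=1, keepdim=True).clamp(min=1)
        return self.head(torch.cat([pooled, non_seq_feats], dim=-1))


# Example usage with dummy data:
model = BehaviorSequenceModel()
ids = torch.randint(1, 1000, (8, 20))    # 8 users, 20 behaviors each
dense = torch.randn(8, 16)               # 8 users, 16 non-sequential features
logits = model(ids, dense)               # (8, num_classes) scores, e.g. call / no call
```

The key design point reflected here is that aggregated (non-sequential) features and the encoded behavior sequence enter the model through separate paths and are only fused before the final prediction layers, so the self-attention layers are free to model ordering effects that time-aggregated features would miss.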
