Incorporating Pseudo-Parallel Data for Quantifiable Sequence Editing

04/19/2018
by Yi Liao, et al.

In the task of quantifiable sequence editing (QuaSE), a model edits an input sentence to generate an output that satisfies a given outcome, a numerical value measuring a certain property of the output. For example, for review sentences the outcome could be the review rating; for advertisements it could be the click-through rate. We propose a framework that performs QuaSE by incorporating pseudo-parallel data. The framework captures content similarity and outcome differences by exploiting pseudo-parallel sentence pairs, which enables better disentanglement of the outcome-relevant latent factors and thus provides a solid basis for generating output that satisfies the desired outcome. A dual reconstruction structure further strengthens generation by exploiting the coupling between the latent factors of pseudo-parallel sentences. We prepare a dataset of Yelp review sentences with ratings as the outcome. Experimental results show that our framework outperforms state-of-the-art methods on both sentiment polarity accuracy and target value error.
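To make the moving parts of the abstract concrete, the following is a minimal, hypothetical PyTorch sketch of the idea described above: each sentence is encoded into a content latent and an outcome-relevant latent, the numeric outcome is regressed from the latter, the content latents of a pseudo-parallel pair are pulled together, and a dual reconstruction term rebuilds each sentence from its partner's content latent and its own outcome latent. The class name `QuaSESketch`, the GRU encoder/decoder, and the unweighted sum of losses are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the abstract's core idea (not the authors' released code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class QuaSESketch(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, z_dim=64, y_dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, emb_dim, batch_first=True)
        self.to_z = nn.Linear(emb_dim, z_dim)   # content latent
        self.to_y = nn.Linear(emb_dim, y_dim)   # outcome-relevant latent
        self.regressor = nn.Linear(y_dim, 1)    # maps outcome latent to the numeric outcome
        self.decoder = nn.GRU(emb_dim, z_dim + y_dim, batch_first=True)
        self.out = nn.Linear(z_dim + y_dim, vocab_size)

    def encode(self, tokens):
        _, h = self.encoder(self.embed(tokens))
        h = h.squeeze(0)                         # (batch, emb_dim)
        return self.to_z(h), self.to_y(h)

    def decode(self, tokens, z, y):
        # teacher forcing: feed tokens[:, :-1], predict tokens[:, 1:],
        # conditioned on the latent pair (z, y) as the initial hidden state
        state = torch.cat([z, y], dim=-1).unsqueeze(0)
        out, _ = self.decoder(self.embed(tokens[:, :-1]), state)
        return self.out(out)                     # (batch, seq_len - 1, vocab)


def pair_loss(model, sent_a, sent_b, outcome_a, outcome_b):
    """Loss terms for one pseudo-parallel pair: similar content, different outcomes."""
    z_a, y_a = model.encode(sent_a)
    z_b, y_b = model.encode(sent_b)

    # predict each sentence's numeric outcome from its outcome latent only
    regression = F.mse_loss(model.regressor(y_a).squeeze(-1), outcome_a) + \
                 F.mse_loss(model.regressor(y_b).squeeze(-1), outcome_b)

    # pseudo-parallel sentences share content, so pull their content latents together
    content_similarity = F.mse_loss(z_a, z_b)

    # dual reconstruction: rebuild each sentence from the partner's content
    # latent and its own outcome latent, exploiting the coupling of the pair
    logits_a = model.decode(sent_a, z_b, y_a)
    logits_b = model.decode(sent_b, z_a, y_b)
    dual = F.cross_entropy(logits_a.transpose(1, 2), sent_a[:, 1:]) + \
           F.cross_entropy(logits_b.transpose(1, 2), sent_b[:, 1:])

    return regression + content_similarity + dual
```

In a full implementation the loss terms would be weighted, the latents would typically be variational with a KL regularizer, and non-parallel sentences would contribute ordinary reconstruction and regression losses; at inference time the output sentence would be generated by adjusting the outcome latent toward the desired target value before decoding.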
