Continuous Prompt Tuning Based Textual Entailment Model for E-commerce Entity Typing

Type: Article

Publication Date: 2022-12-17

Citations: 6

DOI: https://doi.org/10.1109/bigdata55660.2022.10020766

Abstract

The explosion of e-commerce has created a need for processing and analyzing product titles, for example entity typing in product titles. However, the rapid pace of e-commerce activity has led to the rapid emergence of new entities, which are difficult for general entity typing systems to handle. In addition, product titles in e-commerce have very different language styles from text in the general domain. To handle new entities in product titles and address the special language styles of the e-commerce domain, we propose a textual entailment model with continuous prompt tuning based hypotheses and fusion embeddings for e-commerce entity typing. First, we reformulate entity typing as a textual entailment problem, which lets the model handle new entities that are not present during training. Second, we design a model that automatically generates textual entailment hypotheses using a continuous prompt tuning method, producing better hypotheses without manual template design. Third, we fuse BERT embeddings with CharacterBERT embeddings to address the mismatch between the language styles of e-commerce product titles and general-domain text. To analyze the effect of each contribution, we compare the performance of the entity typing and textual entailment models and conduct ablation studies on continuous prompt tuning and fusion embeddings. We also evaluate the impact of different prompt template initializations for continuous prompt tuning. Our proposed model improves the average F1 score by around 2% compared to the baseline BERT entity typing model.
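The entailment reformulation described in the abstract can be sketched as follows. Everything here is an illustrative assumption, not the paper's implementation: the type inventory, the textual template (the paper tunes continuous prompt embeddings instead of fixed words), and the token-overlap scorer (a stand-in for the fine-tuned BERT/CharacterBERT entailment model).

```python
# Hypothetical sketch: entity typing recast as textual entailment.
# The premise is the product title; each candidate type is verbalized
# into a hypothesis, and the highest-scoring hypothesis wins.

TYPE_SET = ["brand", "color", "material", "product"]  # assumed label set

def build_hypothesis(entity: str, type_label: str) -> str:
    """Verbalize a candidate type as an entailment hypothesis.
    In the paper the template tokens are continuous (learned embeddings);
    a fixed textual template is used here purely for illustration."""
    return f"{entity} is a {type_label}."

def entailment_score(premise: str, hypothesis: str) -> float:
    """Placeholder scorer: trivial token overlap. A real system would
    run the (premise, hypothesis) pair through a fine-tuned encoder
    built on fused BERT + CharacterBERT embeddings."""
    p = set(premise.lower().split())
    h = set(hypothesis.lower().rstrip(".").split())
    return len(p & h) / max(len(h), 1)

def predict_type(title: str, entity: str) -> str:
    """Pick the type whose hypothesis the title most strongly entails."""
    return max(TYPE_SET,
               key=lambda t: entailment_score(title, build_hypothesis(entity, t)))
```

Because the label set only enters through the hypotheses, new types unseen at training time can be scored the same way, which is the motivation for the entailment reformulation.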

Locations

  • arXiv (Cornell University)
  • 2022 IEEE International Conference on Big Data (Big Data)

Similar Works

  • Continuous Prompt Tuning Based Textual Entailment Model for E-commerce Entity Typing (2022) - Yibo Wang, Congying Xia, Guan Wang, Philip S. Yu
  • Prompt-based Text Entailment for Low-Resource Named Entity Recognition (2022) - Dongfang Li, Baotian Hu, Qingcai Chen
  • E-BERT: A Phrase and Product Knowledge Enhanced Language Model for E-commerce (2020) - Denghui Zhang, Zixuan Yuan, Yanchi Liu, Zuohui Fu, Fuzhen Zhuang, Pengyang Wang, Haifeng Chen, Hui Xiong
  • Improving Text Matching in E-Commerce Search with A Rationalizable, Intervenable and Fast Entity-Based Relevance Model (2023) - Jiong Cai, Yong Jiang, Yue Zhang, Chenyue Jiang, Ke Yu, Jianhui Ji, Rong Xiao, Haihong Tang, Tao Wang, Zhongqiang Huang
  • APrompt4EM: Augmented Prompt Tuning for Generalized Entity Matching (2024) - Yikuan Xia, J.M. Chen, Xinchi Li, Jun Gao
  • A Semantic Mention Graph Augmented Model for Document-Level Event Argument Extraction (2024) - Jian Zhang, Changlin Yang, Haiping Zhu, Qika Lin, Fangzhi Xu, Jun S. Liu
  • Pre-training Tasks for User Intent Detection and Embedding Retrieval in E-commerce Search (2022) - Yiming Qiu, Chenyu Zhao, Han Zhang, Jingwei Zhuo, Tianhao Li, Xiaowei Zhang, Songlin Wang, Sulong Xu, Bo Long, Wenyun Yang
  • Bi-Directional Iterative Prompt-Tuning for Event Argument Extraction (2022) - Lu Dai, Bang Wang, Wei Xiang, Yijun Mo
  • How does prompt engineering affect ChatGPT performance on unsupervised entity resolution? (2023) - Khanin Sisaengsuwanchai, Navapat Nananukul, Mayank Kejriwal
  • LaTeX-Numeric: Language-agnostic Text attribute eXtraction for E-commerce Numeric Attributes (2021) - Kartik Mehta, Ioana Oprea, Nikhil Rasiwasia
  • A Sequence to Sequence Model for Extracting Multiple Product Name Entities from Dialog (2021) - Praneeth Gubbala, Xuan Zhang
  • PromptNER: Prompt Locating and Typing for Named Entity Recognition (2023) - Yongliang Shen, Zeqi Tan, Shuhui Wu, Wenqi Zhang, Rongsheng Zhang, Yadong Xi, Weiming Lü, Yueting Zhuang
  • LLaMA-E: Empowering E-commerce Authoring with Multi-Aspect Instruction Following (2023) - Kaize Shi, Xueyao Sun, Dingxian Wang, Yinlin Fu, Guandong Xu, Qing Li
  • Generative Entity Typing with Curriculum Learning (2022) - Siyu Yuan, Deqing Yang, Jiaqing Liang, Zhixu Li, Jinxi Liu, Jingyue Huang, Yanghua Xiao
  • Hybrid Attention-Based Transformer Block Model for Distant Supervision Relation Extraction (2020) - Yan Xiao, Yaochu Jin, Ran Cheng, Kuangrong Hao
  • An Effective System for Multi-format Information Extraction (2021) - Yaduo Liu, Longhui Zhang, Shujuan Yin, Xiaofeng Zhao, Feiliang Ren

Works Cited by This (13)

  • Character-Aware Neural Language Models (2016) - Yoon Kim, Yacine Jernite, David Sontag, Alexander M. Rush
  • Character-level language modeling with hierarchical recurrent neural networks (2017) - Kyuyeon Hwang, Wonyong Sung
  • Benchmarking Zero-shot Text Classification: Datasets, Evaluation and Entailment Approach (2019) - Wenpeng Yin, Jamaal Hay, Dan Roth
  • CharBERT: Character-aware Pre-trained Language Model (2020) - Wentao Ma, Yiming Cui, Chenglei Si, Ting Liu, Shijin Wang, Guoping Hu
  • AutoPrompt: Eliciting Knowledge from Language Models with Automatically Generated Prompts (2020) - Taylor Shin, Yasaman Razeghi, Robert L. Logan, Eric Wallace, Sameer Singh
  • CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters (2020) - Hicham El Boukkouri, Olivier Ferret, Thomas Lavergne, Hiroshi Noji, Pierre Zweigenbaum, Jun'ichi Tsujii
  • Canine: Pre-training an Efficient Tokenization-Free Encoder for Language Representation (2022) - Jonathan H. Clark, Dan Garrette, Iulia Turc, John Wieting
  • Charformer: Fast Character Transformers via Gradient-based Subword Tokenization (2021) - Yi Tay, Vinh Q. Tran, Sebastian Ruder, Jai Prakash Gupta, Hyung Won Chung, Dara Bahri, Zhen Qin, Simon Baumgartner, Cong Yu, Donald Metzler
  • Pre-train, Prompt, and Predict: A Systematic Survey of Prompting Methods in Natural Language Processing (2022) - Pengfei Liu, Weizhe Yuan, Jinlan Fu, Zhengbao Jiang, Hiroaki Hayashi, Graham Neubig
  • The Power of Scale for Parameter-Efficient Prompt Tuning (2021) - Brian Lester, Rami Al-Rfou, Noah Constant