In-context learning is the paradigm that adapts large language models to...
Compositional generalization–understanding unseen combinations of seen p...
Abstraction is a desirable capability for deep learning models, which me...
With the rapid development of pre-training techniques, a number of langu...
The task of generating code solutions for a given programming problem ca...
Code generation is a longstanding challenge, aiming to generate a code s...
Large language models such as GPT-3 and PaLM have shown remarkable perfo...
Recently, the prompt-tuning paradigm has attracted significant attention...
Reasoning over natural language is a long-standing goal for the research...
In recent years, pre-trained language models have achieved success in modeling natur...
Neural sequence models exhibit limited compositional generalization abil...
Neural semantic parsers usually fail to parse long and complex utterance...
Human intelligence exhibits compositional generalization (i.e., the capa...
We formalize human language understanding as a structured prediction tas...
Compositional generalization is a basic but essential intellective capab...