Learning paradigms for large language models (LLMs) currently tend to fa...
With increasing developments in quantum computing, the availability o...
Drones have been widely used in many areas of our daily lives. They reliev...
The volume of User Generated Content (UGC) has increased in recent years...
Quantum annealers are specialized quantum computers for solving combinat...
Large language models (LLMs) have demonstrated remarkable performance ac...
In this paper, we investigate the realization of covert communication in...
Autonomous navigation in unknown environments with obstacles remains cha...
Over-the-air computation (AirComp), as a data aggregation method that ca...
Trustworthy answer content is abundant in many high-resource languages a...
Multilingual sequence-to-sequence models perform poorly with increased l...
The recent LLMs like GPT-4 and PaLM-2 have made tremendous progress in s...
Data scarcity is a crucial issue for the development of highly multiling...
The field of text-to-image (T2I) generation has garnered significant att...
In recent years, large language models (LLMs) have made significant prog...
In this paper, we examine the computational complexity of enumeration in...
Replanning in temporal logic tasks is extremely difficult during the onl...
In recent years, pre-trained large language models have demonstrated rem...
Recent years have witnessed the breakthrough of face recognition with deep co...
Recently, there has been significant progress in teaching language model...
Diffusion Weighted Imaging (DWI) is an advanced imaging technique common...
We consider novelty detection in time series with unknown and nonparamet...
Recent work on tokenizer-free multilingual pretrained models shows promis...
Widely used deep learning models are found to have poor robustness. Litt...
Deep supervision, also known as 'intermediate supervision' or 'auxiliary s...
Generating a test suite for a quantum program such that it has the maxim...
The performance of multilingual pretrained models is highly dependent on...
Natural language inference (NLI) is an important task for producing usef...
Adapters are light-weight modules that allow parameter-efficient fine-tu...
Time is an important dimension in our physical world. Lots of facts can ...
An innovations sequence of a time series is a sequence of independent an...
Recently developed large pre-trained language models, e.g., BERT, have a...
Although deep learning models have driven state-of-the-art performance o...
With populations ageing, the number of people with dementia worldwide is...
Multilingual pretrained representations generally rely on subword segmen...
To mitigate the negative effect of low quality training data on the perf...
Back-translation is an effective strategy to improve the performance of ...
Broader transparency in descriptions of and communication regarding AI s...
To improve the performance of Neural Machine Translation (NMT) for low-r...
When training multilingual machine translation (MT) models that can tran...
We present a deep generative model for unsupervised text style transfer ...
To acquire a new skill, humans learn better and faster if a tutor, based...
Neural sequence to sequence models are well established for applications...
Neural networks are known to be data hungry and domain sensitive, but it...
To improve low-resource Neural Machine Translation (NMT) with multilingu...
In this paper, we describe compare-mt, a tool for holistic analysis and ...
This paper describes the ARIEL-CMU submissions to the Low Resource Human...
Multilingual training of neural machine translation (NMT) systems has le...
Recent advances in Neural Machine Translation (NMT) show that adding syn...
In this work, we examine methods for data augmentation for text-based ta...