Extensive studies have shown that deep learning models are vulnerable to...
Human-centric perceptions (e.g., pose estimation, human parsing, pedestr...
Data in real-world object detection often exhibits the long-tailed distr...
The long-tail distribution is widespread in real-world applications. Due ...
Transformer architecture has become the fundamental element of the wides...
Multi-Object Tracking (MOT) is one of the most fundamental computer visi...
Larger deep learning models usually lead to higher model quality with an...
Recently, post-training quantization (PTQ) has drawn much attention to ...
Despite the recent success of long-tailed object detection, almost all l...
Model quantization has emerged as an indispensable technique to accelera...
Recently, large-scale Contrastive Language-Image Pre-training (CLIP) has...
Deep neural networks (DNNs) are vulnerable to adversarial noises, which ...
Motivated by the success of Transformers in natural language processing ...
Quantization has emerged as one of the most prevalent approaches to comp...
We study the challenging task of neural network quantization without end...
User data confidentiality protection is a growing challenge in t...
Network quantization has rapidly become one of the most widely used meth...
Recently, low-bit (e.g., 8-bit) network quantization has been extensively...
Weight and activation binarization is an effective approach to deep neur...
Hardware-friendly network quantization (e.g., binary/uniform quantizatio...
Detection and learning-based appearance features play the central role in...