Placing a human in the loop may abate the risks of deploying AI systems ...
Conducting experiments with diverse participants in their native languag...
Understanding the extent to which the perceptual world can be recovered ...
Should we care whether AI systems have representations of the world that...
Learning transferable representations by training a classifier is a well...
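As a loose illustration of this setup, the sketch below trains a small classifier on a synthetic source task and then reuses its frozen penultimate-layer activations as features for a second task. The data, architecture, and hyperparameters are placeholders, not anything from the work above.

```python
# Minimal sketch: train a classifier, then transfer its learned representation
# by freezing the backbone and fitting only a new linear head on another task.
# All data here is synthetic stand-in data.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic "source" task: 2 classes in 20 dimensions.
X_src = torch.randn(512, 20)
y_src = (X_src[:, 0] + X_src[:, 1] > 0).long()

backbone = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 32), nn.ReLU())
head_src = nn.Linear(32, 2)
opt = torch.optim.Adam(list(backbone.parameters()) + list(head_src.parameters()), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for _ in range(200):  # train the source classifier end to end
    opt.zero_grad()
    loss_fn(head_src(backbone(X_src)), y_src).backward()
    opt.step()

# Transfer: freeze the backbone, fit only a new linear head on a "target" task.
X_tgt = torch.randn(256, 20)
y_tgt = (X_tgt[:, 2] > 0).long()
for p in backbone.parameters():
    p.requires_grad_(False)
head_tgt = nn.Linear(32, 2)
opt_tgt = torch.optim.Adam(head_tgt.parameters(), lr=1e-2)
for _ in range(200):
    opt_tgt.zero_grad()
    loss_fn(head_tgt(backbone(X_tgt)), y_tgt).backward()
    opt_tgt.step()
```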
Diffusion models are a class of generative models that learn to synthesi...
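For concreteness, here is a minimal DDPM-style training sketch under the standard noise-prediction objective. The toy 2-D dataset, linear noise schedule, and tiny MLP are assumptions standing in for real data and architectures; sampling (not shown) would reverse the noising process step by step.

```python
# Minimal diffusion-training sketch: corrupt data with the forward process
# x_t = sqrt(abar_t) * x_0 + sqrt(1 - abar_t) * eps, and train a network to
# predict the added noise eps.
import torch
import torch.nn as nn

torch.manual_seed(0)
T = 100
betas = torch.linspace(1e-4, 0.02, T)
alphas_bar = torch.cumprod(1.0 - betas, dim=0)

model = nn.Sequential(nn.Linear(2 + 1, 64), nn.ReLU(), nn.Linear(64, 2))  # predicts noise
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
data = torch.randn(1024, 2) * 0.5 + 2.0  # toy 2-D "dataset"

for _ in range(500):
    x0 = data[torch.randint(0, len(data), (128,))]
    t = torch.randint(0, T, (128,))
    noise = torch.randn_like(x0)
    a_bar = alphas_bar[t].unsqueeze(1)
    x_t = a_bar.sqrt() * x0 + (1 - a_bar).sqrt() * noise   # forward (noising) process
    pred = model(torch.cat([x_t, t.float().unsqueeze(1) / T], dim=1))
    loss = ((pred - noise) ** 2).mean()                    # noise-prediction objective
    opt.zero_grad()
    loss.backward()
    opt.step()
```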
Recent advances in multimodal training use textual descriptions to signi...
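One common way textual descriptions enter multimodal training is a CLIP-style contrastive objective. The sketch below is a generic illustration of that idea, with random projections standing in for real image and text encoders; it is not the specific method referred to above.

```python
# Generic contrastive image-text alignment sketch: matching pairs are pulled
# together and mismatched pairs pushed apart via a symmetric cross-entropy loss.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
B, d_img, d_txt, d = 32, 512, 300, 64
img_feats = torch.randn(B, d_img)   # placeholder image features
txt_feats = torch.randn(B, d_txt)   # placeholder caption features (row i describes image i)

img_proj = torch.nn.Linear(d_img, d)
txt_proj = torch.nn.Linear(d_txt, d)
opt = torch.optim.Adam(list(img_proj.parameters()) + list(txt_proj.parameters()), lr=1e-3)

for _ in range(100):
    zi = F.normalize(img_proj(img_feats), dim=1)
    zt = F.normalize(txt_proj(txt_feats), dim=1)
    logits = zi @ zt.t() / 0.07                 # cosine similarities / temperature
    labels = torch.arange(B)                    # matching pairs lie on the diagonal
    loss = (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels)) / 2
    opt.zero_grad()
    loss.backward()
    opt.step()
```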
Similarity judgments provide a well-established method for accessing men...
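As a rough illustration of the pipeline, the sketch below turns a matrix of pairwise similarity ratings into a low-dimensional spatial representation via metric MDS. The 1-7 rating scale and the random matrix are assumptions, not real participant judgments.

```python
# From similarity judgments to a spatial representation: convert averaged
# similarity ratings into dissimilarities and embed them with MDS.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
n_items = 10
ratings = rng.uniform(1, 7, size=(n_items, n_items))  # stand-in 1-7 similarity ratings
ratings = (ratings + ratings.T) / 2                   # symmetrize averaged judgments
np.fill_diagonal(ratings, 7.0)                        # an item is maximally similar to itself

dissimilarity = 7.0 - ratings                         # similarity -> dissimilarity
embedding = MDS(n_components=2, dissimilarity="precomputed",
                random_state=0).fit_transform(dissimilarity)
print(embedding.shape)  # (10, 2): a 2-D spatial layout of the items
```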
Being able to learn from small amounts of data is a key characteristic o...
Increasingly large datasets are rapidly driving up the computational cos...
Using prototype methods to reduce the size of training datasets can dras...
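One simple prototype scheme, shown below purely for illustration, replaces each class with a handful of k-means centroids and trains on those instead of the full set; it is a generic example, not the particular method discussed above.

```python
# Prototype-based dataset reduction sketch: keep k centroids per class and
# compare a classifier trained on the prototypes against one trained on all data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=2000, n_features=20, n_informative=10,
                           n_classes=3, random_state=0)

k = 10  # prototypes per class
proto_X, proto_y = [], []
for c in np.unique(y):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X[y == c])
    proto_X.append(km.cluster_centers_)
    proto_y.append(np.full(k, c))
proto_X, proto_y = np.vstack(proto_X), np.concatenate(proto_y)

full = LogisticRegression(max_iter=1000).fit(X, y).score(X, y)
small = LogisticRegression(max_iter=1000).fit(proto_X, proto_y).score(X, y)
print(f"trained on {len(X)} points: {full:.3f}; on {len(proto_X)} prototypes: {small:.3f}")
```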
We leverage what are typically considered the worst qualities of deep le...
Deep neural networks require large training sets but suffer from high co...
Dataset distillation is a method for reducing dataset sizes: the goal is...
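The sketch below illustrates the underlying bi-level idea under heavy simplification: a few synthetic examples are optimized so that a single differentiable gradient step of a linear model on them yields low loss on real data. Practical distillation methods use longer inner loops and real architectures; everything here is a placeholder.

```python
# Simplified dataset-distillation sketch: optimize synthetic data by
# backpropagating through one inner training step of a linear model.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
d, n_classes = 20, 3
X_real = torch.randn(600, d)
y_real = X_real[:, :3].argmax(dim=1)

# Synthetic set: one learnable example per class, with fixed labels.
X_syn = torch.randn(n_classes, d, requires_grad=True)
y_syn = torch.arange(n_classes)
outer_opt = torch.optim.Adam([X_syn], lr=0.1)
inner_lr = 0.5

for _ in range(300):
    w = torch.zeros(d, n_classes, requires_grad=True)     # fresh inner model
    inner_loss = F.cross_entropy(X_syn @ w, y_syn)
    (g,) = torch.autograd.grad(inner_loss, w, create_graph=True)
    w_new = w - inner_lr * g                               # one differentiable inner step
    outer_loss = F.cross_entropy(X_real @ w_new, y_real)   # evaluate on real data
    outer_opt.zero_grad()
    outer_loss.backward()                                  # gradient flows into X_syn
    outer_opt.step()
```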
Most real-world datasets, and particularly those collected from physical...