
Total Results 3,409

*
Learning Mahalanobis metric spaces is an important problem that has found numerous applications. Several algorithms have been designed for this problem, including Information Theoretic Metric Learning (ITML) [Davis et al. 2007] and Large Margin Nearest ...
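To make the snippet above concrete, a Mahalanobis metric parameterises distance by a positive semidefinite matrix M. The sketch below uses a random M as a stand-in for one produced by a learner such as ITML; it is not the output of any of the cited methods.

```python
import numpy as np

def mahalanobis(x, y, M):
    """sqrt((x - y)^T M (x - y)) for a PSD matrix M."""
    d = x - y
    return float(np.sqrt(d @ M @ d))

rng = np.random.default_rng(0)
L = rng.standard_normal((3, 3))
M = L @ L.T                                 # M = L L^T is PSD by construction

x = np.array([1.0, 0.0, 0.0])
y = np.array([0.0, 1.0, 0.0])
print(mahalanobis(x, y, M))                 # distance under the learned-style metric
print(mahalanobis(x, y, np.eye(3)))         # M = I recovers Euclidean distance
```

Parameterising M as L L^T is a common trick to keep the learned matrix valid (PSD) during optimisation.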

*
We propose Support-guided Adversarial Imitation Learning (SAIL), a generic imitation learning framework that unifies support estimation of the expert policy with the family of Adversarial Imitation Learning (AIL) algorithms. SAIL addresses two important ...

*
Despite the rapid development of adversarial attacks for machine learning models, many types of adversarial examples still remain unknown. Undiscovered types of adversarial attacks pose a serious concern for the safety of the models, raising questions about the effectiveness of current adversarial robustness evaluation ...

*
Transfer learning has proven to be a successful way to train high-performing deep learning models in various applications for which little labeled data is available. In transfer learning, one pre-trains the model on a large dataset such as Imagenet or MS- ...

Published At: 2020-06-23
Tasks: Semantic Segmentation, Transfer Learning, Object Detection, Image Classification, Instance Segmentation
Authors: Anonymous
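A minimal sketch of the transfer-learning recipe described above: treat a "pre-trained backbone" as a frozen feature extractor and train only a new linear head on a small labelled target set. All data, shapes, and weights here are synthetic stand-ins, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)
W_backbone = rng.standard_normal((8, 4))    # frozen "pre-trained" weights

def features(x):
    return np.tanh(x @ W_backbone)          # the backbone is never updated

X = rng.standard_normal((32, 8))            # tiny labelled target dataset
y = (X[:, 0] > 0).astype(float)

F = features(X)
w = np.zeros(4)                             # new task head, trained from scratch
for _ in range(200):                        # gradient descent on logistic loss
    p = 1.0 / (1.0 + np.exp(-F @ w))
    w -= 0.5 * F.T @ (p - y) / len(y)

acc = ((F @ w > 0) == (y > 0.5)).mean()
print(f"train accuracy of the new head: {acc:.2f}")
```

In practice the backbone would be a deep network pre-trained on a large dataset, and one may also unfreeze it for fine-tuning; the frozen-features variant shown here is just the simplest case.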

*
In a continual learning setting, new categories may be introduced over time, and an ideal learning system should perform well on both the original categories and the new categories. While deep neural nets have achieved resounding success in the classical ...

*
Learning from limited exemplars (few-shot learning) is a fundamental, unsolved problem that has been laboriously explored in the machine learning community. However, current few-shot learners are mostly supervised and rely heavily on a large amount of ...

Published At: 2020-06-23
Tasks: Person Re-Identification, Omniglot, Few-Shot Learning
Authors: Anonymous

*
Image paragraph captioning is the task of automatically generating multiple sentences for describing images in fine-grained and coherent text. Typical existing deep learning-based models for image captioning consist of an image encoder to extract visual features and a language model decoder, which have shown promising results in single high-level sentence generation ...

Published At: 2020-06-23
Tasks: Language Modelling, Image Captioning, Image Paragraph Captioning
Authors: Anonymous

*
Discovering the underlying mathematical expressions describing a dataset is a core challenge for artificial intelligence. This is the problem of symbolic regression ...
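To illustrate what symbolic regression means, the toy below exhaustively searches a tiny expression grammar for the formula that best fits some data. Real systems search vastly larger spaces (e.g. via genetic programming or learned policies); this brute force and its grammar are purely illustrative.

```python
import itertools

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 2.0, 5.0, 10.0]           # secretly generated by y = x**2 + 1

# A deliberately tiny grammar: one unary term plus one constant offset.
unary = {"x": lambda x: x, "x**2": lambda x: x ** 2, "x**3": lambda x: x ** 3}
consts = [0.0, 1.0, 2.0]

best = None
for (name, f), c in itertools.product(unary.items(), consts):
    err = sum((f(x) + c - y) ** 2 for x, y in zip(xs, ys))
    if best is None or err < best[0]:
        best = (err, f"{name} + {c}")

print("best expression:", best[1])   # prints: best expression: x**2 + 1.0
```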

*
Long-term video prediction is highly challenging since it entails simultaneously capturing spatial and temporal information across a long range of image frames. Standard recurrent models are ineffective since they are prone to error propagation and cannot ...

*
Representation learning promises to unlock deep learning for the long tail of vision tasks without expensive labelled datasets. Yet, the absence of a unified yardstick to evaluate general visual representations hinders progress ...

*
We address the problem of reconstructing a matrix from a subset of its entries. Current methods, branded as geometric matrix completion, augment classical rank regularization techniques by incorporating geometric information into the solution ...

*
Deep energy-based models are powerful, but pose challenges for learning and inference (Belanger & McCallum, 2016). Tu & Gimpel (2018) developed an efficient framework for energy-based models by training “inference networks” to approximate structured ...

*
Current state-of-the-art results in multilingual natural language inference (NLI) are based on tuning XLM (a pre-trained polyglot language model) separately for each language involved, resulting in multiple models. We reach significantly higher NLI results with a single model for all languages via multilingual tuning ...

Published At: 2020-06-23
Tasks: Language Modelling, Natural Language Inference, Sentence Embeddings
Authors: Anonymous

*
We consider statistical learning problems where the distribution P′ of the training observations Z′_1, …, Z′_n differs from the distribution P involved in the risk one seeks to minimize (referred to as the test distribution) but is still defined on the same measurable space as P and dominates it.
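The setting above (training distribution P′ differs from the test distribution P but dominates it) is classically handled by importance weighting: reweight each training example by the density ratio dP/dP′ so the weighted empirical average is consistent for the test-distribution expectation. A toy sketch with Gaussians, where the ratio is known in closed form:

```python
import numpy as np

rng = np.random.default_rng(0)

def density(x, mu):
    """Density of N(mu, 1) at x."""
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2 * np.pi)

z = rng.normal(1.0, 1.0, size=100_000)      # training sample from P' = N(1, 1)
w = density(z, 0.0) / density(z, 1.0)       # density ratio dP/dP' with P = N(0, 1)

# Estimate E_P[Z^2] (a stand-in for a test risk) from P'-samples.
naive = np.mean(z ** 2)                     # biased: converges to E_{P'}[Z^2] = 2
weighted = np.mean(w * z ** 2)              # consistent for E_P[Z^2] = 1
print(f"naive: {naive:.2f}  importance-weighted: {weighted:.2f}")
```

In practice the density ratio is unknown and must itself be estimated, which is where the real difficulty (and the paper's analysis setting) lies.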

*
Robustness is an important property to guarantee the security of machine learning models. It has recently been demonstrated that strong robustness certificates can be obtained on ensemble classifiers generated by input randomization ...

*
Gradient-based meta-learning algorithms require several steps of gradient descent to adapt to newly incoming tasks. This process becomes more costly as the number of samples increases ...
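The inner loop mentioned above (a few gradient steps adapting a meta-learned initialisation to a new task, as in MAML-style methods) can be sketched on a toy quadratic task loss; the loss, learning rate, and targets are illustrative only.

```python
import numpy as np

def adapt(theta, task_target, lr=0.3, steps=5):
    """Take `steps` gradient steps on the toy task loss ||theta - task_target||^2 / 2."""
    for _ in range(steps):
        grad = theta - task_target          # gradient of the toy loss
        theta = theta - lr * grad
    return theta

theta0 = np.zeros(2)                        # stand-in for a meta-learned initialisation
target = np.array([1.0, -1.0])              # stand-in for a new task's optimum
adapted = adapt(theta0, target)
print(adapted)                              # has moved most of the way to the optimum
```

The cost the abstract refers to comes from running this loop (and differentiating through it during meta-training) for every new task; more adaptation samples mean more expensive inner gradients.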

*
It is fundamental and challenging to train robust and accurate Deep Neural Networks (DNNs) when semantically abnormal examples exist. Although great progress has been made, there is still one crucial research question which is not thoroughly explored yet: ...

*
In this paper we prove new universal approximation theorems for deep learning on point clouds that do not assume fixed cardinality. We do this by first generalizing the classical universal approximation theorem to general compact Hausdorff spaces and then ...

*
Convolutional neural networks (CNNs) excel in image recognition and generation. Among many efforts to explain their effectiveness, experiments show that CNNs carry strong inductive biases that capture natural image priors ...

*
Deep Infomax (DIM) is an unsupervised representation learning framework that maximizes the mutual information between the inputs and the outputs of an encoder while imposing probabilistic constraints on the outputs. In this paper, we propose Supervised ...

Published At: 2020-06-23
Tasks: Representation Learning, Unsupervised Representation Learning
Authors: Anonymous