Total Results: 1,314

*
A fundamental question in many data analysis settings is the problem of discerning the "natural" dimension of a data set. That is, when a data set is drawn from a manifold (possibly with noise), a meaningful aspect of the data is the dimension of that manifold. Various approaches exist for estimating this dimension, such as the method of Secant-Avoidance Projection (SAP).
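
As a toy illustration of the geometric idea behind secant-based methods (a stdlib sketch, not the SAP algorithm itself): for data lying on a line, every normalized secant points in the same direction up to sign, which is exactly the signal that the intrinsic dimension is one.

```python
import itertools
import math

# Toy data: points on a 1-D line embedded in R^3, so the "natural" dimension is 1.
points = [(t, 2.0 * t, -t) for t in range(1, 6)]

def normalized_secants(pts):
    """Unit vectors between every pair of distinct points."""
    secants = []
    for p, q in itertools.combinations(pts, 2):
        d = [a - b for a, b in zip(p, q)]
        norm = math.sqrt(sum(x * x for x in d))
        secants.append(tuple(x / norm for x in d))
    return secants

# For data on a line, every secant is (up to sign) the same unit vector:
# the geometric signal that the intrinsic dimension is 1.
directions = {tuple(round(abs(x), 6) for x in s) for s in normalized_secants(points)}
print(len(directions))  # 1
```

For data filling a higher-dimensional region, the secant directions spread out instead of collapsing to a single direction.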

Published At:
2020-06-12

Tasks: Dimensionality Reduction

Authors: Henry Kvinge, Elin Farnell, Michael Kirby, Chris Peterson

*
Current popular methods for Magnetic Resonance Fingerprint (MRF) recovery are bottlenecked by the heavy computations of a matched-filtering step due to the growing size and complexity of the fingerprint dictionaries in multi-parametric quantitative MRI applications. Cover trees are robust against the curse of dimensionality, and CoverBLIP therefore provides a notion of scalability, a consistent gain in time-accuracy performance, for searching high-dimensional atoms which may not be easily preprocessed (i.e., for dimensionality reduction) due to the increasing degrees of non-linearity appearing in the emerging multi-parametric MRF dictionaries.
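
Cover trees accelerate such searches by organizing atoms so that whole branches can be ruled out with the triangle inequality. A minimal stdlib sketch of that pruning mechanism (not a cover tree proper; the single pivot here is a hypothetical stand-in for a tree node):

```python
import math

def dist(a, b):
    """Euclidean distance between two vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nn_with_pruning(query, atoms, pivot):
    """Nearest neighbor of `query`, skipping atoms the triangle inequality rules out."""
    d_qp = dist(query, pivot)
    best, best_d, skipped = None, float("inf"), 0
    for a in atoms:
        # d(query, a) >= |d(query, pivot) - d(pivot, a)|, so if that lower
        # bound already exceeds the current best distance, `a` cannot win.
        if abs(dist(pivot, a) - d_qp) >= best_d:
            skipped += 1
            continue
        d = dist(query, a)
        if d < best_d:
            best, best_d = a, d
    return best, best_d, skipped

# Toy dictionary of 50 two-dimensional "atoms".
atoms = [(float(i), float(i % 7)) for i in range(50)]
best, best_d, skipped = nn_with_pruning((10.2, 3.1), atoms, pivot=(0.0, 0.0))
```

The result is exact (only provably losing atoms are skipped); a cover tree applies the same bound hierarchically, one pivot per node.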

Published At:
2020-06-12

Tasks: Dimensionality Reduction

3 starred

Authors: Mohammad Golbabaee, Zhouye Chen, Yves Wiaux, Mike Davies

*
In this work, we present a quantum neighborhood preserving embedding and a quantum local discriminant embedding for dimensionality reduction and classification. We demonstrate that these two algorithms have an exponential speedup over their respective classical counterparts. Along the way, we propose a variational quantum generalized eigenvalue solver that finds the generalized eigenvalues and eigenstates of a matrix pencil $(\mathcal{G},\mathcal{S})$. As a proof of principle, we implement our algorithm to solve $2^5\times2^5$ generalized eigenvalue problems.
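
For intuition, the generalized eigenvalue problem G v = lambda S v for a pencil (G, S) reduces, in the 2x2 case, to the quadratic det(G - lambda S) = 0. A stdlib sketch of that classical computation (toy sizes, not the quantum solver):

```python
import math

def gen_eigvals_2x2(G, S):
    """Generalized eigenvalues of the pencil (G, S): roots of det(G - x*S) = 0."""
    (g11, g12), (g21, g22) = G
    (s11, s12), (s21, s22) = S
    a = s11 * s22 - s12 * s21                            # det S
    b = -(g11 * s22 + g22 * s11 - g12 * s21 - g21 * s12)
    c = g11 * g22 - g12 * g21                            # det G
    disc = math.sqrt(b * b - 4 * a * c)
    return sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])

# det([[2 - x, 0], [0, 6 - 2x]]) = 0 has roots x = 2 and x = 3.
print(gen_eigvals_2x2([[2, 0], [0, 6]], [[1, 0], [0, 2]]))  # [2.0, 3.0]
```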

Published At:
2020-06-12

Tasks: Dimensionality Reduction

Authors: Jin-Min Liang, Shu-Qian Shen, Ming Li, Lei Li

*
Typical dimensionality reduction methods focus on directly reducing the number of random variables while retaining maximal variations in the data. In this paper, we consider dimensionality reduction in parameter spaces of binary multivariate distributions. We then revisit Boltzmann machines (BM) and theoretically show that both the single-layer BM without hidden units (SBM) and the restricted BM (RBM) can be solidly derived using the CIF principle.
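
For reference, the restricted Boltzmann machine named above is defined by a bilinear energy over visible units v and hidden units h; a stdlib sketch with made-up toy weights (illustrative only, not the paper's derivation):

```python
def rbm_energy(v, h, W, b, c):
    """E(v, h) = -v^T W h - b^T v - c^T h for a restricted Boltzmann machine."""
    vWh = sum(v[i] * W[i][j] * h[j] for i in range(len(v)) for j in range(len(h)))
    return -vWh - sum(bi * vi for bi, vi in zip(b, v)) - sum(cj * hj for cj, hj in zip(c, h))

# Toy parameters: 2 visible units, 2 hidden units.
W = [[0.5, -0.2], [0.1, 0.3]]
b, c = [0.0, 0.1], [-0.1, 0.2]
energy = rbm_energy([1, 0], [0, 1], W, b, c)
```

The "restricted" part is visible in the energy: there are no visible-visible or hidden-hidden terms, only the bilinear v-h coupling through W.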

Published At:
2020-06-12

Tasks: Dimensionality Reduction, Density Estimation

Authors: Xiaozhao Zhao, Yuexian Hou, Qian Yu, Dawei Song, Wenjie Li

*
This paper puts forth a novel bi-linear modeling framework for data recovery via manifold-learning and sparse-approximation arguments and considers its application to dynamic magnetic-resonance imaging (dMRI). Each temporal-domain MR image is viewed as a point on or near a smooth low-dimensional manifold. Recovery of the high-fidelity MRI data is realized by solving a non-convex minimization task for the linear decompression operator and those affine combinations of landmark points which locally approximate the latent manifold geometry.
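
The phrase "affine combinations of landmark points" can be made concrete with a stdlib toy: reconstruct a point as a weighted sum of nearby landmarks with weights summing to one (the numbers below are illustrative, not from the paper):

```python
# Two landmark points and a data point lying on the segment between them.
l1, l2 = (0.0, 0.0), (2.0, 4.0)
w = 0.75  # weight on l1; the weight on l2 is 1 - w, so the combination is affine

# Affine reconstruction: w * l1 + (1 - w) * l2.
recon = tuple(w * a + (1 - w) * b for a, b in zip(l1, l2))
print(recon)  # (0.5, 1.0)
```

Locally, points on a smooth manifold are well approximated by such affine combinations of nearby landmarks, which is what the recovery task exploits.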

Published At:
2020-06-12

Tasks: Dimensionality Reduction

Authors: Gaurav N. Shetty, Konstantinos Slavakis, Abhishek Bose, Ukash Nakarmi, Gesualdo Scutari, Leslie Ying

*
Canonical correlation analysis (CCA) is a powerful technique for discovering whether or not hidden sources are commonly present in two (or more) datasets. Its well-appreciated merits include dimensionality reduction, clustering, classification, and feature selection. The novel gCCA accounts for the graph-induced knowledge of common sources while minimizing the distance between the wanted canonical variables. One such setting includes kernels, which are incorporated to account for nonlinear data dependencies.
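
In the simplest setting, one variable per view with no projections to learn, the canonical correlation between the two datasets is just their Pearson correlation; a stdlib sketch on toy data with a shared hidden source (this is the degenerate case, not gCCA):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# One hidden source drives both views through different linear maps.
source = [0.0, 1.0, 2.0, 3.0, 4.0]
view_a = [2.0 * s + 1.0 for s in source]
view_b = [-0.5 * s + 3.0 for s in source]
print(pearson(view_a, view_b))  # close to -1.0: the views are perfectly linearly related
```

Full CCA additionally learns one projection per view so that the projected (canonical) variables are maximally correlated.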

Published At:
2020-06-12

Tasks: Dimensionality Reduction, Image Classification, Feature Selection

Authors: Gang Wang, Georgios B. Giannakis, Jia Chen, Yanning Shen

*
We introduce Geomstats, an open-source Python toolbox for computations and statistics on nonlinear manifolds, such as hyperbolic spaces, spaces of symmetric positive definite matrices, Lie groups of transformations, and many more. Statistics and learning algorithms provide methods for estimation, clustering and dimension reduction on manifolds.
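
A stdlib sketch (not Geomstats itself) of why statistics on manifolds need dedicated tools: the arithmetic mean of the angles 350 and 10 degrees is 180, yet the point on the circle between them is 0. Averaging the embedded unit vectors and mapping back to an angle gives the geometrically meaningful answer.

```python
import math

# Two nearby directions on the circle, straddling the 0/360 wrap-around.
angles = [math.radians(350), math.radians(10)]

# Naive average of the raw angle values lands on the far side of the circle.
naive_deg = math.degrees(sum(angles) / len(angles))  # 180

# Extrinsic mean: average the unit vectors, then map back to an angle.
x = sum(math.cos(a) for a in angles) / len(angles)
y = sum(math.sin(a) for a in angles) / len(angles)
mean_deg = math.degrees(math.atan2(y, x))
print(round(mean_deg))  # 0, not 180
```

Geomstats packages this kind of geometry-aware estimation for much richer manifolds (hyperbolic spaces, SPD matrices, Lie groups).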

Published At:
2020-06-12

Tasks: Dimensionality Reduction

Authors: Nina Miolane, Alice Le Brigant, Johan Mathe, Benjamin Hou, Nicolas Guigui, Yann Thanwerdas, Stefan Heyder, Olivier Peltre, Niklas Koep, Hadi Zaatiti, Hatem Hajri, ...

*
Graph Neural Networks (GNNs) are emerging machine learning models on graphs. Although sufficiently deep GNNs are theoretically capable of fully preserving graph structures, most existing GNN models in practice are shallow and essentially feature-centric. Without needing to increase depth, Eigen-GNN is more flexible in handling both feature-driven and structure-driven tasks, since its initial bases contain both node features and graph structures.
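
The "Eigen" ingredient, eigenvectors of a graph matrix used as structural features, can be illustrated with a stdlib power iteration on a toy adjacency matrix (an illustrative sketch, not the paper's pipeline):

```python
import math

# Toy 4-node graph: a triangle (0-1-2) with a pendant node 3 attached to node 2.
A = [[0, 1, 1, 0],
     [1, 0, 1, 0],
     [1, 1, 0, 1],
     [0, 0, 1, 0]]
n = len(A)

# Power iteration: repeatedly apply A and renormalize, converging to the
# leading eigenvector, a classic structural summary of the graph.
v = [1.0] * n
for _ in range(200):
    w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    norm = math.sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]

# Rayleigh quotient approximates the leading eigenvalue.
lam = sum(v[i] * sum(A[i][j] * v[j] for j in range(n)) for i in range(n))
```

Concatenating a few such eigenvectors to the node features gives a shallow model direct access to graph structure.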

Published At:
2020-06-12

Tasks: Dimensionality Reduction, Link Prediction, Node Classification

Authors: Xin Wang, Jian Pei, Ziwei Zhang, Peng Cui, Wenwu Zhu

*
Symmetric positive definite (SPD) matrices used as feature descriptors in image recognition are usually high dimensional. Traditional manifold learning is only applicable for reducing the dimension of high-dimensional vector-form data. To overcome this limitation, we propose a new dimension reduction algorithm on SPD matrix space to transform high-dimensional SPD matrices into low-dimensional SPD matrices.
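
The mathematical fact such algorithms rest on: a congruence W^T X W with a full-column-rank W maps an SPD matrix to a lower-dimensional SPD matrix. A stdlib check on a toy 3x3 example (an illustrative W, not the paper's learned map):

```python
def matmul(A, B):
    """Plain triple-loop matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B))) for j in range(len(B[0]))]
            for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

X = [[4.0, 1.0, 0.0],   # 3x3 SPD matrix (leading minors 4, 11, 18 are all positive)
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]
W = [[1.0, 0.0],        # full-column-rank 3x2 map down to dimension 2
     [1.0, 1.0],
     [0.0, 1.0]]

Y = matmul(transpose(W), matmul(X, W))   # the 2x2 congruence W^T X W
# Sylvester's criterion for 2x2: positive diagonal entry and positive determinant.
is_spd = Y[0][0] > 0 and Y[0][0] * Y[1][1] - Y[0][1] * Y[1][0] > 0
```

The dimension-reduction algorithm then amounts to choosing W so that the low-dimensional SPD matrices preserve the discriminative structure of the originals.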

Published At:
2020-06-12

Tasks: Dimensionality Reduction, Temporal Action Localization, Face Recognition

Authors: Yangyang Li, Ruqian Lu

*
Principal Component Analysis (PCA) and Kernel Principal Component Analysis (KPCA) are fundamental methods in machine learning for dimensionality reduction. The intrinsic Grassmann average of these subspaces is shown to coincide with the principal components of the observations when they are drawn from a Gaussian distribution.

Published At:
2020-06-12

Tasks: Dimensionality Reduction

Authors: Rudrasis Chakraborty, Søren Hauberg, Baba C. Vemuri

*
Part 2 of this monograph builds on the introduction to tensor networks and their operations presented in Part 1. It focuses on tensor network models for super-compressed higher-order representation of data/parameters and related cost functions. Through a graphical approach, we also elucidate how, by virtue of the underlying low-rank tensor approximations and sophisticated contractions of core tensors, tensor networks have the ability to perform distributed computations on otherwise prohibitively large volumes of data/parameters, thereby alleviating or even eliminating the curse of dimensionality.
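
The storage arithmetic behind that claim is simple to state: a dense order-N tensor with mode size I holds I^N entries, while a tensor-train with ranks R holds roughly N*I*R^2, linear in N instead of exponential. A stdlib sketch of the count (generic formulas, not tied to a specific model in the monograph):

```python
def full_params(I, N):
    """Entries in a dense order-N tensor with mode size I."""
    return I ** N

def tt_params(I, N, R):
    """Entries in a tensor-train: two boundary cores (I x R), N-2 interior cores (R x I x R)."""
    return 2 * I * R + (N - 2) * R * I * R

# Eight modes of size 10: the dense tensor needs a hundred million entries,
# a rank-4 tensor-train about a thousand.
print(full_params(10, 8))   # 100000000
print(tt_params(10, 8, 4))  # 1040
```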

Published At:
2020-06-12

Tasks: Dimensionality Reduction, Tensor Networks

12 starred

Authors: A. Cichocki, A-H. Phan, Q. Zhao, N. Lee, I. V. Oseledets, M. Sugiyama, D. Mandic

*
We propose a novel probabilistic dimensionality reduction framework that can naturally integrate the generative model and the locality information of data. Based on this framework, we present a new model that is able to learn a smooth skeleton of embedding points; this interpretation motivates the learning of embedding points that can directly form an explicit graph structure.

*
Discovering patterns of complex high-dimensional data is a long-standing problem. Dimension Reduction (DR) and Intrinsic Dimension Estimation (IDE) are two fundamental thematic programs that facilitate geometric understanding of the data. We present Rdimtools, an R package that supports 133 DR and 17 IDE algorithms, whose extent makes multifaceted scrutiny of the data in one place easier.

*
With the development of the multimedia era, multi-view data are generated in various fields. In contrast with single-view data, multi-view data bring more useful information and should be carefully exploited. Therefore, it is essential to fully exploit the complementary information embedded in multiple views to enhance the performance of many tasks.

Published At:
2020-06-12

Tasks: Dimensionality Reduction

Authors: Huibing Wang, Haohao Li, Xianping Fu

*
We introduce an approach based on the Givens representation for posterior inference in statistical models with orthogonal matrix parameters, such as factor models and probabilistic principal component analysis (PPCA). We also discuss how our Givens representation can be used to define general classes of distributions over the space of orthogonal matrices.
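
The representation can be sketched concretely: a rotation factors into plane (Givens) rotations, one angle per coordinate pair, so a set of unconstrained angles parameterizes an orthogonal matrix. A stdlib sketch with arbitrary example angles (illustrative, not the paper's inference scheme):

```python
import math

def givens(n, i, j, theta):
    """n x n identity with a rotation by theta in the (i, j) coordinate plane."""
    G = [[1.0 if r == c else 0.0 for c in range(n)] for r in range(n)]
    G[i][i] = G[j][j] = math.cos(theta)
    G[i][j], G[j][i] = -math.sin(theta), math.sin(theta)
    return G

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B))) for j in range(len(B[0]))]
            for i in range(len(A))]

# Three angles (one per coordinate pair) compose into one 3x3 rotation.
Q = matmul(givens(3, 0, 1, 0.3), matmul(givens(3, 0, 2, 1.1), givens(3, 1, 2, -0.7)))

# Orthogonality check: Q^T Q should be the identity.
QtQ = matmul([list(r) for r in zip(*Q)], Q)
```

Because the angles are unconstrained, standard samplers can explore them freely while Q stays exactly orthogonal by construction.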

Published At:
2020-06-12

Tasks: Dimensionality Reduction, Bayesian Inference

Authors: Arya A Pourzanjani, Richard M Jiang, Brian Mitchell, Paul J Atzberger, Linda R Petzold

*
Principal component analysis (PCA) is a fundamental dimension reduction tool in statistics and machine learning. For large and high-dimensional data, computing the PCA (i.e., the singular vectors corresponding to a number of dominant singular values of the data matrix) is a challenging task. For a set of high-dimensional data stored as a 150 GB file, the proposed algorithm is able to compute the first 50 principal components in just 24 minutes on a typical 24-core computer, with less than 1 GB memory cost.
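
The out-of-core idea such algorithms build on can be sketched with the stdlib: accumulate the small d x d Gram matrix one block of rows at a time, so memory scales with d^2 rather than with the (possibly 150 GB) number of rows. A toy single-pass sketch, not the paper's randomized algorithm:

```python
def streaming_gram(row_blocks, d):
    """Accumulate A^T A over row blocks without ever holding all of A in memory."""
    G = [[0.0] * d for _ in range(d)]
    for block in row_blocks:          # each block is a small list of length-d rows
        for row in block:
            for i in range(d):
                for j in range(d):
                    G[i][j] += row[i] * row[j]
    return G

# Toy "file": six rows delivered in blocks of two, as if read from disk.
rows = [[1.0, 2.0], [0.0, 1.0], [3.0, 1.0], [1.0, 1.0], [2.0, 0.0], [1.0, 3.0]]
blocks = [rows[k:k + 2] for k in range(0, len(rows), 2)]
G = streaming_gram(blocks, d=2)
```

Eigenvectors of the accumulated d x d matrix then give the principal directions, with peak memory independent of the number of rows.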

Published At:
2020-06-12

Tasks: Dimensionality Reduction

6 starred

Authors: Wenjian Yu, Yu Gu, Jian Li, Shenghua Liu, Yaohang Li

*
Advances in molecular "omics" technologies have motivated new methodology for the integration of multiple sources of high-content biomedical data. However, most statistical methods for integrating multiple data matrices only consider data shared across all sources.

Published At:
2020-06-12

Tasks: Dimensionality Reduction

3 starred

Authors: Jun Young Park, Eric F. Lock

*
Kernel learning is a powerful framework for nonlinear data modeling. Using the kernel trick, a number of problems have been formulated as semidefinite programs (SDPs).
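
The kernel trick the snippet refers to can be shown in a few lines: for the quadratic kernel k(x, y) = (x . y)^2 on 2-D inputs, the implicit feature map is phi(x) = (x1^2, sqrt(2) x1 x2, x2^2), and the kernel value equals an inner product in that feature space without ever forming phi explicitly. A stdlib sketch:

```python
import math

def k(x, y):
    """Quadratic kernel on 2-D inputs: the squared dot product."""
    return (x[0] * y[0] + x[1] * y[1]) ** 2

def phi(x):
    """The implicit feature map whose inner product reproduces k."""
    return (x[0] ** 2, math.sqrt(2) * x[0] * x[1], x[1] ** 2)

x, y = (1.0, 2.0), (3.0, -1.0)
lhs = sum(a * b for a, b in zip(phi(x), phi(y)))
print(abs(lhs - k(x, y)) < 1e-9)  # True
```

Because every pairwise evaluation of k is a valid inner product, the kernel (Gram) matrix is positive semidefinite, which is exactly what lets these problems be posed as SDPs.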

Published At:
2020-06-12

Tasks: Dimensionality Reduction

Authors: Xiao-Ming Wu, Anthony M. So, Zhenguo Li, Shuo-Yen R. Li

*
Hyperspectral images (HSI) contain a wealth of information over hundreds of contiguous spectral bands, making it possible to classify materials through subtle spectral discrepancies. However, classifying this rich spectral information is challenging. In S3RMLSC, a hierarchical guided filter (HGF) is initially used to smooth the pixels of the HSI data to preserve the spatial pixel consistency.
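
A stdlib sketch of edge-aware smoothing in 1-D, a stand-in for the guided-filtering step rather than the paper's HGF: average only those neighbors whose values are close to the center pixel, which suppresses noise while keeping sharp transitions intact.

```python
def edge_aware_smooth(signal, radius=1, tol=1.0):
    """Average each sample with neighbors within `tol` of it; others are excluded."""
    out = []
    for i, v in enumerate(signal):
        window = [s for s in signal[max(0, i - radius): i + radius + 1]
                  if abs(s - v) <= tol]
        out.append(sum(window) / len(window))
    return out

# A noisy low region, a sharp edge, then a noisy high region: the filter
# smooths within each region but does not blur across the edge.
print(edge_aware_smooth([1.0, 1.2, 0.9, 5.0, 5.1, 4.9]))
```

A guided filter generalizes this idea by letting a guidance image, rather than the signal itself, decide which neighbors count as similar.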

Published At:
2020-06-12

Tasks: Dimensionality Reduction

Authors: Ramanarayan Mohanty, Sl Happy, Aurobinda Routray

*
We develop theory for nonlinear dimensionality reduction (NLDR). A number of NLDR methods have been developed, but there is limited understanding of how these methods work and of the relationships between them, and limited basis for using existing NLDR methods in practice.