
Research


Neural Pooling for Graph Neural Networks


Authors: Anonymous…
Published: 01/01/2021
Tasks: Graph Classification

Abstract: Tasks such as graph classification require graph pooling to learn graph-level representations from constituent node representations. In this work, we propose two novel methods using fully connected neural network layers …
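
For orientation, the sketch below shows one generic way a fully connected layer can turn pooled node representations into a graph-level vector. It is an illustrative PyTorch sketch only: the mean aggregation, layer sizes, and class name are assumptions, not the paper's two proposed pooling methods.

```python
import torch
import torch.nn as nn

class MLPGraphPooling(nn.Module):
    """Illustrative pooling: aggregate node embeddings, then apply fully
    connected layers to obtain a graph-level representation (generic sketch,
    not the paper's specific operators)."""

    def __init__(self, node_dim, graph_dim, hidden_dim=64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(node_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, graph_dim),
        )

    def forward(self, node_embeddings):
        # node_embeddings: (num_nodes, node_dim) for a single graph
        pooled = node_embeddings.mean(dim=0)   # permutation-invariant aggregation
        return self.mlp(pooled)                # learned graph-level vector

# Usage: pool 10 node embeddings of dimension 32 into a 16-d graph vector.
h_nodes = torch.randn(10, 32)
h_graph = MLPGraphPooling(node_dim=32, graph_dim=16)(h_nodes)   # shape (16,)
```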

Bayesian Context Aggregation for Neural Processes


Authors: Anonymous…
Published: 01/01/2021
Tasks: Bayesian Inference, Multi-Task Learning

Abstract: Formulating scalable probabilistic regression models with reliable uncertainty estimates has been a long-standing challenge in machine learning research. Recently, casting probabilistic regression as a multi-task learning problem in terms of …
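
As background on context aggregation in neural processes, the sketch below shows precision-weighted Gaussian aggregation of per-context-point encodings. The encoder outputs `r_mean`/`r_var` and the standard-normal prior are assumptions for illustration; this is not claimed to match the paper's exact formulation.

```python
import torch

def bayesian_context_aggregation(r_mean, r_var, prior_mean=0.0, prior_var=1.0):
    """Precision-weighted Gaussian aggregation of per-context-point encodings.

    r_mean, r_var: (num_context, latent_dim) means/variances emitted by some
    encoder for each context point (hypothetical encoder outputs).
    Returns the mean and variance of the aggregated latent.
    """
    prior_precision = 1.0 / prior_var
    post_var = 1.0 / (prior_precision + (1.0 / r_var).sum(dim=0))
    post_mean = post_var * (prior_mean * prior_precision
                            + (r_mean / r_var).sum(dim=0))
    return post_mean, post_var

# Usage: aggregate 5 context points into a latent of dimension 8.
r_mean = torch.randn(5, 8)
r_var = torch.rand(5, 8) + 0.1
z_mean, z_var = bayesian_context_aggregation(r_mean, r_var)
```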

Uniform Manifold Approximation with Two-phase Optimization


Authors: Anonymous…
Published: 01/01/2021
Tasks: Dimensionality Reduction

Abstract: We present a dimensionality reduction algorithm called Uniform Manifold Approximation with Two-phase Optimization (UMATO), which aims to preserve both the global and local structures of high-dimensional data. Most existing dimensionality …

Ruminating Word Representations with Random Noise Masking


Authors: Anonymous…
Published: 01/01/2021
Tasks: Text Classification, Word Embeddings

Abstract: We introduce a training method for better word representations and performance, which we call GraVeR (Gradual Vector Rumination). The method gradually and iteratively adds random noise and bias …
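
To illustrate the general idea of perturbing word vectors with random noise between training rounds, here is a hedged NumPy sketch. The masking probability, noise scale, and retraining loop are hypothetical and not taken from the paper.

```python
import numpy as np

def perturb_embeddings(embeddings, noise_scale=0.1, mask_prob=0.5, rng=None):
    """Add Gaussian noise to a random subset of word-embedding rows.

    mask_prob and noise_scale are hypothetical knobs; the paper's masking and
    scheduling details may differ.
    """
    rng = rng or np.random.default_rng()
    mask = rng.random(embeddings.shape[0]) < mask_prob          # rows to perturb
    noise = rng.normal(0.0, noise_scale, size=embeddings.shape)
    return embeddings + mask[:, None] * noise

# Usage: retrain in rounds, perturbing the embeddings between rounds.
E = np.random.default_rng(0).normal(size=(10000, 300))
for round_idx in range(3):
    # ... train the downstream classifier with the current embeddings E ...
    E = perturb_embeddings(E, noise_scale=0.05)
```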

Structure and randomness in planning and reinforcement learning


Authors: Anonymous…
Published: 01/01/2021

Abstract: Planning in large state spaces inevitably needs to balance the depth and breadth of the search. This balance has a crucial impact on a planner's performance, and most planners manage the interplay implicitly. We …

Gradient-based training of Gaussian Mixture Models for High-Dimensional Streaming Data


Authors: Anonymous…
Published: 01/01/2021

Abstract: We present an approach for efficiently training Gaussian Mixture Models by SGD on non-stationary, high-dimensional streaming data. Our training scheme does not require data-driven parameter initialization (e.g., k-means) and has …
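
The sketch below illustrates the general recipe of fitting a diagonal-covariance GMM by minimizing the negative log-likelihood with SGD from random (non-data-driven) initialization. The class name, architecture, and hyperparameters are assumptions for illustration, not the paper's scheme.

```python
import math
import torch
import torch.nn as nn

class SGDGMM(nn.Module):
    """Diagonal-covariance GMM whose parameters are fit by SGD on the
    negative log-likelihood; randomly initialized (no k-means). Sketch only."""

    def __init__(self, n_components, dim):
        super().__init__()
        self.means = nn.Parameter(0.1 * torch.randn(n_components, dim))
        self.log_vars = nn.Parameter(torch.zeros(n_components, dim))
        self.logits = nn.Parameter(torch.zeros(n_components))   # mixture weights

    def log_prob(self, x):
        # x: (batch, dim) -> per-sample log-likelihood under the mixture
        var = self.log_vars.exp()
        diff = x.unsqueeze(1) - self.means                       # (batch, K, dim)
        comp_ll = -0.5 * (diff ** 2 / var + self.log_vars
                          + math.log(2 * math.pi)).sum(-1)       # (batch, K)
        log_w = torch.log_softmax(self.logits, dim=0)
        return torch.logsumexp(log_w + comp_ll, dim=1)

# Usage: take SGD steps on streaming minibatches.
gmm = SGDGMM(n_components=8, dim=50)
opt = torch.optim.SGD(gmm.parameters(), lr=1e-2)
for _ in range(100):                      # each iteration = one streaming minibatch
    batch = torch.randn(64, 50)           # stand-in for a minibatch from the stream
    loss = -gmm.log_prob(batch).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```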
