Research
Showing 301-306 of 904
The dynamics of learning with feedback alignment
Maria Refinetti, Stéphane d'Ascoli, Ruben Ohana, et al.
Published: 11/24/2020
Direct Feedback Alignment (DFA) is emerging as an efficient and biologically plausible alternative to the ubiquitous backpropagation algorithm for training deep neural networks. Despite relying on random feedback weights for …
An end-to-end data-driven optimisation framework for constrained trajectories
Florent Dewez, Benjamin Guedj, Arthur Talpaert, et al.
Published: 11/24/2020
Many real-world problems require optimising trajectories under constraints. Classical approaches are based on optimal control methods but require exact knowledge of the underlying dynamics, which can be challenging …
Lipophilicity Prediction with Multitask Learning and Molecular Substructures Representation
Nina Lukashina, Alisa Alenicheva, Elizaveta Vlasova, et al.
Published: 11/24/2020
Lipophilicity is one of the factors determining the permeability of the cell membrane to a drug molecule. Hence, accurate lipophilicity prediction is an essential step in the development of new …
DeepShadows: Separating Low Surface Brightness Galaxies from Artifacts using Deep Learning
Dimitrios Tanoglidis, Aleksandra Ćiprijanović, Alex Drlica-Wagner, et al.
Published: 11/24/2020
Transfer Learning
Searches for low-surface-brightness galaxies (LSBGs) in galaxy surveys are plagued by the presence of a large number of artifacts (e.g., objects blended in the diffuse light from stars and galaxies, …
Adversarial Generation of Continuous Images
Ivan Skorokhodov, Savva Ignatyev, Mohamed Elhoseiny, et al.
Published: 11/24/2020
Image Generation
In most existing learning systems, images are typically viewed as 2D pixel arrays. However, in another paradigm gaining popularity, a 2D image is represented as an implicit neural representation (INR) …
GLGE: A New General Language Generation Evaluation Benchmark
Dayiheng Liu, Yu Yan, Yeyun Gong, et al.
Published: 11/24/2020
Natural Language Understanding, Text Generation, Transfer Learning
Multi-task benchmarks such as GLUE and SuperGLUE have driven great progress of pretraining and transfer learning in Natural Language Processing (NLP). These benchmarks mostly focus on a range of Natural …