
Studying Taxonomy Enrichment on Diachronic WordNet Versions


Authors: Irina Nikishina, Alexander Panchenko, Varvara Logacheva, …
Published date: 11/23/2020

Abstract: Ontologies, taxonomies, and thesauri are used in many NLP tasks. However, most studies are focused on the creation of these lexical resources rather than the maintenance of the existing ones. …

Federated learning with class imbalance reduction


Authors: Miao Yang, Akitanoshou Wong, Hongbin Zhu, …
Published date: 11/23/2020
Tasks: Federated Learning

Abstract: Federated learning (FL) is a promising technique that enables a large amount of edge computing devices to collaboratively train a global learning model. Due to privacy concerns, the raw data …
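The setting this abstract describes, many edge devices jointly training one global model without sharing raw data, is usually realized by aggregating locally trained weights on a server. A minimal sketch of the standard weighted aggregation step (vanilla FedAvg, not this paper's imbalance-aware variant; all names are illustrative):

```python
# Minimal FedAvg aggregation sketch: the server averages per-client model
# weights, weighting each client by its number of local training samples.
# This is the baseline setup the abstract describes, not the paper's method.

def fed_avg(client_weights, client_sizes):
    """Weighted average of per-client weight vectors.

    client_weights: list of equal-length lists of floats (one per client)
    client_sizes:   number of local samples each client trained on
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    global_w = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            global_w[i] += (n / total) * w[i]
    return global_w

# Two clients; the larger one (300 samples) pulls the average toward it.
print(fed_avg([[1.0, 2.0], [3.0, 4.0]], [100, 300]))  # → [2.5, 3.5]
```

Class imbalance enters exactly here: when client datasets have skewed label distributions, this sample-count weighting alone does not keep the global model balanced, which is the problem the paper targets.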

Planar 3D Transfer Learning for End to End Unimodal MRI Unbalanced Data Segmentation


Authors: Martin Kolarik, Radim Burget, Carlos M. Travieso-Gonzalez, …
Published date: 11/23/2020
Tasks: Lesion Segmentation, Transfer Learning

Abstract: We present a novel approach of 2D to 3D transfer learning based on mapping pre-trained 2D convolutional neural network weights into planar 3D kernels. The method is validated by the …

Peeking inside the Black Box: Interpreting Deep Learning Models for Exoplanet Atmospheric Retrievals


Authors: Kai Hou Yip, Quentin Changeat, Nikolaos Nikolaou, …
Published date: 11/23/2020

Abstract: Deep learning algorithms are growing in popularity in the field of exoplanetary science due to their ability to model highly non-linear relations and solve interesting problems in a data-driven manner. …

HistoGAN: Controlling Colors of GAN-Generated and Real Images via Color Histograms


Authors: Mahmoud Afifi, Marcus A. Brubaker, Michael S. Brown, …
Published date: 11/23/2020
Tasks: Image Generation

Abstract: While generative adversarial networks (GANs) can successfully produce high-quality images, they can be challenging to control. Simplifying GAN-based image generation is critical for their adoption in graphic design and artistic …

Stable Weight Decay Regularization


Authors: Zeke Xie, Issei Sato, Masashi Sugiyama, …
Published date: 11/23/2020

Abstract: Weight decay is a popular regularization technique for training of deep neural networks. Modern deep learning libraries mainly use $L_{2}$ regularization as the default implementation of weight decay. \citet{loshchilov2018decoupled} demonstrated …
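The distinction the abstract cites from Loshchilov & Hutter is between folding the decay term into the gradient (L2 regularization) and applying it directly to the weights (decoupled weight decay). A small sketch of the two update rules, with illustrative names; for plain SGD they coincide algebraically, while for adaptive optimizers like Adam they differ because L2 feeds into the gradient statistics:

```python
# L2 regularization vs. decoupled weight decay for a single SGD step.
# These are the two textbook update rules, not this paper's proposed method.

def sgd_l2_step(w, grad, lr=0.1, wd=0.01):
    # L2 regularization: decay folded into the gradient before the update.
    return w - lr * (grad + wd * w)

def sgd_decoupled_step(w, grad, lr=0.1, wd=0.01):
    # Decoupled weight decay: weights shrunk directly, outside the gradient.
    return w - lr * grad - lr * wd * w

# With vanilla SGD the two rules give the same result; the gap only
# appears once an optimizer rescales the gradient term (e.g. Adam vs. AdamW).
w, g = 1.0, 0.5
print(sgd_l2_step(w, g), sgd_decoupled_step(w, g))
```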
