Studying Taxonomy Enrichment on Diachronic WordNet Versions
Irina Nikishina, Alexander Panchenko, Varvara Logacheva, …
Published: 11/23/2020
Ontologies, taxonomies, and thesauri are used in many NLP tasks. However, most studies are focused on the creation of these lexical resources rather than the maintenance of the existing ones. …
Federated learning with class imbalance reduction
Miao Yang, Akitanoshou Wong, Hongbin Zhu, …
Published: 11/23/2020
Federated Learning
Federated learning (FL) is a promising technique that enables a large number of edge computing devices to collaboratively train a global learning model. Due to privacy concerns, the raw data …
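As background for this entry, the sketch below shows how a global model is commonly assembled from client updates via federated averaging; the function name, layer shapes, and sample counts are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """Minimal FedAvg-style aggregation sketch (illustrative only).

    client_weights: list of per-client weight lists (one np.ndarray per layer).
    client_sizes:   number of local training samples per client, used as weights.
    """
    total = float(sum(client_sizes))
    global_weights = []
    for layer_idx in range(len(client_weights[0])):
        # Weighted average of each layer across clients, proportional to local data size.
        layer = sum(
            (n / total) * w[layer_idx] for w, n in zip(client_weights, client_sizes)
        )
        global_weights.append(layer)
    return global_weights

# Toy usage: two clients, one weight matrix each.
clients = [[np.ones((2, 2))], [np.zeros((2, 2))]]
sizes = [300, 100]
print(federated_average(clients, sizes)[0])  # 0.75 everywhere
```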
Planar 3D Transfer Learning for End to End Unimodal MRI Unbalanced Data Segmentation
Martin Kolarik, Radim Burget, Carlos M. Travieso-Gonzalez, …
Published: 11/23/2020
Lesion Segmentation, Transfer Learning
We present a novel approach of 2D to 3D transfer learning based on mapping pre-trained 2D convolutional neural network weights into planar 3D kernels. The method is validated by the …
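The core idea in this abstract, reusing pre-trained 2D convolution weights as "planar" 3D kernels, can be illustrated with the rough sketch below; the depth-1 embedding is a hedged interpretation of that phrase, not necessarily the paper's exact mapping.

```python
import numpy as np

def planar_3d_kernel(kernel_2d):
    """Embed a pre-trained 2D conv kernel of shape (out, in, kH, kW) into a
    3D kernel of shape (out, in, 1, kH, kW), i.e. a depth-1 'planar' kernel.

    Illustrative interpretation of 2D-to-3D weight transfer, not the paper's
    exact procedure.
    """
    return kernel_2d[:, :, np.newaxis, :, :]

w2d = np.random.randn(16, 8, 3, 3)   # e.g. weights from a pre-trained 2D network
w3d = planar_3d_kernel(w2d)
print(w3d.shape)                      # (16, 8, 1, 3, 3) -> usable as a 3D conv kernel
```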
Peeking inside the Black Box: Interpreting Deep Learning Models for Exoplanet Atmospheric Retrievals
Kai Hou Yip, Quentin Changeat, Nikolaos Nikolaou, …
Published: 11/23/2020
Deep learning algorithms are growing in popularity in the field of exoplanetary science due to their ability to model highly non-linear relations and solve interesting problems in a data-driven manner. …
HistoGAN: Controlling Colors of GAN-Generated and Real Images via Color Histograms
Mahmoud Afifi, Marcus A. Brubaker, Michael S. Brown, …
Published: 11/23/2020
Image Generation
While generative adversarial networks (GANs) can successfully produce high-quality images, they can be challenging to control. Simplifying GAN-based image generation is critical for their adoption in graphic design and artistic …
Stable Weight Decay Regularization
Zeke Xie, Issei Sato, Masashi Sugiyama, …
Published: 11/23/2020
Weight decay is a popular regularization technique for training deep neural networks. Modern deep learning libraries mainly use $L_{2}$ regularization as the default implementation of weight decay. Loshchilov and Hutter (2018) demonstrated …
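As context for the distinction this abstract alludes to, the sketch below contrasts $L_{2}$ regularization (decay folded into the gradient of an adaptive update) with decoupled weight decay (decay applied directly to the weights); the Adam-style step is simplified (no bias correction) and all names and values are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def adam_like_step(w, grad, m, v, lr=1e-3, wd=1e-2, beta1=0.9, beta2=0.999,
                   eps=1e-8, decoupled=False):
    """One simplified Adam-style step showing where weight decay enters.

    decoupled=False: L2 regularization; the decay term is added to the gradient
                     and then rescaled by the adaptive denominator.
    decoupled=True : decoupled weight decay (AdamW-style); the decay is applied
                     to the weights directly, outside the adaptive update.
    """
    g = grad if decoupled else grad + wd * w
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g * g
    update = lr * m / (np.sqrt(v) + eps)
    if decoupled:
        update = update + lr * wd * w
    return w - update, m, v

w = np.array([1.0, -2.0]); g = np.array([0.5, 0.5])
m = np.zeros_like(w); v = np.zeros_like(w)
print(adam_like_step(w, g, m, v, decoupled=False)[0])  # L2-regularized update
print(adam_like_step(w, g, m, v, decoupled=True)[0])   # decoupled weight decay
```

For plain SGD the two formulations coincide up to rescaling; the difference only becomes visible with adaptive optimizers such as Adam, which is why the toy step above is Adam-style.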