Articles About Machine Learning

Harnessing Distribution Ratio Estimators for Learning Agents with Quality and Diversity

Quality-Diversity (QD) is a concept from Neuroevolution with some intriguing applications to Reinforcement Learning. It facilitates learning a population of agents where each member is optimized to simultaneously accumulate high task-returns and exhibit behavioral diversity compared to other members… In this paper, we build on a recent kernel-based method for training a QD policy ensemble with Stein variational gradient descent. With kernels based on $f$-divergence between the stationary distributions of policies, we convert the problem to that of efficient estimation […]
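To give a rough feel for the machinery involved, here is a minimal Stein variational gradient descent update over a set of policy-parameter particles. It uses a plain RBF kernel as a stand-in for the paper's $f$-divergence-based kernel, and the `grad_log_p` score function, bandwidth, and shapes are placeholders rather than the authors' actual setup.

```python
# Minimal SVGD step on stacked policy parameters (n_particles, dim).
# The RBF kernel here is an illustrative stand-in for the kernel the paper
# builds from an f-divergence between policies' stationary distributions.
import numpy as np

def rbf_kernel(theta, h=1.0):
    diff = theta[:, None, :] - theta[None, :, :]     # pairwise differences
    sq_dists = np.sum(diff ** 2, axis=-1)            # squared distances
    k = np.exp(-sq_dists / (2 * h ** 2))             # kernel matrix k(x_j, x_i)
    grad_k = -k[:, :, None] * diff / h ** 2          # d k(x_j, x_i) / d x_j
    return k, grad_k

def svgd_step(theta, grad_log_p, step_size=1e-2, h=1.0):
    """One SVGD update: particles are pulled toward high-return regions of the
    target (kernel-weighted scores) and pushed apart by the kernel gradient,
    which is what encourages behavioral diversity in the ensemble."""
    n = theta.shape[0]
    k, grad_k = rbf_kernel(theta, h)
    scores = grad_log_p(theta)                        # (n, dim) score of target
    phi = (k @ scores + grad_k.sum(axis=0)) / n       # attraction + repulsion
    return theta + step_size * phi
```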

Read more

Binary Neural Network Aided CSI Feedback in Massive MIMO System

In a massive multiple-input multiple-output (MIMO) system, channel state information (CSI) is essential for the base station to achieve a high performance gain. Recently, deep learning has been widely used in CSI compression to combat the growing feedback overhead brought by massive MIMO in frequency division duplexing systems… However, applying neural networks brings extra memory and computation cost, which is non-negligible, especially for resource-limited user equipment (UE). In this paper, a novel binarization-aided feedback network named BCsiNet is introduced. […]
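To show what binarization buys on the UE side, the sketch below binarizes a dense layer's weights to their sign times a per-layer scaling factor, shrinking weight storage to roughly one bit per weight. This is a generic illustration in the spirit of binary networks, not BCsiNet's actual layer; all names and shapes are made up.

```python
# Illustrative weight binarization: replace full-precision weights by
# sign(w) scaled by the mean absolute weight of the layer.
import numpy as np

def binarize_weights(w):
    """Return 1-bit weights in {-1, +1} plus a scalar that preserves the
    average magnitude of the original full-precision weights."""
    alpha = np.mean(np.abs(w))          # per-layer scaling factor
    w_bin = np.sign(w)
    w_bin[w_bin == 0] = 1               # map exact zeros to +1
    return w_bin.astype(np.int8), alpha

def binary_dense(x, w_bin, alpha, b):
    """Dense layer with binarized weights: y = alpha * (x @ w_bin) + b."""
    return alpha * (x @ w_bin) + b

# Example: a 32x16 encoder layer whose weights now need ~1 bit each.
rng = np.random.default_rng(0)
w = rng.normal(size=(32, 16)).astype(np.float32)
x = rng.normal(size=(4, 32)).astype(np.float32)
w_bin, alpha = binarize_weights(w)
y = binary_dense(x, w_bin, alpha, np.zeros(16, dtype=np.float32))
```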

Read more

Serial Electron Diffraction Data Processing with diffractem and CrystFEL

Serial electron diffraction (SerialED) is an emerging technique, which applies the snapshot data-collection mode of serial X-ray crystallography to three-dimensional electron diffraction (3D ED), forgoing the conventional rotation method. Similarly to serial X-ray crystallography, this approach leads to an almost complete absence of radiation damage effects even for the most sensitive samples, and allows for a high level of automation… However, SerialED also necessitates new techniques of data processing, which combine existing pipelines for rotation electron diffraction and serial X-ray crystallography […]

Read more

Low-Complexity Models for Acoustic Scene Classification Based on Receptive Field Regularization and Frequency Damping

Deep Neural Networks are known to be very demanding in terms of computing and memory requirements. Due to the ever-increasing use of embedded systems and mobile devices with a limited resource budget, designing low-complexity models without sacrificing too much of their predictive performance has gained great importance… In this work, we investigate and compare several well-known methods to reduce the number of parameters in neural networks. We further put these into the context of a recent study on the effect […]
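For readers unfamiliar with the receptive-field side of the story, the helper below computes the receptive field of a stack of convolutions along one axis (for example, frequency), which is the quantity that receptive-field regularization constrains. The layer configuration is purely illustrative.

```python
# Receptive field of stacked 1-D convolutions, input to output.
def receptive_field(layers):
    """layers: list of (kernel_size, stride) tuples."""
    rf, jump = 1, 1
    for kernel, stride in layers:
        rf += (kernel - 1) * jump      # each layer widens the field by (k-1)*jump
        jump *= stride                 # stride compounds the step between outputs
    return rf

# Example: three kernel-3 convs with strides 2, 1, 1 -> receptive field of 11 bins.
print(receptive_field([(3, 2), (3, 1), (3, 1)]))
```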

Read more

30 Questions to test a data scientist on Natural Language Processing [Solution: Skilltest – NLP]

Introduction Humans are social animals and language is our primary tool to communicate with society. But, what if machines could understand our language and then act accordingly? Natural Language Processing (NLP) is the science of teaching machines how to understand the language we humans speak and write. We recently launched an NLP skill test for which a total of 817 people registered. This skill test was designed to test your knowledge of Natural Language Processing. If you are one […]

Read more

The Top GitHub Repositories & Reddit Threads Every Data Scientist should know (June 2018)

Introduction Half the year has flown by and that brings us to the June edition of our popular series – the top GitHub repositories and Reddit threads from last month. During the course of writing these articles, I have learned so much about machine learning from either open source code or invaluable discussions among the top data science brains in the world. What makes GitHub special is not just its code hosting and social collaboration features for data scientists. It […]

Read more

The 25 Best Data Science and Machine Learning GitHub Repositories from 2018

Introduction What’s the best platform for hosting your code, collaborating with team members, and showcasing your coding skills through an online resume? Ask any data scientist, and they’ll point you towards GitHub. It has been a truly revolutionary platform in recent years and has changed the landscape of how we host and even write code. But that’s not all. It acts as a learning tool as well. How, you ask? I’ll give you a hint – open […]

Read more

2019 In-Review and Trends for 2020 – A Technical Overview of Machine Learning and Deep Learning!

Overview A comprehensive look at the top machine learning highlights from 2019, including an exhaustive dive into NLP frameworks. Check out the machine learning trends in 2020 and hear from top experts like Sudalai Rajkumar and Dat Tran! Introduction 2020 is almost upon us! It’s time to welcome the new year with a splash of machine learning sprinkled into our brand new resolutions. Machine learning will continue to be at the heart of what we do and how […]

Read more

Handling Imbalanced Data – Machine Learning, Computer Vision and NLP

This article was published as a part of the Data Science Blogathon. Introduction: In the real world, the data we gather will be heavily imbalanced most of the time. So, what is an imbalanced dataset? It is one in which the training samples are not equally distributed across the target classes. For instance, in a personal loan classification problem, it is effortless to get ‘not approved’ data, whereas ‘approved’ records are scarce. As a result, the model is more […]
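As a quick sketch of the two most common first steps for such data, the snippet below applies class weighting and random oversampling to a synthetic loan-approval-style dataset; the feature dimensions and class ratio are made up for illustration.

```python
# (1) class weights, so the minority 'approved' class contributes more to the
# loss, and (2) random oversampling of the minority class to balance the sets.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n_major, n_minor = 950, 50                       # 'not approved' vs 'approved'
X = np.vstack([rng.normal(0, 1, (n_major, 3)),
               rng.normal(1, 1, (n_minor, 3))])
y = np.array([0] * n_major + [1] * n_minor)

# Option 1: reweight classes inversely to their frequency.
clf = LogisticRegression(class_weight="balanced").fit(X, y)

# Option 2: oversample the minority class until both classes are the same size.
minority_idx = np.flatnonzero(y == 1)
resampled = rng.choice(minority_idx, size=n_major, replace=True)
X_bal = np.vstack([X[y == 0], X[resampled]])
y_bal = np.concatenate([np.zeros(n_major, dtype=int), np.ones(n_major, dtype=int)])
clf_bal = LogisticRegression().fit(X_bal, y_bal)
```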

Read more

Conflicting Bundles: Adapting Architectures Towards the Improved Training of Deep Neural Networks

Designing neural network architectures is a challenging task, and knowing which specific layers of a model must be adapted to improve performance is almost a mystery. In this paper, we introduce a novel theory and metric to identify layers that decrease the test accuracy of the trained models; this identification is done as early as the beginning of training… In the worst case, such a layer could lead to a network that cannot be trained at all. More […]

Read more