Articles About Machine Learning

A Python library that lets you customize automated machine learning

nylon: an intelligent, flexible grammar of machine learning. Nylon is a Python library that lets you customize automated machine learning workflows through a concise JSON syntax. It provides a built-in grammar through which you can access different ML operations in plain English. Installation. Install the latest release version: pip install -U nylon-ai. Install from GitHub: git clone https://github.com/Palashio/nylon.git && cd nylon-ai && pip install . Usage: the basics. A new Polymer object should be created every time you're working with […]
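A rough sketch of what a JSON-style specification for such a workflow could look like. The keys used here ("data", "preprocessor", "modeling") and the column/model names are illustrative assumptions, not taken from the Nylon documentation; the usage comment at the end mirrors the Polymer object mentioned in the excerpt but is likewise only a guess at the call pattern.

```python
import json

# Hypothetical Nylon-style JSON spec; all keys and values below are
# illustrative assumptions, not confirmed Nylon syntax.
spec = {
    "data": {"target": "label"},                      # column to predict
    "preprocessor": {"fill": "mean"},                 # missing-value handling
    "modeling": {"type": ["svm", "decision_tree"]},   # candidate models
}

spec_json = json.dumps(spec, indent=2)

# Usage would then look roughly like (requires nylon-ai to be installed):
#   from nylon import Polymer
#   nylon_object = Polymer("dataset.csv")
#   nylon_object.run(spec)
print(spec_json)
```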

Read more

RGB-D Local Implicit Function for Depth Completion of Transparent Objects

implicit_depth This repository maintains the official implementation of our CVPR 2021 paper: RGB-D Local Implicit Function for Depth Completion of Transparent Objects. By Luyang Zhu, Arsalan Mousavian, Yu Xiang, Hammad Mazhar, Jozef van Eenbergen, Shoubhik Debnath, Dieter Fox. Requirements. The code has been tested on the following system: Ubuntu 18.04; Nvidia GPU (4 Tesla V100 32GB GPUs) and CUDA 10.2; Python 3.7; PyTorch 1.6.0. Installation: Docker (recommended). We provide a Dockerfile for building a container to run our code. More […]

Read more

Text Preprocessing in NLP with Python code

This article was published as a part of the Data Science Blogathon. Introduction. Natural Language Processing (NLP) is a branch of Data Science that deals with text data. Apart from numerical data, text data is available in abundance and is used to analyze and solve business problems. But before using the data for analysis or prediction, it must be processed. To prepare text data for model building, we perform text preprocessing. It is the very first […]
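A minimal, stdlib-only sketch of the kind of preprocessing pipeline the excerpt refers to: lowercasing, punctuation removal, tokenization, and stopword filtering. The tiny stopword set here is an illustrative assumption; a real pipeline would typically use a fuller list such as NLTK's.

```python
import re
import string

# Deliberately small stopword set for illustration only.
STOPWORDS = frozenset({"the", "is", "a", "an", "for", "of"})

def preprocess(text):
    """Lowercase, strip punctuation, tokenize on whitespace, drop stopwords."""
    text = text.lower()
    text = text.translate(str.maketrans("", "", string.punctuation))
    tokens = re.split(r"\s+", text.strip())
    return [t for t in tokens if t and t not in STOPWORDS]

print(preprocess("The data is ready for Analysis!"))
# → ['data', 'ready', 'analysis']
```

Each step is independent, so the pipeline is easy to extend with stemming or lemmatization before the tokens are fed to a model.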

Read more

A Python library for easy manipulation and forecasting of time series

darts is a Python library for easy manipulation and forecasting of time series. It contains a variety of models, from classics such as ARIMA to deep neural networks. The models can all be used in the same way, using fit() and predict() functions, similar to scikit-learn. The library also makes it easy to backtest models and to combine the predictions of several models and external regressors. Darts supports both univariate and multivariate time series and models, and the neural networks […]
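The uniform fit()/predict() contract the excerpt describes can be illustrated with two toy stand-in models in plain Python. These classes are not darts' actual API; they only show how a single interface lets very different forecasters be swapped interchangeably, the way darts (and scikit-learn) do it.

```python
class NaiveMean:
    """Toy forecaster: predicts the historical mean of the series."""
    def fit(self, series):
        self.mean = sum(series) / len(series)
        return self

    def predict(self, n):
        return [self.mean] * n

class NaiveDrift:
    """Toy forecaster: extends the line from the first to the last value."""
    def fit(self, series):
        self.last = series[-1]
        self.slope = (series[-1] - series[0]) / (len(series) - 1)
        return self

    def predict(self, n):
        return [self.last + self.slope * (i + 1) for i in range(n)]

series = [1.0, 2.0, 3.0, 4.0]
# The same call pattern works for every model, so they are interchangeable.
forecasts = [model.fit(series).predict(2) for model in (NaiveMean(), NaiveDrift())]
print(forecasts)
# → [[2.5, 2.5], [5.0, 6.0]]
```

Because every model honors the same two-method contract, backtesting and ensembling code can treat them uniformly, which is the design point the library's description emphasizes.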

Read more

A Python library for Machine Learning Security

Adversarial Robustness Toolbox (ART) is a Python library for Machine Learning Security. ART provides tools that enable developers and researchers to defend and evaluate Machine Learning models and applications against the adversarial threats of Evasion, Poisoning, Extraction, and Inference. ART supports all popular machine learning frameworks (TensorFlow, Keras, PyTorch, MXNet, scikit-learn, XGBoost, LightGBM, CatBoost, GPy, etc.), all data types (images, tables, audio, video, etc.) and machine learning tasks (classification, object detection, speech recognition, generation, certification, etc.). Adversarial Threats ART for […]
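To make the evasion threat concrete, here is a toy FGSM-style perturbation against a hand-rolled logistic "model" in pure Python. This is not ART's API; the weights, input, and step size are all made up for illustration. For a linear model the gradient of the score with respect to the input is just the weight vector, so the attack steps each feature along the sign of the corresponding weight.

```python
import math

# Toy linear "model": score = w . x + b, class 1 if sigmoid(score) > 0.5.
w = [2.0, -1.0]
b = 0.0

def predict(x):
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if 1 / (1 + math.exp(-score)) > 0.5 else 0

def evade(x, eps):
    """FGSM-style evasion: move each feature by eps along sign(dscore/dx),
    in the direction that flips the current prediction."""
    direction = -1 if predict(x) == 1 else 1
    return [xi + direction * eps * math.copysign(1.0, wi)
            for xi, wi in zip(x, w)]

x = [1.0, 0.5]             # score = 1.5 -> class 1
x_adv = evade(x, eps=1.0)  # [0.0, 1.5] -> score = -1.5 -> class 0
print(predict(x), predict(x_adv))
# → 1 0
```

ART's tooling generalizes this idea to real frameworks and models, and additionally covers poisoning, extraction, and inference threats that a sketch like this does not touch.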

Read more

Python code for Machine Learning: A Probabilistic Perspective

pyprobml Python 3 code for my new book series Probabilistic Machine Learning. This is work in progress, so expect rough edges. Getting less rough… Jupyter notebooks. For each chapter there are one or more accompanying Jupyter notebooks that cover some of the material in more detail. When you open a notebook, there will be a button at the top that says ‘Open in colab’. If you click on this, it will start a virtual machine (VM) instance on Google Cloud Platform […]

Read more

PyTorch implementation for Graph Contrastive Learning Automated

Graph Contrastive Learning Automated. PyTorch implementation for Graph Contrastive Learning Automated. Yuning You, Tianlong Chen, Yang Shen, Zhangyang Wang. In ICML 2021. Overview. In this repository, we propose a principled framework named joint augmentation selection (JOAO) to automatically, adaptively and dynamically select augmentations during GraphCL training. Sanity check shows that the selection aligns with previous “best practices”, as shown in Figure 2. Dependencies. Experiments. Citation. If you use this code for your research, please cite our paper. @article{you2021graph, title={Graph Contrastive […]

Read more

Self-Damaging Contrastive Learning with python

SDCLR The recent breakthrough achieved by contrastive learning accelerates the pace of deploying unsupervised training on real-world data applications. However, unlabeled data in reality is commonly imbalanced and shows a long-tail distribution, and it is unclear how robustly the latest contrastive learning methods perform in this practical scenario. This paper proposes to explicitly tackle this challenge via a principled framework called Self-Damaging Contrastive Learning (SDCLR), which automatically balances representation learning without knowing the classes. Our main inspiration is […]

Read more

Self-Supervised Learning for Sketch and Handwriting

Vectorization and Rasterization: Self-Supervised Learning for Sketch and Handwriting, CVPR 2021. Ayan Kumar Bhunia, Pinaki Nath Chowdhury, Yongxin Yang, Timothy Hospedales, Tao Xiang, Yi-Zhe Song, “Vectorization and Rasterization: Self-Supervised Learning for Sketch and Handwriting”, IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), 2021. Abstract. Self-supervised learning has gained prominence due to its efficacy at learning powerful representations from unlabelled data that achieve excellent performance on many challenging downstream tasks. However, supervision-free pre-text tasks are challenging to design and usually […]

Read more

A GAN implemented with the Perceptual Simplicity and Spatial Constriction constraints

PS-SC GAN This repository contains the main code for training a PS-SC GAN (a GAN implemented with the Perceptual Simplicity and Spatial Constriction constraints) introduced in the paper Where and What? Examining Interpretable Disentangled Representations. The code for computing the TPL for model checkpoints from disentanglement_lib can be found in this repository. Abstract. Capturing interpretable variations has long been one of the goals in disentanglement learning. However, unlike the independence assumption, interpretability has rarely been exploited to encourage disentanglement in the unsupervised setting. […]

Read more