Articles About Machine Learning

Visual Adversarial Imitation Learning using Variational Models (VMAIL)

This is the official implementation of the NeurIPS 2021 paper. VMAIL simultaneously learns a variational dynamics model and trains an on-policy adversarial imitation learning algorithm in the latent space using only model-based rollouts. This allows for stable and sample-efficient training, as well as zero-shot imitation learning by transferring the learned dynamics model. Instructions: get the dependencies with conda env create -f vmail.yml, conda activate vmail, cd robel_claw/robel, pip install -e . To train agents for each environment, download the expert data from the […]

Read more

SORA the set of rules approach

I have been thinking about this one possibility for a few days. While everyone is talking about consciousness in machines and ML, some scientists have said they want to study an entire brain neural network in order to interpret it. But what if, instead of that, we define a sophisticated set of rules? I call this the “set of rules approach” to achieving minimal consciousness. I mean that to give human-like consciousness (at any level) to a machine, we need to train it […]

Read more

Sample Prior Guided Robust Model Learning to Suppress Noisy Labels

This repo is the official implementation of our paper “Sample Prior Guided Robust Model Learning to Suppress Noisy Labels”. Citation: if you use this code/data for your research, please cite our paper:
@misc{chen2021sample,
  title={Sample Prior Guided Robust Model Learning to Suppress Noisy Labels},
  author={Wenkai Chen and Chuang Zhu and Yi Chen},
  year={2021},
  eprint={2112.01197},
  archivePrefix={arXiv},
  primaryClass={cs.CV}
}
Training: take CIFAR-10 with 50% symmetric noise as an example. First, please modify the […]

Read more

FOREC: A Cross-Market Recommendation System

This repository provides the implementation of our CIKM 2021 paper titled “Cross-Market Product Recommendation”. Please consider citing our paper if you find the code and XMarket dataset useful in your research. The general schema of our FOREC recommendation system is shown below. For a pair of markets, the middle part shows the market-agnostic model that we pre-train and then fork and fine-tune for each market, shown on the left and right. Note that FOREC is capable of working with […]
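As a rough illustration of the pre-train, fork, and fine-tune schema described in the excerpt (this is not the actual FOREC code; the toy "model" and all names here are hypothetical), a minimal sketch:

```python
import copy

def pretrain(markets):
    """Toy 'market-agnostic' model: here just the mean rating pooled across all markets."""
    ratings = [r for m in markets.values() for r in m]
    return {"bias": sum(ratings) / len(ratings)}

def fork_and_finetune(base, market_ratings, lr=0.5, steps=10):
    """Fork the shared model, then nudge the fork toward one market's data."""
    model = copy.deepcopy(base)  # fork: each market gets its own copy of the shared weights
    target = sum(market_ratings) / len(market_ratings)
    for _ in range(steps):
        model["bias"] += lr * (target - model["bias"])  # toy fine-tuning step
    return model

# hypothetical per-market rating data
markets = {"us": [4.0, 5.0, 4.5], "de": [3.0, 3.5, 2.5]}
base = pretrain(markets)
per_market = {name: fork_and_finetune(base, data) for name, data in markets.items()}
```

The point of the sketch is only the control flow: one shared pre-trained model, copied (forked) per market, then adapted on that market's data alone.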

Read more

The successor to Budou, the machine learning powered line break organizer tool

Standalone. Small. Language-neutral. BudouX is the successor to Budou, the machine-learning-powered line break organizer tool. It is standalone: it works with no dependency on third-party word segmenters such as the Google Cloud Natural Language API. It is small: it takes only around 15 KB including its machine learning model, so it is reasonable to use even on the client side. It is language-neutral: you can train a model for any language by feeding a dataset to BudouX’s training script. Last but […]

Read more

Federated Learning with Non-IID Data

This is an implementation of the following paper: Yue Zhao, Meng Li, Liangzhen Lai, Naveen Suda, Damon Civin, Vikas Chandra. Federated Learning with Non-IID Data. arXiv:1806.00582. Paper TL;DR: previous federated optimization algorithms (such as FedAvg and FedProx) converge to stationary points of a mismatched objective function due to heterogeneity in data distribution. In this paper, the authors propose a data-sharing strategy to improve training on non-IID data by creating a small subset of data which is globally shared between all the edge […]
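A minimal sketch of the data-sharing idea from the excerpt, in plain Python (all function names and the toy dataset are hypothetical; the paper's actual protocol additionally warms up the global model on the shared subset before FedAvg training):

```python
import random

def make_global_shared_subset(full_dataset, fraction=0.05, seed=0):
    """Hold out a small random subset that every client will receive."""
    rng = random.Random(seed)
    k = max(1, int(len(full_dataset) * fraction))
    return rng.sample(full_dataset, k)

def client_training_set(local_data, shared_subset, alpha=1.0):
    """Each edge device trains on its (non-IID) local data plus a
    fraction alpha of the globally shared subset."""
    k = int(len(shared_subset) * alpha)
    return local_data + shared_subset[:k]

# toy example: (example_id, label) pairs, one client holding a skewed label slice
dataset = [(i, i % 10) for i in range(1000)]
shared = make_global_shared_subset(dataset, fraction=0.05)
client0 = [(i, lab) for i, lab in dataset if lab < 5][:100]  # non-IID: labels 0-4 only
train0 = client_training_set(client0, shared)
```

The shared subset counteracts the client's skewed label distribution, which is the mechanism the paper credits for improved convergence on non-IID data.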

Read more

Paddle2.x version AI-Writer

Generate web novels with a modified GPT. Tuned GPT for novel generation. Original author's GitHub: https://github.com/BlinkDL/AI-Writer
|–AI-Writer.gif
|–AI-Writer.jpg
|–convert_pytorch2paddle.py # converts the PyTorch weights to Paddle
|–LICENSE
|–print_project_tree.py # prints the project tree structure
|–README.md
|–run.py # command-line entry point
|–server.jpg
|–server.py # starts the server
|–model
| |–model_state.pdparams # converted Paddle weight file
| |–xuanhuan-2021-10-26.json
|

Read more

TLDR: Twin Learning for Dimensionality Reduction

TLDR (Twin Learning for Dimensionality Reduction) is an unsupervised dimensionality reduction method that combines neighborhood embedding learning with the simplicity and effectiveness of recent self-supervised learning losses. Inspired by manifold learning, TLDR uses nearest neighbors to build pairs from a training set, and a redundancy-reduction loss to learn an encoder that produces representations invariant across such pairs. Like other neighborhood embeddings, TLDR effectively learns, without supervision, low-dimensional spaces in which local neighborhoods of the input space […]
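A toy sketch of the pair-construction step described above, in plain Python with hypothetical data (the actual TLDR pipeline then trains an encoder on such pairs with a Barlow Twins-style redundancy-reduction loss, which is not shown here):

```python
def nearest_neighbor(points, i):
    """Index of the nearest neighbor of points[i] under squared Euclidean distance."""
    best_j, best_d = None, float("inf")
    for j, q in enumerate(points):
        if j == i:
            continue  # a point is not its own neighbor
        d = sum((a - b) ** 2 for a, b in zip(points[i], q))
        if d < best_d:
            best_j, best_d = j, d
    return best_j

def build_training_pairs(points):
    """TLDR-style positive pairs: each point paired with its nearest neighbor."""
    return [(i, nearest_neighbor(points, i)) for i in range(len(points))]

# two tight clusters: pairs should stay within a cluster
points = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.0, 5.2)]
pairs = build_training_pairs(points)
```

Because pairs are drawn from local neighborhoods, an encoder made invariant across them preserves exactly the local structure the excerpt mentions.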

Read more

Dual Adaptive Sampling for Machine Learning Interatomic Potentials

Dual Adaptive Sampling for Machine Learning Interatomic Potentials. How to cite: if you use this code in your research, please cite: Hongliang Yang, Yifan Zhu, Erting Dong, Yabei Wu, Jiong Yang, and Wenqing Zhang. Dual adaptive sampling and machine learning interatomic potentials for modeling materials with chemical bond hierarchy. Phys. Rev. B 104, 094310 (2021). Install: first install pymtp, the Python interface for MTP: https://github.com/hlyang1992/pymtp. Then install das. You can download the code by

Read more

Machine Learning with 5 different algorithms

In this project, the dataset was created through a survey opened on Google Forms. The purpose of the form is to find the person’s favorite shopping type based on the information provided. In this context, 13 questions were asked to the user. Based on the answers to these questions, the estimation of the shopping type, which is a classification problem, is carried out with 5 different algorithms: Logistic Regression, Random Forest Classifier, Support Vector Machine, K Neighbors, and Decision Tree […]
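As a sketch of one of the five listed algorithms, here is a minimal K Neighbors classifier in plain Python on toy, survey-like data (all data and names are hypothetical; the project itself presumably uses a library implementation such as scikit-learn's):

```python
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    order = sorted(
        range(len(train_X)),
        key=lambda i: sum((a - b) ** 2 for a, b in zip(train_X[i], x)),
    )
    votes = [train_y[i] for i in order[:k]]
    return Counter(votes).most_common(1)[0][0]

# toy data: 2 answer-derived features -> favorite shopping type
train_X = [(1, 1), (1, 2), (2, 1), (8, 8), (8, 9), (9, 8)]
train_y = ["online", "online", "online", "mall", "mall", "mall"]
pred = knn_predict(train_X, train_y, (2, 2), k=3)
```

The same train/predict interface applies to the other four algorithms, which is what makes a side-by-side comparison on one survey dataset straightforward.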

Read more