COCO LM Pretraining (wip)
Implementation of COCO-LM, Correcting and Contrasting Text Sequences for Language Model Pretraining, in Pytorch. They were able to make contrastive learning work in a self-supervised manner for language model pretraining. Seems like a solid successor to Electra.
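The contrastive side of COCO-LM pulls embeddings of two corrupted views of the same sequence together while pushing apart the other sequences in the batch. A minimal sketch of that kind of InfoNCE-style loss in plain PyTorch, for intuition only — the function name, shapes, and temperature here are illustrative assumptions, not this repository's API:

```python
import torch
import torch.nn.functional as F

def sequence_contrastive_loss(emb_a, emb_b, temperature = 0.07):
    # emb_a, emb_b: (batch, dim) embeddings of two views of the same sequences.
    # Matching rows are positive pairs; all other rows in the batch are negatives.
    a = F.normalize(emb_a, dim = -1)
    b = F.normalize(emb_b, dim = -1)
    logits = a @ b.t() / temperature           # cosine similarity of every pair
    labels = torch.arange(a.shape[0])          # positives sit on the diagonal
    return F.cross_entropy(logits, labels)

emb_a = torch.randn(8, 256)
emb_b = torch.randn(8, 256)
loss = sequence_contrastive_loss(emb_a, emb_b)
```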
Install
$ pip install coco-lm-pytorch
Usage
An example using the x-transformers library:
$ pip install x-transformers
Then
import torch
from coco_lm_pytorch import COCO
# (1) instantiate the generator and discriminator, making sure that the generator is roughly a quarter to a half of the size of the discriminator