A Structured Self-attentive Sentence Embedding
Implementation of the paper [A Structured Self-Attentive Sentence Embedding](https://arxiv.org/abs/1703.03130), published at ICLR 2017.

## Usage

For binary sentiment classification on the IMDB dataset, run:

```
python classification.py binary
```

For multiclass classification on the Reuters dataset, run:

```
python classification.py multiclass
```

Model parameters can be changed in the `model_params.json` file. Other training parameters, such as the number of attention hops, can be configured in the `config.json` file. If you want to use pretrained GloVe embeddings, set the `use_embeddings` parameter […]
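The core of the paper is the structured self-attention that turns LSTM hidden states `H` into an `r`-hop sentence embedding via `A = softmax(W_s2 tanh(W_s1 H^T))` and `M = A H`. Below is a minimal NumPy sketch of that computation; it follows the paper's notation, not this repo's code, and the dimensions are illustrative.

```python
import numpy as np


def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


def structured_self_attention(H, W_s1, W_s2):
    """Compute the annotation matrix A and sentence embedding M.

    H:    (n, 2u)  hidden states of a bidirectional LSTM over n tokens
    W_s1: (d_a, 2u) first attention projection
    W_s2: (r, d_a)  one row per attention hop
    """
    A = softmax(W_s2 @ np.tanh(W_s1 @ H.T), axis=-1)  # (r, n): weights over tokens
    M = A @ H                                         # (r, 2u): multi-hop embedding
    return A, M


# toy dimensions: n=5 tokens, 2u=8, d_a=6, r=3 attention hops
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))
A, M = structured_self_attention(
    H, rng.standard_normal((6, 8)), rng.standard_normal((3, 6))
)
print(A.shape, M.shape)  # (3, 5) (3, 8)
```

Each row of `A` is a softmax-normalized distribution over tokens, so every hop attends to the sentence independently; the paper additionally adds a penalty term `||AA^T - I||_F^2` to encourage the hops to focus on different parts of the sentence.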