
Notice

There are some problems with this version of the code (the mask on the attention weights [Model.py, lines 160-170] and the mask for the mean and max pooling [Model.py, lines 220-225]); please don't use this code directly!

I have been too busy recently to maintain this repo; I will update the code during my winter vacation (starting Jan 26).
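For reference, the missing masking is usually applied along these lines. This is a minimal sketch against TensorFlow 1.x; the tensor shapes and function names are assumptions for illustration, not taken from Model.py.

import tensorflow as tf

def masked_attention_weights(scores, mask_b):
    # scores: [batch, len_a, len_b] raw alignment scores
    # mask_b: [batch, len_b], 1.0 for real tokens, 0.0 for padding
    # Push padded positions towards -inf so softmax gives them ~0 weight.
    scores = scores + (1.0 - tf.expand_dims(mask_b, axis=1)) * -1e9
    return tf.nn.softmax(scores, axis=-1)

def masked_mean_max(v, mask):
    # v: [batch, len, dim]; mask: [batch, len], 1.0 for real tokens, 0.0 for padding
    mask = tf.expand_dims(mask, axis=-1)
    mean = tf.reduce_sum(v * mask, axis=1) / tf.maximum(tf.reduce_sum(mask, axis=1), 1e-8)
    vmax = tf.reduce_max(v + (1.0 - mask) * -1e9, axis=1)
    return mean, vmax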

ESIM

A TensorFlow implementation of Enhanced LSTM for Natural Language Inference by Qian Chen et al. (ACL 2017).
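The core of ESIM is a soft alignment between the two BiLSTM-encoded sentences followed by the enhancement [a; ã; a - ã; a ⊙ ã]. The sketch below illustrates that step under assumed shapes and names; it is not the code in Model.py.

import tensorflow as tf

def enhance(a_bar, b_bar):
    # a_bar: [batch, len_a, dim] encoded premise; b_bar: [batch, len_b, dim] encoded hypothesis
    scores = tf.matmul(a_bar, b_bar, transpose_b=True)   # e_ij = a_i . b_j
    # In the corrected code these scores should also be masked, as sketched in the Notice above.
    attn = tf.nn.softmax(scores, axis=-1)
    a_tilde = tf.matmul(attn, b_bar)                     # soft-aligned hypothesis for each premise token
    # Enhanced local inference features: concatenation, difference, element-wise product.
    return tf.concat([a_bar, a_tilde, a_bar - a_tilde, a_bar * a_tilde], axis=-1)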

Dataset

The dataset used for this task is the Stanford Natural Language Inference (SNLI) corpus. Word embeddings are initialized with pretrained GloVe vectors trained on Common Crawl (840B tokens).
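A common way to build the embedding matrix from the GloVe text file is sketched below; the vocabulary mapping word_to_id and the file path are hypothetical names for illustration, not identifiers from this repo.

import numpy as np

def load_glove(glove_path, word_to_id, dim=300):
    # Unknown words keep a small random vector; known words get their GloVe vector.
    matrix = np.random.uniform(-0.05, 0.05, (len(word_to_id), dim)).astype(np.float32)
    with open(glove_path, encoding='utf-8') as f:
        for line in f:
            parts = line.rstrip().split(' ')
            word, values = parts[0], parts[1:]
            if word in word_to_id and len(values) == dim:
                matrix[word_to_id[word]] = np.asarray(values, dtype=np.float32)
    return matrix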

Requirements

  • Python>=3
  • NumPy
  • TensorFlow>=1.8

Usage

Download the dataset from Stanford Natural Language Inference, then move snli_1.0_train.jsonl, snli_1.0_dev.jsonl, and snli_1.0_test.jsonl into ./SNLI/raw data.

# move dataset to the right place
mkdir -p ./SNLI/raw\ data
mv snli_1.0_*.jsonl ./SNLI/raw\ data

Preprocess the data to convert the source files into an easy-to-use format:

python3 Utils.py
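For reference, each line of an SNLI .jsonl file is a JSON object with sentence1, sentence2, and gold_label fields. A minimal reader looks like the sketch below; the actual Utils.py may organize this differently.

import json

LABELS = {'entailment': 0, 'neutral': 1, 'contradiction': 2}

def read_snli(path):
    # Yield (premise, hypothesis, label_id) triples, skipping examples without a gold label ('-').
    with open(path, encoding='utf-8') as f:
        for line in f:
            example = json.loads(line)
            if example['gold_label'] in LABELS:
                yield example['sentence1'], example['sentence2'], LABELS[example['gold_label']]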

Default hyper-parameters are stored in the config file at ./config/config.yaml.
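The config can be inspected with PyYAML, for example (assuming PyYAML is installed; the available keys are defined by config.yaml itself and are not listed here):

import yaml

# Load the default hyper-parameters from the YAML config.
with open('./config/config.yaml', encoding='utf-8') as f:
    config = yaml.safe_load(f)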

Train the model:

python3 Train.py

Test the model:

python3 Test.py
