
Graph Narrator

Graph Narrator aims to automatically generate natural language descriptions of subgraphs in knowledge graphs. The training corpus is Wikipedia articles, and the knowledge graph is a processed version of Freebase in which reverse triples and mediator nodes are removed.

Dataset

The dataset can be downloaded from here

The Wikipedia title to Freebase entity mid mapping file can be downloaded from here

Setup

We recommend creating a Conda environment to run the code. To create the Conda environment, run

./setup_environment.sh

Preprocessing

Unzip the dataset folder, then run

python graphnarrator/data/generate_input_graphnarrator.py dataset_xml_folder_path
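
Example (the path ./dataset_xml below is a placeholder; substitute the path of your unzipped dataset folder)

python graphnarrator/data/generate_input_graphnarrator.py ./dataset_xml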

Finetuning

To fine-tune the T5 model on the Graph Narrator dataset, run

./graphnarrator/finetune_t5.sh t5-<small/base/large> gpu_id

Example

./graphnarrator/finetune_t5.sh t5-large 0

To fine-tune the BART model on the Graph Narrator dataset, run

./graphnarrator/finetune_bart.sh bart-<base/large> gpu_id

Example

./graphnarrator/finetune_bart.sh bart-large 1

Decoding

To decode with a T5 model that has been fine-tuned on the Graph Narrator dataset, run

./graphnarrator/test_t5.sh fine-tuned_model_path gpu_id

Example

./graphnarrator/test_t5.sh /graphnarrator/t5-large-trim/best_tfmr 2

To decode with a BART model that has been fine-tuned on the Graph Narrator dataset, run

./graphnarrator/test_bart.sh fine-tuned_model_path gpu_id

Example

./graphnarrator/test_bart.sh /graphnarrator/bart-large-trim/best_tfmr 3

Trained Models

The GNST-T5 model (a T5-large model fine-tuned on the GraphNarrative dataset with sentence trimming) can be downloaded from here

The GN-T5 model (a T5-large model fine-tuned on the GraphNarrative dataset without sentence trimming) can be downloaded from here

The T5-large model trained on Graph Narrator without the sentence trimmer and then fine-tuned on the WebNLG dataset can be downloaded from here

The T5-large model trained on Graph Narrator with the sentence trimmer and then fine-tuned on the WebNLG dataset can be downloaded from here

Inference Documentation for T5 Model

This section provides details on how to perform inference using the T5 model (graphnarrator_webnlg.ckpt). It covers the input format and includes an example graph to guide the inference process.

Inference with the T5 Model

  • To perform inference using the T5 model, prepend the input with "translate Graph to English:".
  • The input graph should follow the format <H> <R> <T>, where:
    • <H>: Head entity
    • <R>: Relation
    • <T>: Tail entity

Example Graph for Inference

  • An example graph input for inference is:

    translate Graph to English: <H> Paris <R> is the capital of <T> France
    

Here, the head entity (<H>) is Paris, the relation (<R>) is "is the capital of", and the tail entity (<T>) is France.
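
As a minimal illustration, the snippet below sketches this inference flow in Python with the Hugging Face transformers library. It assumes the fine-tuned weights are available as a Hugging Face model directory (for example, a best_tfmr directory like those used in the Decoding section); the model path, the linearize helper, and the generation settings are illustrative, not part of the repository.

from transformers import T5ForConditionalGeneration, T5Tokenizer

# Placeholder path to a fine-tuned model directory (e.g., a best_tfmr folder
# produced by the fine-tuning scripts above).
MODEL_DIR = "graphnarrator/t5-large-trim/best_tfmr"

tokenizer = T5Tokenizer.from_pretrained(MODEL_DIR)
model = T5ForConditionalGeneration.from_pretrained(MODEL_DIR)

# Illustrative helper: linearize (head, relation, tail) triples into the
# <H> <R> <T> input format described above.
def linearize(triples):
    return " ".join(f"<H> {h} <R> {r} <T> {t}" for h, r, t in triples)

graph = [("Paris", "is the capital of", "France")]
input_text = "translate Graph to English: " + linearize(graph)

inputs = tokenizer(input_text, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=128, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))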

Additional Notes

Ensure the input format is correct for the model to generate the desired output. The head (<H>), relation (<R>), and tail (<T>) entities should be clearly marked. Always start the input with "translate Graph to English:" to indicate the type of inference required.

Requirements

  • Python 3.6+
  • PyTorch
  • Transformers (the transformers library from Hugging Face)

To install these requirements, run the following from the graphnarrator folder:

./setup_environment.sh
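
Alternatively, the core dependencies can be installed manually with pip (a minimal sketch; the setup script may pin specific versions):

pip install torch transformers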
