Commit

added files
rashad101 committed Nov 1, 2022
1 parent 899af95 commit 80cd8e0
Showing 59 changed files with 5,889,065 additions and 21 deletions.
2 changes: 1 addition & 1 deletion .gitignore
@@ -1 +1 @@
-.idea/
+.idea/*
18 changes: 14 additions & 4 deletions README.md
@@ -1,5 +1,5 @@
# DialoKG: Knowledge-Structure Aware Task-Oriented Dialogue Generation
-PyTorch code for NAACL 2022 paper: DialoKG: Knowledge-Structure Aware Task-Oriented Dialogue Generation [[PDF]](https://aclanthology.org/2022.findings-naacl.195.pdf).
+This repository contains PyTorch code for the NAACL 2022 paper: DialoKG: Knowledge-Structure Aware Task-Oriented Dialogue Generation [[PDF]](https://aclanthology.org/2022.findings-naacl.195.pdf).

[![Python 3.7](https://img.shields.io/badge/python-3.7-blue.svg)](https://www.python.org/downloads/release/python-370/)
[![PyTorch](https://img.shields.io/badge/PyTorch-%23EE4C2C.svg?style=flat&logo=PyTorch&logoColor=white)](https://pytorch.org/)
@@ -14,8 +14,18 @@
chmod +x setup.sh
./setup.sh
```

### 🏋️ Training
```shell
python train.py --dataset <DATASET-NAME> --params_file config/gpt2/params.json --device cuda
```
Valid dataset names: **incar**, **camrest**, **woz2.1**

### 🎯 Evaluation
```shell

```

-### Citation
+### 📝 Citation
```
@inproceedings{rony-etal-2022-dialokg,
title = "{D}ialo{KG}: Knowledge-Structure Aware Task-Oriented Dialogue Generation",
@@ -32,8 +42,8 @@
abstract = "Task-oriented dialogue generation is challenging since the underlying knowledge is often dynamic and effectively incorporating knowledge into the learning process is hard. It is particularly challenging to generate both human-like and informative responses in this setting. Recent research primarily focused on various knowledge distillation methods where the underlying relationship between the facts in a knowledge base is not effectively captured. In this paper, we go one step further and demonstrate how the structural information of a knowledge graph can improve the system{'}s inference capabilities. Specifically, we propose DialoKG, a novel task-oriented dialogue system that effectively incorporates knowledge into a language model. Our proposed system views relational knowledge as a knowledge graph and introduces (1) a structure-aware knowledge embedding technique, and (2) a knowledge graph-weighted attention masking strategy to facilitate the system selecting relevant information during the dialogue generation. An empirical evaluation demonstrates the effectiveness of DialoKG over state-of-the-art methods on several standard benchmark datasets.",
}
```
-### License
+### 📜 License
[MIT]()

-### Contact
+### 📪 Contact
For further information, contact the corresponding author Md Rashad Al Hasan Rony ([email](mailto:[email protected])).
8 changes: 8 additions & 0 deletions config/gpt2/generation_params.json
@@ -0,0 +1,8 @@
{
"no_sample": false,
"min_length": 1,
"max_length": 120,
"temperature": 0.18,
"top_k": 10,
"top_p": 0.9
}
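The settings above are standard decoding knobs (length bounds, temperature, top-k and nucleus truncation). As a minimal sketch, such a file could be loaded and mapped onto Hugging Face `generate()`-style keyword arguments; the helper name and the `no_sample` → `do_sample` inversion here are assumptions for illustration, not taken from the repository:

```python
import json

# Decoding settings mirroring config/gpt2/generation_params.json
raw = json.loads("""
{
  "no_sample": false,
  "min_length": 1,
  "max_length": 120,
  "temperature": 0.18,
  "top_k": 10,
  "top_p": 0.9
}
""")

def to_generate_kwargs(cfg):
    # Pass every key through unchanged except "no_sample", which
    # Hugging Face's generate() expresses as its inverse, "do_sample".
    kwargs = {k: v for k, v in cfg.items() if k != "no_sample"}
    kwargs["do_sample"] = not cfg["no_sample"]
    return kwargs

print(to_generate_kwargs(raw))
```

With `no_sample` false, sampling is enabled, but the low temperature (0.18) keeps generations close to greedy while `top_k` and `top_p` further truncate the candidate distribution.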
19 changes: 19 additions & 0 deletions config/gpt2/params.json
@@ -0,0 +1,19 @@
{
"dataset_args": {
"history_max_utterances": 3,
"history_max_tokens": 128,
"knowledge_max_tokens": 128
},
"task": "generation",
"model_name_or_path": "gpt2",
"per_gpu_train_batch_size": 4,
"per_gpu_eval_batch_size": 4,
"gradient_accumulation_steps": 4,
"learning_rate": 6.25e-5,
"adam_epsilon": 1e-8,
"max_grad_norm": 1,
"num_train_epochs": 1,
"warmup_steps": 0,
"fp16": "",
"seed": 42
}
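One detail worth noting in this file: because gradients are accumulated over several forward passes, each optimizer step sees more examples than `per_gpu_train_batch_size` alone. A small sketch of the arithmetic (the single-GPU default here is my assumption):

```python
# Training hyperparameters mirroring config/gpt2/params.json
params = {
    "per_gpu_train_batch_size": 4,
    "gradient_accumulation_steps": 4,
    "learning_rate": 6.25e-5,
}

def effective_batch_size(p, n_gpus=1):
    # Each optimizer step accumulates gradient_accumulation_steps
    # forward passes of per_gpu_train_batch_size examples per GPU.
    return p["per_gpu_train_batch_size"] * p["gradient_accumulation_steps"] * n_gpus

print(effective_batch_size(params))  # 4 * 4 * 1 = 16
```

So with these values a single GPU trains with an effective batch of 16 examples per optimizer step.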